Redis experience and data types used in the Python O & M Project


I have to say: what you learn, you must put to use! I have been using Redis for a few years, yet until today I had mostly treated its sets as if they were Python lists. Recently I built several projects that mix in Redis, ran into some problems, and found a few ways to improve performance during development. I will share them here so we can learn together.

The following describes the Redis set data type.

A set is a collection of non-repeating values. Using the Sets data structure Redis provides, you can store aggregate data: in a microblogging application such as Weibo, for example, you can keep all of a user's followees in one set and all of their fans in another. Redis also provides intersection, union, and difference operations on sets, which make features such as mutual follows, shared interests, and second-degree friends easy to implement; separate commands let you choose whether to return the result to the client or store it in a new set. This is how Sina Weibo applies sets.

Create a set and add data.

 
 
    [root@66 ~]# redis-cli
    redis 127.0.0.1:6379> sadd xiaorui aaa
    (integer) 1
    redis 127.0.0.1:6379> sadd xiaorui bbb
    (integer) 1
    redis 127.0.0.1:6379> sadd xiaorui ccc
    (integer) 1
    redis 127.0.0.1:6379> SMEMBERS xiaorui
    1) "aaa"
    2) "ccc"
    3) "bbb"

A set cannot hold duplicate members; adding a value that is already present returns 0.

 
 
    redis 127.0.0.1:6379> sadd xiaorui fuck_shencan
    (integer) 1
    redis 127.0.0.1:6379> sadd xiaorui fuck_shencan
    (integer) 0

View the size (cardinality) of the set

 
 
    redis 127.0.0.1:6379> SCARD xiaorui
    (integer) 3

Remove a member

 
 
    redis 127.0.0.1:6379> SREM xiaorui aaa
    (integer) 1
    redis 127.0.0.1:6379> SMEMBERS xiaorui
    1) "ccc"
    2) "bbb"

Intersection of Two Sets

 
 
    redis 127.0.0.1:6379> SADD key1 a
    (integer) 1
    redis 127.0.0.1:6379> SADD key1 b
    (integer) 1
    redis 127.0.0.1:6379> SADD key1 c
    (integer) 1
    redis 127.0.0.1:6379> SADD key2 c
    (integer) 1
    redis 127.0.0.1:6379> SADD key2 d
    (integer) 1
    redis 127.0.0.1:6379> SADD key2 e
    (integer) 1
    redis 127.0.0.1:6379> SINTER key1 key2
    1) "c"

You can also use a set in place of a Redis list as a queue. Note that set members cannot repeat, so this only works if your values are unique.

 
 
    redis 127.0.0.1:6379> sadd myset one
    (integer) 1
    redis 127.0.0.1:6379> sadd myset two
    (integer) 1
    redis 127.0.0.1:6379> sadd myset three
    (integer) 1
    redis 127.0.0.1:6379> SPOP myset
    "one"
    redis 127.0.0.1:6379> SMEMBERS myset
    1) "three"
    2) "two"

Two days ago I complained to a friend that my monitoring platform was tight on memory, and he immediately guessed that Redis must be eating it... In fact I only use Redis for its large queues. To give a feel for the footprint: a queue of around a million records occupies roughly 73 MB of memory.
 

It is best to limit the number of clients blocked on Redis. If more than five threads sit in BRPOP at once, Redis's CPU usage can be pushed to around 80%, which seriously affects other clients. If you know that tasks do not arrive constantly, it is better to control the polling frequency and interval from your own program.
 

When talking to Redis from Python, it is best to use a connection pool; reusing connections noticeably saves time and resources compared with opening a new connection per request.
 

 
 
    >>> import redis
    >>> pool = redis.ConnectionPool(host='localhost', port=6379, db=0)
    >>> r = redis.Redis(connection_pool=pool)

Newer versions of redis-py support pipelines. Some friends do not quite see the advantage: although Python uses a connection pool when connecting to Redis, the pool only keeps connections alive; every command is still a separate round trip between the client and the server. With a pipeline, the client buffers all the commands, merges them, and pushes them to the Redis server in a single batch, saving a great deal of round-trip overhead. This is especially suitable for high concurrency.

Gevent can be used to address the performance of Redis pub/sub communication; just import and apply the gevent monkey patch.

 
 
    import gevent.monkey
    gevent.monkey.patch_all()
    # http://rfyiamcool.blog.51cto.com/1030776/1435539

    import os
    import sys
    import fcntl
    import gevent
    from gevent.socket import wait_read

    from redis import Redis

    PID = os.getpid()

    red = Redis('localhost')

    def echo_stdin():
        # make stdin non-blocking
        fcntl.fcntl(sys.stdin, fcntl.F_SETFL, os.O_NONBLOCK)
        red.publish('echo', "[%i] joined" % (PID,))
        while True:
            wait_read(sys.stdin.fileno())
            l = sys.stdin.readline().strip()
            s = "[%i] %s" % (PID, l)
            # save to log
            red.rpush('echo_log', s)
            # publish message
            red.publish('echo', s)
            if l == 'quit':
                break

    def handler():
        pubsub = red.pubsub()
        # first subscribe, then print the log (avoids a race condition)
        pubsub.subscribe('echo')
        # print log
        for line in red.lrange('echo_log', 0, -1):
            print('.', line)
        # print channel
        for msg in pubsub.listen():
            print('>', msg['data'])

    gevent.spawn(handler)
    gevent.spawn(echo_stdin).join()

Of course, ordinary commands such as set, get, sadd, and hset can also be issued through gevent, but there is no real advantage: Redis serves reads and writes from a single thread, so on the client side it is enough to reuse pooled connections and let Redis work through requests honestly one by one; piling on client-side concurrency does not help. I ran a stress test here: with gevent under Python 2.7, batched reads and writes showed no significant improvement.

 
 
    >>> import geventredis
    >>> redis_client = geventredis.connect('127.0.0.1', 6379)
    >>> redis_client.set('foo', 'bar')
    'OK'
    >>> for msg in redis_client.monitor():
    ...     print(msg)

Blog: http://rfyiamcool.blog.51cto.com/1030776/1435539
