Bulk import of MySQL table data into a Redis zset


At work we had the following requirement: rank users by their charm value and generate a Top 40 leaderboard, refreshed every 5 minutes. This is easy to implement with a Redis zset, but the data lives in a MySQL table with more than 4 million rows. How do we load it into Redis quickly?
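To make the zset side of this concrete, here is a minimal Python sketch (not from the original article; the names charm_scores, zadd, and zrevrange_withscores are illustrative) of what ZADD followed by ZREVRANGE 0 39 WITHSCORES computes for the Top 40 list:

```python
def zadd(zset, score, member):
    # Like Redis ZADD: insert the member, or update its score if it exists.
    zset[member] = score

def zrevrange_withscores(zset, start, stop):
    # Like ZREVRANGE start stop WITHSCORES: members ordered by
    # descending score; stop is inclusive, as in Redis.
    ranked = sorted(zset.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[start:stop + 1]

charm_scores = {}
zadd(charm_scores, 87, "10001")
zadd(charm_scores, 120, "10002")
zadd(charm_scores, 95, "10003")

# Top 40 (only 3 members exist here, so all of them come back).
top_n = zrevrange_withscores(charm_scores, 0, 39)
```

In Redis itself this is simply one ZADD per row at load time, then a single ZREVRANGE key 0 39 WITHSCORES every 5 minutes to rebuild the leaderboard.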

The first approach that comes to mind is to read the data out of MySQL and then write it into Redis record by record, but this is not only slow, it is also hard to guarantee that the data written to Redis stays accurate, since the two systems can drift apart during the load. After some searching I found that Redis itself provides a way to bulk-import data (pipe mode). The original article:

http://baijian.github.io/2013/10/12/import-data-from-mysql-to-redis.html

Here is the command I used, along with my understanding of it:

mysql -h192.168.2.3 -uskst -p'password' nyx --skip-column-names --raw < data.sql | /usr/local/redis/bin/redis-cli -h 192.168.2.128 -p 6479 --pipe

Prerequisite: a Linux environment with access to both the MySQL service and the Redis service.

In general form (data.sql is given below):

mysql -h <source MySQL IP> -u <username> -p<password> <database name> --skip-column-names --raw < data.sql | /usr/local/redis/bin/redis-cli -h <target Redis IP> -p <target Redis port> --pipe

Note that --skip-column-names suppresses the header row and --raw stops mysql from escaping the \r\n sequences; both are needed for redis-cli --pipe to accept the output.

The command above is easy to follow; the data.sql script it reads is given below:

SELECT CONCAT(
    '*4\r\n',
    '$', LENGTH(redis_cmd), '\r\n', redis_cmd, '\r\n',
    '$', LENGTH(redis_key), '\r\n', redis_key, '\r\n',
    '$', LENGTH(redis_increment), '\r\n', redis_increment, '\r\n',
    '$', LENGTH(redis_member), '\r\n', redis_member, '\r'
) FROM (
    SELECT
        'ZADD' AS redis_cmd,
        'charmrank:forever:2015-07-14 00:00:00_2050-12-30 23:59:59' AS redis_key,
        charm AS redis_increment,
        accountId AS redis_member
    FROM table_name
) AS name

In this script, the leading 4 in *4 is the number of parts in the Redis command: ZADD key score member has 4 parts, while ZSCORE key member would have 3. Each part is preceded by $ and that part's length, and every element is terminated by \r\n; this is the Redis protocol (RESP) that redis-cli --pipe expects. The final field ends with a bare \r because mysql itself appends the \n at the end of each output row.
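The same framing can be sketched in Python to show exactly what the CONCAT produces per row. This is my own illustration, not from the article: the function name resp_encode and the shortened key charmrank are assumptions, and unlike the SQL script (which leaves the final \n to mysql) this sketch emits the full \r\n itself:

```python
def resp_encode(*parts):
    # Encode one Redis command in the RESP protocol:
    # '*<number of parts>', then for each part '$<byte length>'
    # followed by the part, each element terminated by CRLF.
    parts = [str(p) for p in parts]
    out = "*%d\r\n" % len(parts)
    for p in parts:
        out += "$%d\r\n%s\r\n" % (len(p.encode("utf-8")), p)
    return out

# One row of the export: ZADD charmrank 87 10001
line = resp_encode("ZADD", "charmrank", "87", "10001")
# line == "*4\r\n$4\r\nZADD\r\n$9\r\ncharmrank\r\n$2\r\n87\r\n$5\r\n10001\r\n"
```

Piping a stream of such lines into redis-cli --pipe is what makes the import fast: the client sends everything in bulk instead of waiting for a reply per command.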

The whole command processed more than 4 million rows in under a minute: fast and accurate.
