[Redis] An Application Scenario for Copying MySQL Data to Redis

Source: Internet
Author: User

Because only MySQL is used for data storage, complex data statistics take a long time to complete. Therefore, part of the data needs to be imported into Redis; after the statistics are computed there, the results are saved back to MySQL.

After looking into it, I found that the intermediate data exchange through a third-party client can be bypassed entirely by using Redis pipe mode: data is exported from MySQL and fed straight into Redis, which is very fast. In my test environment, importing about a million records took roughly 40 seconds (much of which is the MySQL query time itself).
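The trick behind pipe mode is to produce the Redis wire protocol (RESP) directly, so that `redis-cli --pipe` can replay it without any client-side command parsing. As a minimal sketch of what that encoding looks like (the function name `resp_encode` is my own illustration, not part of the original script; `${#arg}` counts characters, which equals the byte length for ASCII data):

```shell
# Encode one Redis command in RESP: first "*<argc>", then for each
# argument "$<length>" followed by the argument itself, every line
# terminated by CRLF.
resp_encode() {
    printf '*%d\r\n' "$#"
    local arg
    for arg in "$@"; do
        printf '$%d\r\n%s\r\n' "${#arg}" "$arg"
    done
}

# Feeding this into "redis-cli --pipe" would execute: SET greeting hello
resp_encode SET greeting hello
```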

In addition, because of how the MySQL tables are designed, most of the data is spread across tables with identical structure. For example, table A stores the event data for "start application" while table B stores the event data for "open settings page", yet the two tables have the same schema. When statistics are computed in MySQL, the tables have to be processed sequentially: only after A has been counted can B be counted.

In Redis, the data of tables A and B is stored in the same database, with each event's data held in a container such as a set; a single traversal over all of the event data then completes the statistics.
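To see why merging helps, here is a plain-shell analogy that needs no Redis server (the file path and event names are made up for illustration): once rows from tables A and B land in one merged keyspace keyed by event name, a single pass produces every event's count, instead of one statistics run per table.

```shell
# Rows from "table A" (start_app events) and "table B" (open_settings
# events) merged into one stream of "<event> <user>" records:
printf '%s\n' 'start_app u1' 'start_app u2' 'open_settings u1' > /tmp/events.txt

# One traversal over the merged data yields per-event counts, much like a
# single scan over per-event Redis sets would:
sort /tmp/events.txt | awk '{print $1}' | uniq -c
```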

The application scenario is therefore: for each matching table in MySQL, export the rows that satisfy the same condition (for example, rows from the same day) in batch, import them into Redis, and merge the statistics there. The export/import script (mysql2redis.sh) is as follows:

#!/bin/bash
mysql_host=192.168.x.xx
mysql_user=xiaomo
mysql_pwd=xiaomo
database=test_db
tbls_prefix="taobao"

# Date parameter passed in when the script is called
cur_dt="$1"

# Traverse the database and get the list of table names
table_list=$(mysql -h$mysql_host -u$mysql_user -p$mysql_pwd $database -A -Bse "show tables")

# Generate the SQL that prints one line of Redis protocol (RESP) per row:
# an HMSET with a key and 4 field/value pairs, i.e. 10 arguments (*10).
# The final '\r' is completed into CRLF by the '\n' that mysql itself
# appends to every output row.
function gen_sql() {
    src_tbl=$1
    mysql2redis="SELECT CONCAT(
        '*10\r\n',
        '\$', LENGTH(redis_cmd), '\r\n', redis_cmd, '\r\n',
        '\$', LENGTH(redis_key), '\r\n', redis_key, '\r\n',
        '\$', LENGTH(hkey1), '\r\n', hkey1, '\r\n',
        '\$', LENGTH(hval1), '\r\n', hval1, '\r\n',
        '\$', LENGTH(hkey2), '\r\n', hkey2, '\r\n',
        '\$', LENGTH(hval2), '\r\n', hval2, '\r\n',
        '\$', LENGTH(hkey3), '\r\n', hkey3, '\r\n',
        '\$', LENGTH(hval3), '\r\n', hval3, '\r\n',
        '\$', LENGTH(hkey4), '\r\n', hkey4, '\r\n',
        '\$', LENGTH(hval4), '\r')
    FROM (
        SELECT 'hmset' AS redis_cmd,
               uniq_id AS redis_key,
               'f1' AS hkey1, f1 AS hval1,
               'f2' AS hkey2, f2 AS hval2,
               'f3' AS hkey3, f3 AS hval3,
               'f4' AS hkey4, f4 AS hval4
        FROM $src_tbl WHERE dt = '$cur_dt'
    ) AS t"
    echo "$mysql2redis"
}

prefix_len=$(expr length $tbls_prefix)
for arg in $table_list
do
    # Only handle tables whose name starts with the given prefix
    if [[ "${arg:0:$prefix_len}" == "$tbls_prefix" ]]
    then
        mysql2redisCmd=$(gen_sql $arg)
        echo "$mysql2redisCmd" | mysql -u$mysql_user -p$mysql_pwd -h$mysql_host $database --skip-column-names --raw | redis-cli -n 1 --pipe
    fi
done
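To make the generated output concrete, here is what the SELECT produces for one example row (the values uniq_id=u1001 and f1..f4=10/20/30/40 are made up for illustration). Each argument is sent as `$<length>`, the data, then CRLF, preceded by a `*10` header for the 10 arguments of the HMSET:

```shell
# RESP stream for: HMSET u1001 f1 10 f2 20 f3 30 f4 40
# (*10 = command name + key + 4 field/value pairs)
resp='*10\r\n$5\r\nhmset\r\n$5\r\nu1001\r\n$2\r\nf1\r\n$2\r\n10\r\n$2\r\nf2\r\n$2\r\n20\r\n$2\r\nf3\r\n$2\r\n30\r\n$2\r\nf4\r\n$2\r\n40\r\n'
printf "$resp"
# Piping this stream into "redis-cli -n 1 --pipe" would create the
# hash u1001 in database 1.
```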

Call example, passing the target date as the first argument: /bin/bash mysql2redis.sh <date>


For more information, see the Redis RESP protocol documentation: http://redis.io/topics/protocol


