About backing up massive data tables collected with the ARCHIVE engine


Back up table data by ID interval daily

#!/bin/bash

# Record a date marker and the current max(id) of t_collection in id.log.
date -d yesterday +"DATE %Y-%m-%d %H:%M:%S" >> /mysql_data/t_collection/id.log
/usr/bin/mysql -uroot --password="Admin" -D userbehavior -e "select max(id) from t_collection;" | grep -v 'max(id)' >> /mysql_data/t_collection/id.log

# The last line of id.log is today's max(id); the max(id) recorded yesterday is the lower bound.
today=`tail -n 1 /mysql_data/t_collection/id.log`
yesterday=`tail -n 3 /mysql_data/t_collection/id.log | grep -v 'DATE' | grep -v $today`

sql="use userbehavior;
select * from t_collection where id > $yesterday and id <= $today
into outfile '/mysql_data/t_collection/`date -d yesterday +"%Y-%m-%d"`.csv'
fields terminated by ',' enclosed by '\"' escaped by '\\\\' lines terminated by '\n'"

/usr/bin/mysql -uroot --password="Admin" -D userbehavior -e "$sql"

cd /mysql_data/t_collection
tar czvf `date -d yesterday +"%Y-%m-%d"`.tgz `date -d yesterday +"%Y-%m-%d"`.csv
rm -f /mysql_data/t_collection/`date -d yesterday +"%Y-%m-%d"`.csv
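
A dump produced this way can be loaded back with LOAD DATA INFILE, provided the same field and line options are used. Below is a minimal restore sketch; the target table t_collection_restore and the example date are assumptions for illustration, not part of the original script:

# Restore sketch (illustrative): load one day's CSV into a table with the same structure.
# The fields/lines options must match the SELECT ... INTO OUTFILE options used above;
# the MySQL account needs the FILE privilege to read server-side files.
/usr/bin/mysql -uroot --password="Admin" -D userbehavior -e "
load data infile '/mysql_data/t_collection/2015-04-01.csv'
into table t_collection_restore
fields terminated by ',' enclosed by '\"' escaped by '\\\\'
lines terminated by '\n';"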

Back up table data for a specific interval

mysql -p'password' -D userbehavior -e "select * from t_collection where id > 3899999999 and id < 4000000000 into outfile '/mnt/backup/t_c/40ww.csv' fields terminated by ',' enclosed by '\"' escaped by '\\\\' lines terminated by '\n';"
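
Before compressing or deleting an export like this, it can help to confirm that the file and the table agree on the number of rows. A small sanity-check sketch (an addition of mine, not in the original post; it presumes the exported columns contain no embedded newlines, so one CSV line equals one row):

# Compare the exported line count with the row count for the same id range.
csv_rows=$(wc -l < /mnt/backup/t_c/40ww.csv)
db_rows=$(mysql -p'password' -D userbehavior -N -e "select count(*) from t_collection where id > 3899999999 and id < 4000000000;")
echo "csv=$csv_rows db=$db_rows"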

After the program was changed to write into a separate table per day (the table name carries the previous day's date as a suffix), the backup script is as follows:

#!/bin/bash
# Yesterday's data lives in its own per-day table: t_collection<yyyymmdd>.
table=t_collection`date -d yesterday +"%Y%m%d"`

# Newlines and carriage returns in path/content are replaced with '@' so each row stays on one line.
sql="use userbehavior;
select id, replace(replace(path, '\n', '@'), '\r', '@'), replace(replace(content, '\n', '@'), '\r', '@'), createtime
from $table
into outfile '/data/backup/mysql_data/new_collection/`date -d yesterday +"%Y-%m-%d"`.csv'
fields terminated by '|' lines terminated by '\n'"

/usr/bin/mysql -uroot --password="Password" -D userbehavior -e "$sql"

cd /data/backup/mysql_data/new_collection/
tar czvf `date -d yesterday +"%Y-%m-%d"`.tgz `date -d yesterday +"%Y-%m-%d"`.csv
rm -f `date -d yesterday +"%Y-%m-%d"`.csv
echo "$(date +%Y%m%d_%H:%M:%S) $table backup finished" >> /data/backup/mysql_data/new_collection/t_collection_bak.log
cp `date -d yesterday +"%Y-%m-%d"`.tgz /mnt/gz_guanggao_29/collection/
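
For completeness, a per-day dump in this '|'-delimited format could be loaded back with LOAD DATA INFILE; the sketch below assumes a hypothetical target table and an example date. Note that the export replaces newlines and carriage returns with '@', so the original line breaks in path and content cannot be recovered:

# Restore sketch (illustrative): target table and file date are placeholders.
/usr/bin/mysql -uroot --password="Password" -D userbehavior -e "
load data infile '/data/backup/mysql_data/new_collection/2015-04-01.csv'
into table t_collection_restore
fields terminated by '|'
lines terminated by '\n'
(id, path, content, createtime);"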

Add to crontab:

0 * * * * root sh /data/backup/scriptshell/t_collectionnew.sh


#!/bin/bash

Sql= "Use userbehavior;\n
Select a.* from Coocaadaohang a\n
where a.createtime like ' $1% ' \ n
into outfile '/data/backup/mysql_data/daohang/$1.csv ' \ n
CHARACTER SET GBK \ n
Fields terminated by ', ' \ n
Optionally enclosed by ' \ ' ' escaped ' \\\ ' \ n
Lines terminated by ' \ r \ n ';

/usr/bin/mysql-uroot--password= "Password"-D userbehavior-e "${sql}";
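
Because the date prefix is taken from the first argument, the script can also be run by hand to export any given day; for example (the date is only illustrative):

sh /data/backup/scriptshell/coocaadaohang.sh 2015-04-01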

Add to crontab (note that % must be escaped as \% in a crontab entry, because cron otherwise treats an unescaped % as a line separator):

0 * * * * root /data/backup/scriptshell/coocaadaohang.sh `date -d yesterday +"\%Y-\%m-\%d"`



This article is from the blog "My Operations Blog"; please retain the source: http://linuxpython.blog.51cto.com/10015972/1647261

