MySQL full backup and incremental backup implementation


1. Enable the binary log (binlog)
Turn on MySQL's binlog function by adding the following to the /etc/my.cnf file:

[mysqld]
log-bin = "/home/mysql/logbin.log"
binlog-format = ROW
log-bin-index = "/home/mysql/logindex"
binlog_cache_size=32m
max_binlog_cache_size=512m
max_binlog_size=512m

Restart MySQL, and change the owner and group of the /home/mysql path to mysql.
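For example, a minimal sketch on a systemd-based system (the mysqld service name is an assumption; the root credentials match the scripts below):

chown -R mysql:mysql /home/mysql

# Restart so the binlog settings take effect (use "service mysqld restart" on older systems)
systemctl restart mysqld

# Optional sanity check: confirm the binary log is enabled
mysql -uroot -proot123 -e "SHOW VARIABLES LIKE 'log_bin%';"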

2. Incremental backup
Create the following directory under /home/mysql:

mkdir -p /home/mysql/backup/daily

Incremental backup script:

cd /home/mysql
vi binlogbak.sh

#!/bin/bash
export LANG=en_US.UTF-8

BakDir=/home/mysql/backup/daily
BinDir=/home/mysql
LogFile=/home/mysql/backup/binlog.log
BinFile=/home/mysql/logindex.index

# flush-logs closes the current binlog and starts a new mysql-bin.00000* file
mysqladmin -uroot -proot123 flush-logs

Counter=`wc -l $BinFile | awk '{print $1}'`
NextNum=0

# Compare $NextNum with $Counter to tell already-rotated binlogs from the current one
for file in `cat $BinFile`
do
    base=`basename $file`          # strip the leading ./ from entries such as ./mysql-bin.000005
    NextNum=`expr $NextNum + 1`
    if [ $NextNum -eq $Counter ]
    then
        # the last entry in the index is the binlog still being written, so skip it
        echo $base skip! >> $LogFile
    else
        dest=$BakDir/$base
        if (test -e $dest)         # test -e: the file has already been backed up
        then
            echo $base exist! >> $LogFile
        else
            cp $BinDir/$base $BakDir
            echo $base copying >> $LogFile
        fi
    fi
done
echo `date +"%Y-%m-%d %H:%M:%S"` Backup succ! >> $LogFile

Give binlogbak.sh execute permission:

chmod a+x /home/mysql/binlogbak.sh
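As a quick sanity check (a sketch; credentials and paths follow the script above), you can list the binlogs the server knows about, run the script once by hand, and inspect what landed in the daily directory:

mysql -uroot -proot123 -e "SHOW BINARY LOGS;"

/home/mysql/binlogbak.sh
cat /home/mysql/backup/binlog.log
ls -lh /home/mysql/backup/daily/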

3. Full backup

vi databak.sh

#!/bin/bash
export LANG=en_US.UTF-8

BakDir=/home/mysql/backup
LogFile=/home/mysql/backup/bak.log
Date=`date +%Y%m%d`
Begin=`date +"%Y-%m-%d %H:%M:%S"`

cd $BakDir
DumpFile=$Date.sql
GZDumpFile=$Date.sql.tgz

mysqldump -uroot -proot123 --all-databases --flush-logs --delete-master-logs --single-transaction > $DumpFile
tar -czvf $GZDumpFile $DumpFile
rm $DumpFile

count=$(ls -l *.tgz | wc -l)
if [ $count -ge 5 ]
then
    file=$(ls -l *.tgz | awk '{print $9}' | awk 'NR==1')
    rm -f $file
fi
# Keep only the last four weeks of full dumps: once five archives exist, the oldest is deleted

Last=`date +"%Y-%m-%d %H:%M:%S"`
echo Begin: $Begin End: $Last $GZDumpFile succ >> $LogFile

# The new full dump covers everything up to now, so clear the daily binlog copies
cd $BakDir/daily
rm -f *

Give databak.sh execute permission:

chmod a+x /home/mysql/databak.sh
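Before wiring this into cron, it can be worth running the full backup once by hand and checking the resulting archive and log (paths follow the script above):

/home/mysql/databak.sh
ls -lh /home/mysql/backup/*.tgz
tail /home/mysql/backup/bak.log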

4. Set up the scheduled tasks

vi /etc/crontab

# Run the full backup script at 3:00 AM every Sunday
0 3 * * 0 /home/mysql/databak.sh > /dev/null 2>&1
# Run the incremental backup at 3:00 AM Monday through Saturday
0 3 * * 1-6 /home/mysql/binlogbak.sh > /dev/null 2>&1

Load the file so the scheduled tasks take effect:

crontab /etc/crontab

View the scheduled tasks:

crontab -l

Complete.
