Basic tutorial on automatically backing up MySQL data on a Linux system


1. First create a database backup directory:

mkdir backup
cd backup
mkdir mysqlbackup
cd mysqlbackup

2. Create a backup script

vi mysqlautobackup

3. Write the script:

filename=`date +%Y%m%d`
/path/to/mysql/bin/mysqldump --opt DATABASE_NAME -uMYSQL_USER -pMYSQL_PASSWORD | gzip > /path/to/backup/dir/name$filename.gz


Note: the command above uses gzip compression, and "name" can be anything you like. There must be no space between -u and the MySQL account, and likewise none between -p and the MySQL password.
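The backticks around the date command are shell command substitution; a minimal, runnable sketch of the filename construction (placeholder names only; the mysqldump line is left as a comment so the snippet works without a MySQL server):

```shell
# Build a dated filename as in step 3. $(...) is the modern equivalent
# of the backtick substitution used in the script.
filename=$(date +%Y%m%d)          # e.g. 20150121
dumpfile="name$filename.gz"
# The full backup command would then be (placeholder paths and credentials):
# /path/to/mysql/bin/mysqldump --opt DATABASE_NAME -uMYSQL_USER -pMYSQL_PASSWORD \
#     | gzip > /path/to/backup/dir/"$dumpfile"
echo "$dumpfile"
```

Because the date is embedded in the name, each day's run produces a distinct file instead of overwriting yesterday's backup.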
4. Give the script executable permissions:

chmod +x mysqlautobackup


5. Create the scheduled task:

crontab -e

Add the line:

01 1 * * * /bin/bash /path/to/script/mysqlautobackup


This runs the script at 1:01 a.m. every day.
Restart the cron service:

/etc/rc.d/init.d/crond restart


This completes all steps.

Some common tips
1. Create the path for storing backup files, /mysqldata:

# mkdir /mysqldata

2. Create the file /usr/sbin/baktest
Enter the following content:

mysqldump -uroot -proot test | gzip > /mysqldata/test`date +%Y-%m-%d_%H%M%S`.sql.gz

3. Modify the file's permissions so it can be executed:

# chmod +x /usr/sbin/baktest

4. Modify /etc/crontab
Add the following line:

01 3 * * * root /usr/sbin/baktest

This backs up the database at 3:01 a.m. every day.

5. Restart crond:

# /etc/rc.d/init.d/crond restart

Complete.


PS:
A full backup script:

#!/bin/bash
# NAME: QBK full backup script
# mysql QBK scripts
# by zxsdw.com
# last modify: 2015-01-21
# Define the script directory
#scriptsDir=/usr/sbin
# Define the user name and password
user=root
userPWD=password
# Define the database to back up
dataBase=database_name
# Define the full-backup storage path
bakDir=/backup/mysql
#eMailFile=$bakDir/email.txt
#eMail=admin@zxsdw.com
# Define the log file
logFile=$bakDir/mysqlbak.log
DATE=`date +%Y%m%d`
echo "" >> $logFile
echo "--------------------------" >> $logFile
echo $(date +"%Y-%m-%d %H:%M:%S") >> $logFile
echo "--------------------------" >> $logFile
cd $bakDir
dumpFile=$DATE.sql.gz
mysqldump --flush-logs -u$user -p$userPWD --quick $dataBase | gzip > $dumpFile
echo "Dump done" >> $logFile
echo "[$dumpFile] backup success!" >> $logFile
daily_dataBakDir=$bakDir/daily_backup
cd $daily_dataBakDir
find $daily_dataBakDir -name "daily*" -type f -mtime +35 -exec rm {} \; > /dev/null 2>&1
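The pruning idiom the script ends with deletes files under the daily backup directory that are older than 35 days. A small sketch makes the behavior visible without touching real backups, by faking file ages with GNU touch -d in a temporary directory:

```shell
# Demonstrate find -mtime pruning: files older than 35 days are removed,
# newer files are kept. Assumes GNU touch (-d with a relative date).
dir=$(mktemp -d)
touch -d "40 days ago" "$dir/daily_old.sql.gz"   # older than the cutoff
touch "$dir/daily_new.sql.gz"                    # fresh file, should survive
find "$dir" -name "daily*" -type f -mtime +35 -exec rm {} \; > /dev/null 2>&1
ls "$dir"
```

-mtime +35 matches files whose modification time is more than 35 whole days ago, so the fresh file survives and only the aged one is deleted.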


An incremental backup script:

#!/bin/bash
# NAME: ZBK incremental backup
# mysql ZBK scripts
# by zxsdw.com
# last modify: 2015-01-21
# Define the database user name and password
user=root
userPWD=password
# Define the database
dataBase=database_name
# Generate a new mysql-bin.00000x file; if the error log has been
# cleared, flushing also recreates it automatically.
/usr/local/mysql/bin/mysqladmin -u$user -p$userPWD flush-logs
# Define the incremental backup location
daily_dataBakDir=/backup/mysql/daily_backup
# Define the MySQL data/log directory
mysqlDataDir=/usr/local/mysql/var
# Define the incremental log file and directory
eMailFile=$daily_dataBakDir/email.txt
#eMail=admin@zxsdw.com
# Define the DATE variable, in the form 20150127
DATE=`date +%Y%m%d`
# Define an overall log file
logFile=$daily_dataBakDir/mysql$DATE.log
# Beautify the log template
echo "" > $eMailFile
echo "--------------------------" >> $eMailFile
# Time in the form 15-01-27 01:06:17
echo $(date +"%y-%m-%d %H:%M:%S") >> $eMailFile
echo "--------------------------" >> $eMailFile
# Define the cutoff for deleting bin logs, in the form 20150124010540
TIME=$(date -d "3 days ago" +%Y%m%d%H%M%S)
# Define the start of the range to back up incrementally, in the form 2015-01-26 01:04:11
StartTime=$(date -d "1 day ago" +"%Y-%m-%d %H:%M:%S")
########## Beautify the log title for the delete step ##########
echo "Delete logs from 3 days ago" >> $eMailFile
# Delete bin files older than three days and update the records in the index file
mysql -u$user -p$userPWD -e "PURGE MASTER LOGS BEFORE ${TIME}" && echo "Deleted logs from 3 days ago" | tee -a $eMailFile
# Find the bin files listed in the index and assign them to i
fileName=`cat $mysqlDataDir/mysql-bin.index | awk -F'/' '{print $NF}'`
for i in $fileName
do
    ######### Beautify the log title for the incremental backup step #########
    echo "$StartTime start backup binlog" >> $eMailFile
    # Use mysqlbinlog to back up the data added in the last day,
    # gzip-compressed into the incremental backup directory
    /usr/local/mysql/bin/mysqlbinlog -u$user -p$userPWD -d $dataBase --start-datetime="$StartTime" $mysqlDataDir/$i | gzip >> $daily_dataBakDir/daily$DATE.sql.gz
done
# If the backup above succeeded, run the cleanup below
if [ $? = 0 ]
then
    # Delete incremental log backup files with mtime > 32 days
    find $daily_dataBakDir -name "*.log" -type f -mtime +32 -exec rm {} \; > /dev/null 2>&1
    cd $daily_dataBakDir
    echo "Daily backup succeed" >> $eMailFile
else
    echo "Daily backup fail" >> $eMailFile
    #mail -s "MySQL Backup" $eMail < $eMailFile   # send a mail notification on failure
fi
# Copy the contents of eMailFile into the log file
cat $eMailFile > $logFile
# In case the IF branch above failed, delete log files older than 32 days again
find $daily_dataBakDir -name "*.log" -type f -mtime +32 -exec rm {} \; > /dev/null 2>&1
# Optionally sync yesterday's files with a remote host ($ip must be defined elsewhere)
rsync -vzrtopg --delete --progress --password-file=/usr/local/rsync/rsync.passwd root@$ip:/zxs/allimg/$(date -d -1day +%Y%m%d) /zxs/allimg/

To restore from a compressed backup (the --force flag tells mysql to ignore errors and continue):

gunzip < /backup/mysql/daily_backup/ceshi.sql.gz | /usr/local/mysql/bin/mysql -uUSER -pPASSWORD DATABASE_NAME --force
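The restore pipes a gzipped dump into mysql. The gunzip half of that pipeline can be exercised without a server: compress a tiny stand-in "dump", then decompress it exactly as the restore command does (paths here are temporary, credentials in the comment are placeholders):

```shell
# Round-trip a sample dump through gzip/gunzip to show the restore pipeline.
work=$(mktemp -d)
printf 'CREATE TABLE t (id INT);\n' > "$work/dump.sql"
gzip "$work/dump.sql"                      # produces dump.sql.gz
restored=$(gunzip < "$work/dump.sql.gz")   # same decompression the restore uses
# A real restore would then be (placeholder credentials):
# gunzip < dump.sql.gz | mysql -uUSER -pPASSWORD DATABASE_NAME --force
echo "$restored"
```

Streaming through gunzip means the SQL is never written to disk uncompressed, which matters when dumps are large.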
