Automatically back up a Linux server locally for 7 days and remotely to Qiniu cloud storage


①, Local seven-day rotating backup

Constrained by server space, we cannot keep a fresh copy of the website data on the VPS for every single day: it is unnecessary, and it would eat up disk space. The compromise is to name each backup after the day of the week, so that every backup overwrites the file from the same weekday of the previous week. This keeps only 7 copies of the data without occupying an excessive amount of space.

If your VPS has little disk space or the site data is very large, you can compromise further and back up only on, say, Tuesday, Thursday, and Saturday, keeping three copies in total. You can even back up on just one day of the week, i.e. a weekly backup, and that works too.

If space permits, I personally recommend the 7-day rotating backup: if the VPS ever loses data, it gives the maximum disaster tolerance, letting you recover everything up to the early-morning backup of the day before the loss. Perfect!
②, Qiniu remote backup

Qiniu remote backup means syncing the local 7-day backup data to a private bucket on Qiniu cloud storage every day, so that even if the entire VPS crashes you can still calmly recover the data. It is effectively double insurance: the probability of Qiniu and your VPS being down at the same time is very small.
II. Preparatory work

①, Collect the database user names and passwords on the VPS (if unsure, the MySQL root account can be used directly);

②, Collect the root directory of each site on the VPS, e.g. /home/wwwroot/bandwagonhost.biz;

③, Decide on a local path for storing the backup files, e.g. /home/wwwbackup;

④, If you do not have a Qiniu account yet, register one, then create a new private bucket (for data security, do not use a public bucket; switch it to public only temporarily when recovering data), and note down the bucket name and the account keys:

(Screenshots from the original post: the Qiniu private bucket and account keys)
III. Local seven-day backup
①, Log in to the VPS and write the script

The script code is as follows:
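The full script sits behind an expand-code control in the source page, so it is not reproduced here. What follows is only a minimal sketch of what such a backup.sh could look like, pieced together from the usage, variable names ($mysqluser, $mysqlpd, $back_path, $domain, $today) and file naming quoted later in this post; the line numbers mentioned later (the password on line 2, the zip command on line 29) refer to the original script, not to this sketch:

Shell
#!/bin/bash
zip_passwd=MyPassword        # compression password, change this (the original keeps it on line 2)

today=$(date +%u)            # day of week 1-7, so each run overwrites last week's file for the same day

show_help() {
    echo "Usage:"
    echo "  $0 db   <domain> <dbname> <mysqluser> <mysqlpassword> <back_path>"
    echo "  $0 file <domain> <site_path> <back_path>"
}

case "$1" in
    db)
        domain=$2; dbname=$3; mysqluser=$4; mysqlpd=$5; back_path=$6
        mkdir -p "$back_path"
        # dump the database, then zip it with a password and drop the plain .sql (-m moves it into the archive)
        mysqldump -u$mysqluser -p$mysqlpd $dbname > $back_path/${domain}_db_${today}.sql && \
            zip -P $zip_passwd -9mj $back_path/${domain}_db_${today}.zip $back_path/${domain}_db_${today}.sql
        ;;
    file)
        domain=$2; site_path=$3; back_path=$4
        mkdir -p "$back_path"
        # archive the whole site directory (absolute paths are kept, as noted in section V)
        zip --version > /dev/null && zip -P $zip_passwd -9r $back_path/${domain}_${today}.zip $site_path
        ;;
    *)
        show_help
        ;;
esac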

Instructions for use:
②, Change MyPassword on line 2 of the code to a compression password of your own (to keep the data secure), save the script as backup.sh, and run chmod +x backup.sh to give it execute permission.

To avoid encoding problems when copying the code by hand, the script is also shared as a packaged download.

PS: The script in the downloadable package does not use encrypted packaging; refer to the code above and add the "-P password" parameter yourself.
③, Run ./backup.sh --help to get the following help text:

(Screenshot: help output of backup.sh)
④, Back up a database:

The command-line example is as follows:
Shell
./backup.sh db zhangge.net zhangge_db zhangge 123456 /home/wwwbackup/bandwagonhost.biz

Command parameter description:

Parameter 1: db, sets the backup type to database

Parameter 2: domain, the site domain name, e.g. zhangge.net, used to name the backup file

Parameter 3: dbname, the name of the database to back up, e.g. zhangge_db

Parameter 4: mysqluser, the MySQL user name; if unsure, the MySQL root account can be used

Parameter 5: mysqlpassword, the MySQL password

Parameter 6: back_path, the path where the backup file is stored

⑤, Back up website files:

The command-line example is as follows:
Shell
./backup.sh file zgboke.com /home/wwwroot/zgboke.com /home/wwwbackup/bandwagonhost.biz

Command parameter description:

Parameter 1: file, sets the backup type to website files

Parameter 2: domain, the site domain name, e.g. zgboke.com, also used to name the backup file

Parameter 3: site_path, the path of the website files, e.g. /home/wwwroot/bandwagonhost.biz

Parameter 4: back_path, the path where the backup file is stored

Tips: To make the later Qiniu synchronization easier, it is recommended to store the backup files under the same parent directory and name each sub-directory after its site's domain. For example, I put all the backup data of this VPS under /home/wwwbackup/ and name the sub-directories after the respective domains, so zhangge.net's database and file backups are both stored in /home/wwwbackup/zhangge.net.

Late addition (2014/10/25): when running the backup command manually to check it, the following error may be reported:

Access denied for user 'dbuser'@'localhost' to database 'db' when using LOCK TABLES

This means the user does not have the LOCK TABLES privilege, so the data export fails. It typically happens when the database is running a query at that moment, and a normal account then cannot export the backup.

Workaround:

Method ①: modify the backup script above and find the following line:
Shell
mysqldump -u$mysqluser -p$mysqlpd $dbname > $back_path/${domain}_db_${today}.sql

Add the --skip-lock-tables option, i.e. export without locking tables (some data being updated at that very moment might be missed, but in the early morning hours the probability is very small):
Shell
mysqldump -u$mysqluser -p$mysqlpd --skip-lock-tables $dbname > $back_path/${domain}_db_${today}.sql

Method ②: perform the backup with the MySQL root account:

Run crontab -e to edit the Linux scheduled tasks and change the database backup command line to use the MySQL root account:
Shell
./backup.sh db zhangge.net zhangge_db root rootpasswd /home/wwwbackup/zhangge.net

I personally recommend Method ②, which best guarantees the completeness of the data backup!

⑥, Set up the seven-day backup schedule

I. First run crontab -l to check whether the command exists; if it does not, install crond with yum -y install vixie-cron crontabs.

II. After confirming that crond is present, run /etc/init.d/crond start to make sure the crond service is running, and run chkconfig crond on so that it starts at boot (the commands are summarized in the sketch below).
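Summarized as commands, a sketch for the yum/SysV-init environment this post assumes (e.g. CentOS 6):

Shell
crontab -l                            # check whether crontab/crond is available
yum -y install vixie-cron crontabs    # install crond if it is missing
/etc/init.d/crond start               # make sure the crond service is running
chkconfig crond on                    # have crond start automatically at boot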

III. Run crontab -e to set up the scheduled tasks. Below is the task plan on my VPS, for reference:

#backup zgboke.com:
0 3 * * * /root/scripts/backup.sh db zgboke.com zgboke zgboke 123456 /home/wwwbackup/zgboke.com >/dev/null 2>&1
5 3 * * * /root/scripts/backup.sh file zgboke.com /home/wwwroot/zgboke.com /home/wwwbackup/zgboke.com >/dev/null 2>&1
#backup zhangge.net:
10 3 * * * /root/scripts/backup.sh db zhangge.net zhangge zhangge 123456 /home/wwwbackup/zhangge.net >/dev/null 2>&1
15 3 * * * /root/scripts/backup.sh file zhangge.net /home/wwwroot/zhangge.net /home/wwwbackup/zhangge.net >/dev/null 2>&1

Each site takes two lines: the first line backs up the database and the second backs up the website files.

Crontab entry format: {minute hour day month weekday + command line}

So, taking the second line as an example: at 3 o'clock every morning the command line that follows is executed, and the trailing >/dev/null 2>&1 suppresses all output.

Note: the script must be referenced by its absolute path, e.g. /root/scripts/backup.sh, and must have execute permission; run chmod +x on it again if needed.

After completing the above steps, your VPS has a local seven-day rotating backup mechanism. Next, let's sync it to Qiniu.
IV. Qiniu remote backup

The principle of remote backup to Qiniu has been explained in a previous post; if anything is unclear, please refer to: https://zhangge.net/4221.html

Here are the specific steps:
①, Download my packaged Qiniu sync kit (64-bit only):

Shell
wget http://static.zhangge.net/diy_tools/QN_Backup_tools.zip


②, Unpack the toolkit:

unzip QN_Backup_tools.zip

Once decompression is complete, you will get a tools folder containing three files:

qrsync: the Qiniu sync tool (Linux 64-bit only; for other platforms, download the matching build from Qiniu and replace this file)

qrsync.conf: a generic configuration template

config.sh: a helper script that initializes the configuration

③, Configure qrsync and schedule the task
I. Those comfortable with vim and crontab -e can skip config.sh and edit qrsync.conf directly with vim:

{"src": "/home/wwwbackup", "dest": "Qiniu:access_key=your ak&secret_key=your sk&bucket=bucket_name& threshold=512000 "," deletable ": 0," Debug_level ": 1}
1

{"src": "/home/wwwbackup", "dest": "Qiniu:access_key=your ak&secret_key=your sk&bucket=bucket_name& threshold=512000 "," deletable ": 0," Debug_level ": 1}

In the code, just modify /home/wwwbackup and the three parameters Your AK, Your SK, and bucket_name:

/home/wwwbackup: the directory to sync to Qiniu, i.e. the storage path used by the seven-day backups above

Your AK: the Access Key of your Qiniu account

Your SK: the Secret Key of your Qiniu account

bucket_name: the name of the Qiniu private bucket that will store the backup files

After making the changes, run crontab -e and add the following scheduled task:

#Add by QN_Backup scripts
0 4 * * * /root/tools/qrsync /root/tools/qrsync.conf >/dev/null 2>&1 &

PS: This means the sync to Qiniu runs at 4 o'clock every morning; remember to change the paths to the actual locations of qrsync and qrsync.conf.
II. Those not familiar with vim and crontab can run the config.sh initialization script I wrote instead; an example session:

Shell
[root@your_server tools]# sh config.sh
Please input the Access_key:    <enter your Qiniu Access Key>
Please input the Secret_key:    <enter your Qiniu Secret Key>
Please input the bucket name:   <enter your Qiniu private bucket name>
Please input the backup path:   <enter the local seven-day backup path, e.g. /home/wwwbackup>
# After pressing Enter, initialization starts; on success you will see something like:
=========================the crontab list===============================
...
...
#Add by QN_Backup scripts
0 4 * * * /root/tools/qrsync /root/tools/qrsync.conf >/dev/null 2>&1 &
=========================the crontab list===============================
Configure success!

After completing all of the above steps, your VPS has both a local seven-day rotating backup and a Qiniu remote backup!
V. Additional notes
①, Precautions

I. For general applicability, the script compresses files together with their absolute paths, so the original absolute paths are restored on decompression.

For example, when zhangge.net_7.zip is unzipped, every file carries the path it had before the backup, e.g. /home/wwwroot/bandwagonhost.biz/.
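You can confirm this by listing the archive contents; the path below is just an illustration:

Shell
unzip -l /home/wwwbackup/zhangge.net/zhangge.net_7.zip | head
# the entries appear as home/wwwroot/..., i.e. with the original absolute path preserved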

If you understand the script and do not want the absolute paths in the archive, you can tweak it slightly so that it only compresses the site directory itself.

For example, line 29 of the script can be changed to the following:
Shell
cd $site_path && cd ..
zip --version > /dev/null && zip -9r $back_path/${domain}_${today}.zip $domain

Note that with this change the site directory name must match the domain parameter you pass in! If you do not fully understand the script, changing it is not recommended; keeping the absolute paths does not affect manual recovery later anyway.

II. The backup files carry the permissions of whoever runs the script. For example, if the backup script is executed as root, the owner and permissions of the compressed package belong to root, so when restoring the site data you must use chown to give the files back to the current web user. In an lnmp environment, for instance, where the web user is www, remember to run chown -R www:www /site_path after unzipping the backup to restore ownership, otherwise the site may show a blank page!
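A minimal restore sketch under those assumptions (the file name, zip password and www web user are illustrative):

Shell
cd / && unzip -P MyPassword /home/wwwbackup/zhangge.net/zhangge.net_7.zip   # absolute paths unpack back into place
chown -R www:www /home/wwwroot/zhangge.net                                  # hand the files back to the web user (lnmp: www)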
②, My VPS's scheduled tasks, for reference:

(Screenshot: the crontab entries on my VPS)
③, Local backups:

(Screenshot: the local backup files)

PS: The number in each file name is the day of the week; names containing db are database backups and those without are the website file backups. Each site ends up with 14 files (7 database backups and 7 website file backups).
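After a full week, one site's backup directory would therefore look roughly like this (file names are hypothetical; the exact extension of the database dumps depends on your version of the script):

Shell
ls /home/wwwbackup/zhangge.net/
# zhangge.net_1.zip    ...  zhangge.net_7.zip       <- website files, one per weekday
# zhangge.net_db_1.zip ...  zhangge.net_db_7.zip    <- database backups, one per weekday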
④, Qiniu backups:



The data on Qiniu stays consistent with the local data. The Qiniu sync tool also has the advantage of using hash values to decide whether a file has changed, instead of re-uploading identical files, so each day's sync only transfers the backup packages generated that day.
VI. Outlook

This article is coming to an end, but if you think about it carefully there is still room for improvement.

For example, a site-recovery mechanism could be added: sync the account details and other configuration data to Qiniu as well, so that when the VPS dies completely and its data cannot be retrieved, a script can download the data straight from Qiniu and, following the stored configuration, automatically restore everything to a specified date, with hardly any manual intervention. Sounds powerful, doesn't it?

Such a fully automatic recovery mechanism could not only restore a VPS quickly after a total failure, but could also be used for automatic site migration. Doesn't that feel a bit like the recently launched multi-backup services?

In fact it is quite simple to implement: another script would do it. Considering practicality, environment compatibility and other factors, though, I will not spend the effort on it for now.

