Linux (CentOS): fully automated offsite backup of website data (web + MySQL)


Before the article begins, a question for webmasters: what matters most to a webmaster? Many things are important, of course, but if we set aside external factors and narrow the scope to the web system itself, what is essential? The website's data is one of those things.

Website data consists of the program files themselves, the attachments (pictures, documents, videos and so on), and the database files that are constantly generated while the site runs; together these three parts make up the website's data.

Take my own experience. In my two-plus years as a webmaster, the most painful moment was getting up one morning to find that my site would not open. DNS checked out fine, so I finally asked the hosting provider what had happened, and the answer was: the server's hard disk had failed and none of the data could be recovered. (A side note: small webmasters usually buy very low-end machines or hosting, so they tend not to pay extra for a data backup service.) The result was a painful loss. If this happens to a site that has only just started running, it is not such a heartbreak; starting over from scratch is not hard. But for a website that has been running for several years, a sudden accident like this is a heavy blow.

In my first few months as a webmaster I gave no thought to data backup, until one day I saw a group member's sad story in a chat group: his website had completely collapsed and all the data was gone. It was an old site that had been running for more than three years, and it fell just like that, a story sad enough to bring tears to the listener (a slight exaggeration).

Having seen such a living example, I know the consequences of not backing up data are serious. So my conclusion is: if you are serious about running a site, you must back up your data, and preferably offsite. Some friends think that keeping a backup on the same machine is enough to rest easy; that is a big mistake. Backing up also raises a practical problem: do you really want to log in to the server every time, pack up the relevant data yourself, and then download it over FTP? That gets tiring quickly. So today, in a Linux (CentOS 5.5) environment, let's look at how to back up the entire site's data to a remote (offsite) server fully automatically.

The requirements for automatic remote backup are: lftp installed on the server (the local end), and an FTP service running on the remote end with read/write permission on the login directory. With these two conditions met, you can proceed.

If lftp is not installed on the machine, install it with the following command:

# yum install lftp
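Before wiring the script in, it can also help to confirm by hand that the FTP account on the remote end can log in and write to its login directory. A minimal interactive check, assuming a hypothetical remote host backup.example.com and FTP account ftpuser (lftp will prompt for the password); if mkdir succeeds, the account has write permission:

# lftp -u ftpuser backup.example.com
lftp ftpuser@backup.example.com:~> mkdir testdir
lftp ftpuser@backup.example.com:~> rmdir testdir
lftp ftpuser@backup.example.com:~> exit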

Assume the script is stored in the /apps/script directory.
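If that directory does not exist yet, create it first:

# mkdir -p /apps/script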

# vi /apps/script/autobackup.sh    (enter the following)

#!/bin/bash
# author: www.5ishare.com

echo "Backup Job Start"

# Website directory to back up, i.e. the site's document root
file=/apps/wwwdata

# Local backup directory where the packaged files are stored
backpath=/apps/bkdata/

# Database details; fill in your own values. Wrap the password in single
# quotes, or in double quotes if the password itself contains a single quote.
dbuser=database_user_name
dbpwd='database_password'
dbname=database_name

# Remote (offsite) server and FTP account
domain=remote_domain_or_ip
ftpuser=ftp_user_name
ftppwd='ftp_user_password'

bkdate=$(date +%Y%m%d)

sqlfile_sql=${dbname}_${bkdate}.sql
sqlfile_gz=${dbname}_${bkdate}.tar.gz
sqlfp=${backpath}${bkdate}/${sqlfile_sql}

if [ -e ${backpath}${bkdate} ]
then
    echo "${backpath}${bkdate} exists."
else
    mkdir -p ${backpath}${bkdate}
fi

# Dump the database; change /apps/mysql/bin/mysqldump to the path in your environment
/apps/mysql/bin/mysqldump -u${dbuser} -p"${dbpwd}" ${dbname} > ${sqlfp}

# Compress the dump (alternatively, compress while dumping; see the note after the script)
tar zcf ${backpath}${bkdate}/${sqlfile_gz} -C ${backpath}${bkdate} ${sqlfile_sql}

# Delete the uncompressed SQL file
rm -f ${sqlfp}

# Resulting file name format: web_<site directory name>_20130419.tar.gz
fn=web_${file##*/}_${bkdate}.tar.gz
tar zcf ${backpath}${bkdate}/${fn} -C ${file%/*} ${file##*/}

# Connect via FTP and upload the local backup directory to the offsite server
/usr/bin/lftp ${domain} << END
user ${ftpuser} ${ftppwd}
lcd ${backpath}
mirror -R ${bkdate}
exit
END

echo "Backup Job done"

------------Divider Line--------------
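As the comment in the script notes, the dump can also be compressed while it is being taken instead of in a separate tar step. A minimal sketch using a gzip pipe, with the same variables as in the script (it produces a .sql.gz file rather than a .tar.gz):

/apps/mysql/bin/mysqldump -u${dbuser} -p"${dbpwd}" ${dbname} | gzip > ${backpath}${bkdate}/${dbname}_${bkdate}.sql.gz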

After saving and exiting, add execute permission to the script:

# chmod +x /apps/script/autobackup.sh
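Before handing the job over to cron, it is worth running the script once by hand and checking that both archives land in the dated local directory (and on the remote server). The listing below is only an illustration, assuming a hypothetical database called mydb and the example date 20130419:

# /apps/script/autobackup.sh
Backup Job Start
Backup Job done
# ls /apps/bkdata/20130419/
mydb_20130419.tar.gz  web_wwwdata_20130419.tar.gz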

# crontab -e    (add a scheduled task; set the backup interval to suit your needs)

0 4 1,16 * * /apps/script/autobackup.sh

I set it to run the backup at 4:00 a.m. on the 1st and 16th of each month. It is best not to run it in broad daylight, because the backup makes the whole system noticeably slower.
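To confirm that cron has picked the job up, list the crontab; on CentOS, cron activity is also written to /var/log/cron, so after a scheduled run you can check there as well:

# crontab -l
0 4 1,16 * * /apps/script/autobackup.sh
# grep autobackup /var/log/cron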

With the configuration above in place, a webmaster no longer has to worry about data backups; you could call it a once-and-for-all solution. The script is just how I wrote it for my own setup, and your needs may differ, but the essence is the same and a few small modifications should be enough. If you have other questions, you are welcome to leave a message so we can learn from each other.
