Before the article begins, let me ask fellow webmasters a question: what is truly important to a webmaster? Many things are, of course. But if we set aside external factors and narrow the scope to the site itself, what matters most? The website's data is one answer.
Website data falls into three big chunks: the program files themselves, the attachment data the site continually generates as it runs (pictures, documents, videos, and so on), and the database files.
Take my own experience. In my two-plus years as a webmaster, the most painful moment was getting up one day to find that the site I had painstakingly built would not open. DNS checked out fine, so I finally asked the hosting provider what had happened and learned the result: the server's hard disk had failed, and none of the data could be recovered. (A note here: small webmasters generally buy very low-end machines or hosting plans, which do not include any data backup service unless you pay extra, so the result is a painful loss.) If the site has only been running a short while, an accident like this does not hurt so much; starting over from scratch is not difficult. But for a site that has been running for years, a sudden accident like this is a heavy blow.
In my first few months as a webmaster I gave no thought to data backup, until one day I saw a group member's sad story: his site had completely collapsed, with no data left. It was an old site that had been running for more than three years, and it was gone. It truly brought tears to the listener's eyes (a bit of an exaggeration).
Having seen such a living example, I understood how serious the consequences of skipping data backup can be. So here is the lesson I have drawn: if you are serious about running a website, you must do a good job of backing up your data, and ideally back it up off-site. Some friends think a backup on the same machine is enough to rest easy; that is a big mistake. Backing up also raises a practical problem: are you going to log in to the server, package the relevant data, and download it back over FTP by hand every time? You will grow tired of that soon enough. So today, in a Linux (CentOS 5.5) environment, let's talk about how to automate remote (off-site) backup of the whole site's data.
Implementing automatic remote backup requires two things: lftp installed on the server (the local environment), and an FTP service running on the remote end with read/write permission on the target directory. With those two conditions met, we can proceed.
If the lftp tool is not yet installed on the machine, install it with the following command:
#yum install lftp
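Before running the install, it can be worth checking whether lftp is already present. This is a small sketch using a hypothetical `ensure_installed` helper (my own naming, not from the article) built on the standard `command -v` test:

```shell
#!/bin/sh
# ensure_installed NAME: report whether a command is available on this
# machine; if not, print the yum install line to run.
ensure_installed() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1 is installed"
    else
        echo "$1 is missing; run: yum install $1"
    fi
}

ensure_installed lftp
```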
Suppose the relevant script files are stored in the /apps/script directory:
#vi /apps/script/autobackup.sh   //enter the following
#!/bin/bash
#author: www.5ishare.com
echo "Backup Job Start"
# Web site directory to back up, i.e. the site's root directory
file=/apps/wwwdata
# Local backup directory, used to store the packaged files
backpath=/apps/bkdata/
dbuser='database username'
# Database password; if the password contains single quotes, wrap the whole
# password in double quotes instead
dbpwd='database password'
dbname='database name'
# Domain name or IP address of the remote end
domain='remote domain or IP'
user='FTP username'
userpwd='FTP user password'
bkdate=$(date +%Y%m%d)
sqlfile_sql=${dbname}_${bkdate}.sql
sqlfile_gz=${dbname}_${bkdate}.tar.gz
sqlfp=${backpath}${bkdate}/${sqlfile_sql}
if [ -e ${backpath}${bkdate} ]
then
    echo "${backpath}${bkdate} exists."
else
    mkdir -p ${backpath}${bkdate}
fi
# Dump the database; change /apps/mysql/bin/mysqldump to the path in your
# actual environment
/apps/mysql/bin/mysqldump -u${dbuser} -p${dbpwd} ${dbname} > ${sqlfp}
# Compress the dumped database file (alternatively, compress directly by
# adding parameters at dump time)
tar zcf ${backpath}${bkdate}/${sqlfile_gz} -C ${backpath}${bkdate} ${sqlfile_sql}
# Delete the uncompressed SQL file
rm -f ${sqlfp}
# Format of the generated file name: web_<site directory name>_20130419.tar.gz
fn=web_${file##*/}_${bkdate}.tar.gz
tar zcf ${backpath}${bkdate}/${fn} -C ${file%/*} ${file##*/}
# Connect via FTP and upload the local backup directory to the remote server
/usr/bin/lftp ${domain} << End
user ${user} ${userpwd}
lcd ${backpath}
mirror -R ${bkdate}
exit
End
echo "Backup Job done"
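The archive naming leans on two shell parameter expansions, `${file##*/}` and `${file%/*}`, which split a path into its last component and its parent directory. A minimal standalone demonstration with the script's example values:

```shell
#!/bin/sh
# How the script derives the archive name from the web root path.
file=/apps/wwwdata
bkdate=20130419

echo "${file##*/}"   # strip longest */ prefix  -> wwwdata
echo "${file%/*}"    # strip shortest /* suffix -> /apps

fn=web_${file##*/}_${bkdate}.tar.gz
echo "$fn"           # -> web_wwwdata_20130419.tar.gz
```

This is why the script's tar command can use `-C ${file%/*} ${file##*/}`: it changes into the parent directory and archives just the site directory, so the archive unpacks as `wwwdata/` rather than a full `/apps/wwwdata/` path.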
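The script's comment mentions compressing directly at dump time instead of writing the .sql file and then tar-ing it. One common way is piping mysqldump straight into gzip, so the uncompressed dump never touches disk. In this sketch `printf` stands in for mysqldump (whose output is just a text stream), since the real command needs a live database; the commented line shows the real usage with the script's placeholders:

```shell
#!/bin/sh
# Alternative to dump-then-tar: compress the stream as it is produced.
# Real usage (placeholders as in the backup script):
#   /apps/mysql/bin/mysqldump -u"$dbuser" -p"$dbpwd" "$dbname" | gzip > dump.sql.gz
outfile=/tmp/backup_pipe_demo.sql.gz
printf 'CREATE TABLE t (id INT);\n' | gzip > "$outfile"

# Verify the round trip by decompressing to stdout
gzip -dc "$outfile"   # prints: CREATE TABLE t (id INT);
rm -f "$outfile"
```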
After saving and exiting, add execute permission to the script:
#chmod +x /apps/script/autobackup.sh
#crontab -e   //add a timed task; set the backup interval to suit your needs
0 4 1,16 * * /apps/script/autobackup.sh
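By default cron discards the script's output, so a failed run leaves no trace. A common variant of the same crontab entry appends stdout and stderr to a log file (the log path here is my assumption, not from the article):

```shell
# m h dom mon dow  command
0 4 1,16 * * /apps/script/autobackup.sh >> /var/log/autobackup.log 2>&1
```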
This schedules the backup task for 04:00 on the 1st and 16th of each month. It is best not to run it in broad daylight, because the backup noticeably slows the whole system down.
With the configuration above in place, webmasters need no longer worry about data backup; you could call it a once-and-for-all solution. The above is written for my own setup, and your needs may not be identical, but the essence is the same and only slight modifications are needed. If you have other questions, you are welcome to leave a message so we can learn from each other.
This article address: http://www.5ishare.com/tech/system/368842.shtml