Suppose you need to automatically back up part of a Linux server at regular intervals, and to do it with a shell script. This article uses an Apache server as the example; for convenience, the Apache here was installed with yum. If you want to automatically back up a source-built Apache installation, just adjust the corresponding paths in the code.
[email protected] ~]# vi bak_web1.sh

#!/bin/bash
#name: bak_web.sh
#description: web backup script

# Directory that holds the backups. It could also be a directory shared
# over SMB, or one built on an LVM volume.
mypath="/var/bak_web"
# Log file that records the result of every backup run. Both the log
# path and the backup path should be set according to your environment.
mylog="/var/log/weblog.txt"
# Timestamp used to name each backup run.
time=$(date +%y-%m-%d-%H:%M)

# Create the backup directory if it does not exist yet.
if [ ! -d "$mypath" ]; then
    mkdir "$mypath"
fi

# One subdirectory per run, split by content type, to ease management.
mkdir -p "$mypath/$time/conf"
mkdir -p "$mypath/$time/web"

# Pack and compress the Apache configuration file, name it
# httpd.conf.tar.gz, and place it under $mypath/$time/conf.
cd /etc/httpd/conf/
tar zcf "$mypath/$time/conf/httpd.conf.tar.gz" ./httpd.conf
ret=$?
# Check whether the tar step above succeeded; append a "succeeded"
# record to the log if it did, otherwise a "failed" record.
if [ $ret -eq 0 ]; then
    echo "at $time tar httpd.conf.tar.gz succeeded" >> "$mylog"
else
    echo "at $time tar httpd.conf.tar.gz failed" >> "$mylog"
fi

# Pack and compress the Apache document root, name it html.tar.gz, and
# place it under $mypath/$time/web. A database dump could be backed up
# the same way.
cd /var/www/
tar zcf "$mypath/$time/web/html.tar.gz" ./html
ret=$?
# Again, log "succeeded" or "failed" depending on tar's exit status.
if [ $ret -eq 0 ]; then
    echo "at $time tar html.tar.gz succeeded" >> "$mylog"
else
    echo "at $time tar html.tar.gz failed" >> "$mylog"
fi
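A side note on style: the ret=$? pattern in the script works, but if can also test the tar command directly, since if already branches on the command's exit status. A minimal sketch of that variant, using throwaway demo paths rather than the script's real ones:

```shell
# Minimal sketch (temporary demo paths, not the script's real paths):
# test tar's exit status directly in the if condition instead of saving
# it into a variable first.
tmp=$(mktemp -d)
echo "hello" > "$tmp/f"

if tar zcf "$tmp/f.tar.gz" -C "$tmp" f; then
    result="tar f.tar.gz succeeded"
else
    result="tar f.tar.gz failed"
fi
echo "$result"
rm -rf "$tmp"
```

This removes the intermediate variable and keeps the success check next to the command it checks.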
The script is done at this point. Run it with bash bak_web1.sh and you will find that new content is generated under /var/bak_web/ and that a new record appears in the /var/log/weblog.txt log file.
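Beyond checking that files appeared, the archives themselves can be sanity-checked. This sketch (not part of the original script) creates a small archive the same way the backup does and then lists it with tar's t flag to confirm it is readable; all paths here are temporary demo paths:

```shell
# Hypothetical verification sketch: build a small gzip archive, then
# list it with "tar tzf" to confirm it is intact before trusting it.
workdir=$(mktemp -d)
mkdir -p "$workdir/html"
echo "hello" > "$workdir/html/index.html"

# Same invocation style as the script: z = gzip, c = create, f = file.
tar zcf "$workdir/html.tar.gz" -C "$workdir" ./html

# "t" lists the archive contents without extracting; a non-zero exit
# status would indicate a truncated or corrupt archive.
if tar tzf "$workdir/html.tar.gz" > /dev/null; then
    status="archive OK"
else
    status="archive corrupt"
fi
echo "$status"
rm -rf "$workdir"
```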
Next, add the script to crontab so that it runs automatically on a schedule. I set it to run every minute, with the script placed in the /root/ directory; every time the script executes, new content is added under the /var/bak_web/ directory and to the /var/log/weblog.txt log file.
[email protected] ~]# crontab -e
*/1 * * * * bash /root/bak_web1.sh
With tail -f /var/log/weblog.txt you can watch the log changes as they happen.
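One practical follow-up the article does not cover: with a run every minute, /var/bak_web/ will grow quickly, so old runs should eventually be pruned. A hypothetical housekeeping sketch with find, demonstrated here against a temporary directory (a real version would point BAK_ROOT at /var/bak_web, and the 7-day retention is an assumption):

```shell
# Hypothetical retention sketch: delete per-run backup directories
# older than 7 days. Demonstrated on a throwaway directory.
BAK_ROOT=$(mktemp -d)
mkdir "$BAK_ROOT/old-run" "$BAK_ROOT/new-run"
touch -d "10 days ago" "$BAK_ROOT/old-run"   # simulate a stale backup

# -mindepth/-maxdepth restrict matching to the per-run directories;
# -mtime +7 matches entries last modified more than 7 days ago.
find "$BAK_ROOT" -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} +

remaining=$(ls "$BAK_ROOT")
echo "$remaining"   # only new-run should be left
rm -rf "$BAK_ROOT"
```

A line like the find above could be appended to the backup script itself, or scheduled as its own daily cron entry.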
The results show that this script runs reliably and efficiently on schedule.
This article is from the "Personal Feelings" blog; reprinting is declined.
Linux backup script (a web server as the example)