A small test of local and Baidu Cloud backup scripts on a Linux server

Source: Internet
Author: User
Tags: bz2, file system, file upload, upload script, mail, ssh, backup, linux

Local single-file upload script, named uf

This was tested locally; it is a simple wrapper around the bpcs_uploader script that automatically builds the cloud file path.

Technical highlights: use dirname to get the directory the file lives in, and pwd to resolve that directory's full path, which then doubles as the cloud file path.

#!/bin/bash
# Resolve the absolute directory of the file given as $1
cur_dir=$(cd "$(dirname "$1")"; pwd)
name=$(basename "$1")
# Upload the file, mirroring its absolute local path under "awin" in the cloud
/home/grm/bin/bpcs_uploader/bpcs_uploader.php upload "$1" "awin$cur_dir/$name"
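
For example, a hypothetical invocation (assuming /home/grm/data/notes.txt exists locally):

uf /home/grm/data/notes.txt
# uploads the file to the cloud path awin/home/grm/data/notes.txt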

Local folder upload script, named ud

bpcs_uploader can only upload one file at a time; this script uses it to bulk-upload a whole directory.

Technical points: the find command lists every file in the directory, and xargs -t -n1 emits them one at a time, so each file in turn is handed as an argument to the uf script; calling uf repeatedly implements the bulk upload.

#!/bin/bash
# Hand every file under the directory $1 to uf, one file per invocation
find "$1" -name '*.*' | xargs -t -n1 /home/grm/bin/uf
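
Because of the -t flag, xargs echoes each command before running it; for a hypothetical directory with two files the output would look roughly like:

ud /home/grm/data
/home/grm/bin/uf /home/grm/data/a.txt
/home/grm/bin/uf /home/grm/data/b.txt

Note that -name '*.*' only matches files containing a dot; files without an extension are skipped.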

Daily database backup script for the server, named backupday.sh (adapted from Vbird's Linux Private Kitchen)

Technical points: routine operations throughout; note the use of find's -mtime parameter.

#!/bin/bash
# =========================================================
# Set the standalone directory where backup data should go
basedir=/backup/daily/
# =========================================================
PATH=/bin:/usr/bin:/sbin:/usr/sbin; export PATH
export LANG=C
basefile1=$basedir/mysql.$(date +%Y-%m-%d).tar.bz2
basefile2=$basedir/cgi-bin.$(date +%Y-%m-%d).tar.bz2
[ ! -d "$basedir" ] && mkdir $basedir

# 1. MySQL (the database directory is /var/lib/mysql)
cd /var/lib
tar -jpc -f $basefile1 mysql

# 2. Periodically delete old backups
DAYS=30
find $basedir -name "mysql*" -type f -mtime +$DAYS -exec rm {} \;
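
A note on -mtime: +$DAYS matches files whose last modification is more than $DAYS whole days ago, so +30 keeps roughly a month of archives. To preview what would be deleted without removing anything, swap -exec rm for -print (a sketch):

find /backup/daily -name "mysql*" -type f -mtime +30 -print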

Weekly backup script for code and other configuration, named backupweek.sh

#!/bin/bash
# ====================================================================
# User parameters:
# basedir = the directory that stores the data this script backs up
#           (preferably on a standalone file system)
basedir=/backup/weekly
# ====================================================================
# Don't change anything below; the defaults are fine!
PATH=/bin:/usr/bin:/sbin:/usr/sbin; export PATH
export LANG=C
D=$(date +"%Y-%m-%d")

# Destination directories for each service's configuration backups
postfixd=$basedir/postfix
vsftpd=$basedir/vsftp
sshd=$basedir/ssh
wwwd=$basedir/www
others=$basedir/others
userinfod=$basedir/userinfo

# Create each directory if it does not already exist
for dirs in $postfixd $vsftpd $sshd $wwwd $others $userinfod
do
    [ ! -d "$dirs" ] && mkdir -p $dirs
done

# 1. Back up the main service configuration files separately, then all of /etc
cd /etc/
tar -jpc -f $vsftpd/vsftpd.$D.tar.bz2 vsftpd
cd /etc/
tar -jpc -f $sshd/sshd.$D.tar.bz2 sshd ssh
cd /etc/
tar -jpc -f $wwwd/httpd.$D.tar.bz2 httpd
cd /var/www
tar -jpc -f $wwwd/html.$D.tar.bz2 html
cd /
tar -jpc -f $others/etc.$D.tar.bz2 etc

# 2. User account data
cp -a /etc/{passwd,shadow,group} $userinfod
cd /var/spool
tar -jpc -f $userinfod/mail.$D.tar.bz2 mail
cd /
tar -jpc -f $userinfod/home.$D.tar.bz2 home
cd /var/spool
tar -jpc -f $userinfod/cron.$D.tar.bz2 cron at


# 3. Periodically delete old backups
DAYS=30
find $vsftpd -name "vsftpd*" -type f -mtime +$DAYS -exec rm {} \;
find $sshd -name "sshd*" -type f -mtime +$DAYS -exec rm {} \;
find $wwwd -name "ht*" -type f -mtime +$DAYS -exec rm {} \;
find $others -name "etc*" -type f -mtime +$DAYS -exec rm {} \;
find $userinfod -name "cron*" -type f -mtime +$DAYS -exec rm {} \;
find $userinfod -name "home*" -type f -mtime +$DAYS -exec rm {} \;
find $userinfod -name "mail*" -type f -mtime +$DAYS -exec rm {} \;
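
Before relying on these archives, it can be worth spot-checking that they list cleanly; a minimal sketch, assuming the current day's sshd archive exists:

tar -jtvf /backup/weekly/ssh/sshd.$(date +%Y-%m-%d).tar.bz2 | head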

Automatic upload script, named auto_upload_daily.sh

Here upload.sh contains the same code as the local script uf; in short, uf is the foundation of the cloud backup.

#!/bin/bash
LOCAL_DATA=/backup/daily
MYSQL_BACKUP=mysql.$(date +"%Y-%m-%d").tar.bz2
upload.sh $LOCAL_DATA/$MYSQL_BACKUP
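
A slightly hardened variant (a sketch; upload.sh as above) skips the upload and logs an error when the day's archive is missing:

#!/bin/bash
LOCAL_DATA=/backup/daily
MYSQL_BACKUP=mysql.$(date +"%Y-%m-%d").tar.bz2
# Only upload if today's archive was actually produced
if [ -f "$LOCAL_DATA/$MYSQL_BACKUP" ]; then
    upload.sh $LOCAL_DATA/$MYSQL_BACKUP
else
    echo "missing $LOCAL_DATA/$MYSQL_BACKUP" >&2
fi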

Automatic upload script, named auto_upload_weekly.sh

#!/bin/bash
LOCAL_DATA=/backup/weekly
D=$(date +"%Y-%m-%d")
HTTP=www/httpd.$D.tar.bz2
HTML=www/html.$D.tar.bz2
ETC=others/etc.$D.tar.bz2
HOM=userinfo/home.$D.tar.bz2
MAIL=userinfo/mail.$D.tar.bz2
# passwd and shadow are plain copies made by cp -a, so they carry no date suffix
PASSWD=userinfo/passwd
SHADOW=userinfo/shadow
SSHD=ssh/sshd.$D.tar.bz2
# backupweek.sh stores the vsftpd archive under the "vsftp" directory
VSFTPD=vsftp/vsftpd.$D.tar.bz2
CRONA=userinfo/cron.$D.tar.bz2
upload.sh $LOCAL_DATA/$HTTP
upload.sh $LOCAL_DATA/$HTML
upload.sh $LOCAL_DATA/$ETC
upload.sh $LOCAL_DATA/$HOM
upload.sh $LOCAL_DATA/$MAIL
upload.sh $LOCAL_DATA/$PASSWD
upload.sh $LOCAL_DATA/$SHADOW
upload.sh $LOCAL_DATA/$CRONA
upload.sh $LOCAL_DATA/$SSHD
upload.sh $LOCAL_DATA/$VSFTPD
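
The same sequence can be written more compactly by looping over the relative paths (a sketch equivalent to the calls above):

#!/bin/bash
LOCAL_DATA=/backup/weekly
D=$(date +"%Y-%m-%d")
for f in www/httpd.$D.tar.bz2 www/html.$D.tar.bz2 others/etc.$D.tar.bz2 \
         userinfo/home.$D.tar.bz2 userinfo/mail.$D.tar.bz2 \
         userinfo/passwd userinfo/shadow userinfo/cron.$D.tar.bz2 \
         ssh/sshd.$D.tar.bz2 vsftp/vsftpd.$D.tar.bz2
do
    upload.sh $LOCAL_DATA/$f
done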

Finally, set up the scheduled tasks.

Each upload runs three times to guard against network problems and failed uploads.

# crontab -l
# Daily backup at 01:00, weekly on Sunday; uploads retried at 02:00, 04:00, 06:00
0 1 * * * /bin/backupday.sh 2>> /backup/errors.log
0 1 * * 0 /bin/backupweek.sh 2>> /backup/errors.log
0 2 * * * /bin/auto_upload_daily.sh 2>> /backup/errors.log
0 4 * * * /bin/auto_upload_daily.sh 2>> /backup/errors.log
0 6 * * * /bin/auto_upload_daily.sh 2>> /backup/errors.log
0 2 * * 0 /bin/auto_upload_weekly.sh 2>> /backup/errors.log
0 4 * * 0 /bin/auto_upload_weekly.sh 2>> /backup/errors.log
0 6 * * 0 /bin/auto_upload_weekly.sh 2>> /backup/errors.log
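
An alternative to the three blind retries would be a small wrapper that checks the uploader's exit status; a sketch, assuming upload.sh returns non-zero on failure (whether it does depends on bpcs_uploader's behavior):

#!/bin/bash
# retry_upload.sh: attempt an upload up to 3 times
for i in 1 2 3
do
    upload.sh "$1" && break
    sleep 60   # wait a minute before retrying
done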
