A small test of local and Baidu Cloud backup scripts on a Linux server

Source: Internet
Author: User
Tags: file upload, script

Local single-file upload script, named uf

This is a test done on the local machine using the bpcs_uploader script; uf is just a simple wrapper around it that automatically builds the cloud-side file path.
Technical points: use dirname to get the directory the file lives in, and use pwd to turn that into the file's full path, which is then reused as the path of the file in the cloud.

#!/bin/bash
cur_dir=$(cd "$(dirname "$1")"; pwd)
name=$(basename "$1")
/home/grm/bin/bpcs_uploader/bpcs_uploader.php upload $1 awin$cur_dir/$name
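A hedged walk-through of what the script does for one call (the file name below is made up for illustration; it is not from the original article):

# Hypothetical invocation of uf on a single file
uf /home/grm/docs/notes.txt
# Inside the script this resolves to:
#   cur_dir=$(cd "$(dirname "/home/grm/docs/notes.txt")"; pwd)   # -> /home/grm/docs
#   name=$(basename "/home/grm/docs/notes.txt")                  # -> notes.txt
# so bpcs_uploader.php is finally called as:
#   /home/grm/bin/bpcs_uploader/bpcs_uploader.php upload /home/grm/docs/notes.txt awin/home/grm/docs/notes.txt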
Local folder upload script, named ud

The bpcs_uploader script can only upload a single file at a time; this script makes it possible to upload a whole directory in bulk.
Technical points: use the find command to print every file in the directory, and pipe the list through xargs -t -n1 so that each file is handed to the uf script as a single argument; by invoking uf over and over, a bulk upload is achieved.

#!/bin/bash
find $1 -name '*' | xargs -t -n1 /home/grm/bin/uf
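A hedged usage sketch (the directory and file names are invented): because xargs is run with -t, each generated uf command is echoed before it is executed, so you can watch every file being handed off one at a time.

# Hypothetical invocation of ud on a directory
ud /home/grm/photos
# xargs -t prints each command as it runs, roughly:
#   /home/grm/bin/uf /home/grm/photos/a.jpg
#   /home/grm/bin/uf /home/grm/photos/b.jpg
#   ...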
Server database daily backup script, named backupday.sh (adapted from Vbird's Linux book)

Technical points: this is mostly routine shell work; note the use of the find command's -mtime parameter.

#!/bin/bash
# =========================================================
# Please enter the separate directory in which the backup data should be kept
basedir=/backup/daily/
# =========================================================
PATH=/bin:/usr/bin:/sbin:/usr/sbin; export PATH
export LANG=C
basefile1=$basedir/mysql.$(date +%Y-%m-%d).tar.bz2
basefile2=$basedir/cgi-bin.$(date +%Y-%m-%d).tar.bz2
[ ! -d "$basedir" ] && mkdir $basedir

# 1. MySQL (the database directory is /var/lib/mysql)
cd /var/lib
tar -jpc -f $basefile1 mysql

# 2. Delete old backups regularly
DAYS=30
find $basedir -name "mysql*" -type f -mtime +$DAYS -exec rm {} \;
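A minimal sketch of the -mtime selection the cleanup step relies on (paths match the script above; the -print form is added here only for illustration): -mtime +$DAYS matches files last modified more than $DAYS full days ago, so only month-old backups are removed.

# Dry run: list the daily MySQL backups older than 30 days without deleting them
DAYS=30
find /backup/daily -name "mysql*" -type f -mtime +$DAYS -print
# Replacing -print with -exec rm {} \; gives the deletion used in backupday.sh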
Weekly backup script for code and other configuration, named backupweek.sh
#!/bin/bash
# ====================================================================
# User parameter section:
# basedir = the directory used to store the data this script backs up
#           (preferably a separate filesystem)
basedir=/backup/weekly
# ====================================================================
# Below this point, no changes are needed; default values are used
PATH=/bin:/usr/bin:/sbin:/usr/sbin; export PATH
export LANG=C
D=$(date +"%Y-%m-%d")

# Configuration files of the services to back up, and their backup directories
postfixd=$basedir/postfix
vsftpd=$basedir/vsftpd
sshd=$basedir/ssh
wwwd=$basedir/www
others=$basedir/others
userinfod=$basedir/userinfo

# Make sure the directories exist; create them if they do not
for dirs in $postfixd $vsftpd $sshd $wwwd $others $userinfod
do
    [ ! -d "$dirs" ] && mkdir -p $dirs
done

# 1. Back up the configuration files of the main system services, and all of /etc as well
cd /etc/
tar -jpc -f $vsftpd/vsftpd.$D.tar.bz2 vsftpd
cd /etc/
tar -jpc -f $sshd/sshd.$D.tar.bz2 sshd ssh
cd /etc/
tar -jpc -f $wwwd/httpd.$D.tar.bz2 httpd
cd /var/www
tar -jpc -f $wwwd/html.$D.tar.bz2 html
cd /
tar -jpc -f $others/etc.$D.tar.bz2 etc

# 2. User-related data
cp -a /etc/{passwd,shadow,group} $userinfod
cd /var/spool
tar -jpc -f $userinfod/mail.$D.tar.bz2 mail
cd /
tar -jpc -f $userinfod/home.$D.tar.bz2 home
cd /var/spool
tar -jpc -f $userinfod/cron.$D.tar.bz2 cron at

# 3. Delete old backups regularly
DAYS=30
find $vsftpd    -name "vsftpd*" -type f -mtime +$DAYS -exec rm {} \;
find $sshd      -name "sshd*"   -type f -mtime +$DAYS -exec rm {} \;
find $wwwd      -name "ht*"     -type f -mtime +$DAYS -exec rm {} \;
find $others    -name "etc*"    -type f -mtime +$DAYS -exec rm {} \;
find $userinfod -name "cron*"   -type f -mtime +$DAYS -exec rm {} \;
find $userinfod -name "home*"   -type f -mtime +$DAYS -exec rm {} \;
find $userinfod -name "mail*"   -type f -mtime +$DAYS -exec rm {} \;
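An optional check, not part of the original scripts: since every backup is a bzip2-compressed tarball, an archive can be listed with tar -jtvf before it is trusted (the date in the file name below is hypothetical).

# Verify that a weekly /etc archive is readable and list its first entries
tar -jtvf /backup/weekly/others/etc.2015-01-01.tar.bz2 | head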
Automatic upload script, named auto_upload_daily.sh

Here the code of upload.sh is the same as that of the local script uf. In short, the uf script is the basis of the cloud backup.

#!/bin/bash
LOCAL_DATA=/backup/daily
MYSQL_BACKUP=mysql.$(date +"%Y-%m-%d").tar.bz2
upload.sh $LOCAL_DATA/$MYSQL_BACKUP
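For clarity, a hedged expansion of what this amounts to on a given day (the date is hypothetical): since upload.sh contains the same code as uf, the daily MySQL archive is uploaded to the same absolute path in the cloud, prefixed with awin.

# On 2015-01-01 the script effectively runs:
#   upload.sh /backup/daily/mysql.2015-01-01.tar.bz2
# which, via the uf logic, becomes:
#   /home/grm/bin/bpcs_uploader/bpcs_uploader.php upload \
#       /backup/daily/mysql.2015-01-01.tar.bz2 awin/backup/daily/mysql.2015-01-01.tar.bz2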
Automatic upload script, named auto_upload_weekly.sh
#!/bin/bash
LOCAL_DATA=/backup/weekly
D=$(date +"%Y-%m-%d")
HTTP=www/httpd.$D.tar.bz2
HTML=www/html.$D.tar.bz2
ETC=others/etc.$D.tar.bz2
HOM=userinfo/home.$D.tar.bz2
MAIL=userinfo/mail.$D.tar.bz2
PASSWD=userinfo/passwd.$D.tar.bz2
SHADOW=userinfo/shadow.$D.tar.bz2
SSHD=ssh/sshd.$D.tar.bz2
VSFTPD=vsftpd/vsftpd.$D.tar.bz2
CRONA=userinfo/cron.$D.tar.bz2
upload.sh $LOCAL_DATA/$HTTP
upload.sh $LOCAL_DATA/$HTML
upload.sh $LOCAL_DATA/$ETC
upload.sh $LOCAL_DATA/$HOM
upload.sh $LOCAL_DATA/$MAIL
upload.sh $LOCAL_DATA/$PASSWD
upload.sh $LOCAL_DATA/$SHADOW
upload.sh $LOCAL_DATA/$CRONA
upload.sh $LOCAL_DATA/$SSHD
upload.sh $LOCAL_DATA/$VSFTPD
Finally, set up the scheduled tasks in cron:

# crontab -l
01 1 * * * /bin/backupday.sh           2>> /backup/errors.log
20 1 * * 0 /bin/backupweek.sh          2>> /backup/errors.log
01 2 * * * /bin/auto_upload_daily.sh   2>> /backup/errors.log
01 4 * * * /bin/auto_upload_daily.sh   2>> /backup/errors.log
01 6 * * * /bin/auto_upload_daily.sh   2>> /backup/errors.log
20 2 * * 0 /bin/auto_upload_weekly.sh  2>> /backup/errors.log
20 4 * * 0 /bin/auto_upload_weekly.sh  2>> /backup/errors.log
20 6 * * 0 /bin/auto_upload_weekly.sh  2>> /backup/errors.log
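To install these entries, a brief hedged sketch (the file name used for loading is made up): either edit the crontab interactively or load it from a prepared file, then list it to confirm.

# Option 1: interactive edit, paste the lines above
crontab -e
# Option 2: replace the current crontab with a prepared file (hypothetical path)
crontab /root/backup-cron.txt
# Confirm what is installed
crontab -l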