A simple backup strategy for small and medium sites
For small and medium Drupal sites we can use the Backup and Migrate (backup_migrate) module, which lets us schedule regular backups and configure when they run, how many copies to keep, and so on. Once it is set up, the regular cron run produces the backups. For an ordinary Drupal site, SVN is then all we need: on the server we commit the generated backup files to SVN, which gives us an off-machine copy. Because the module limits how many backup copies are kept, the backup directory does not accumulate too many files and the SVN repository does not grow too large.
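As a one-time setup, a sketch like the following (assuming an older Drush that still provides pm-download; the site root and repository layout are placeholders to adjust) enables the module and puts its scheduled-backup directory under SVN control:
drush dl backup_migrate -y    # download the module if it is not already present
drush en backup_migrate -y    # enable Backup and Migrate
cd /www/web/html              # site root used in the crontab below
svn add --parents sites/default/files/backup_migrate
svn ci sites/default/files/backup_migrate -m 'Track scheduled backups in SVN'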
The following is a simple backup script; place it in the site's root directory and add it to crontab for daily execution.
#!/bin/bash
date # start date
drush_php=/bin/php # php path
export drush_php
# Run Drupal cron so Backup and Migrate creates the scheduled backup
drush cron
# Remove from SVN any backup files that have been rotated away locally
svn st sites/default/files/backup_migrate/scheduled/ | grep '^!' | awk '{print $2}' | xargs svn delete --force
# Add and commit the newly generated backup files
svn add sites/default/files/backup_migrate/scheduled/*
svn ci sites/default/files/backup_migrate/scheduled/ -m 'Add backup files'
date # end date
The crontab settings are as follows
0 0 * * * cd /www/web/html/ && bash cron.sh > cron.log 2>&1
MySQL backup strategy for large sites
If a site's database is somewhat large, the ad-hoc SVN backup above is a bit thin, and we need a proper MySQL backup strategy: dump the entire database, compress it, and then periodically transfer it to a backup database server or to another cloud server. Here is a simple PHP sample script.
#!/usr/bin/php -q
<?php
$to = "gaoxinzhao@gmail.com";
$hostname = exec('/bin/hostname');
$mycnf = "/home/robbin/.my.cnf";
// Databases that should not be dumped
$ignore = array('information_schema', 'test', 'mysql', 'wdcpdb');

// Strip whitespace and control characters from a parsed value
function trimw($str) {
    $str = str_replace(array("\n", "\r", "\t", " ", "\0", "\x0B"), "", $str);
    return $str;
}

if (!file_exists($mycnf)) {
    mail($to, "No .my.cnf exists on $hostname", "MySQL cannot dump because .my.cnf is missing on $hostname.");
    exit("Cant get user creds");
}

// Read the MySQL credentials out of .my.cnf
$myconf = file_get_contents($mycnf) or die("Failed to open bmesh_admin's my.cnf");
preg_match("/\buser(.*)/", $myconf, $matches) or die(mail($to, "No username in .my.cnf on $hostname", "MySQL cannot dump on $hostname"));
$usr = explode('=', $matches[0]);
$user = trimw($usr[1]);
preg_match("/\bpassword(.*)/", $myconf, $matches) or die(mail($to, "No password in .my.cnf on $hostname", "MySQL cannot dump on $hostname"));
$pass = explode('=', $matches[0]);
$password = trimw($pass[1]);

mysql_connect("localhost", $user, $password) or die("Could not connect: " . mysql_error());
mysql_select_db("mysql");
$result = mysql_query("show databases");

// Backups go into a per-run directory: year/month/day/time/database
$bpath = "/home/robbin/backup/mysql";
$btime = date("Y-m-d H:i:s");
$bstamp = strtotime($btime);
$byear = date("Y", $bstamp);
$bmonth = date("m", $bstamp);
$bday = date("d", $bstamp);
$btod = date("H-i-s", $bstamp);

while ($res = mysql_fetch_array($result)) {
    $myDb = $res["Database"];
    if (in_array($myDb, $ignore)) continue;

    $mdir = "$bpath/$byear/$bmonth/$bday/$btod/$myDb";
    $out = `mkdir -p $mdir`;

    // Dump, lock down, and compress each database in turn
    $myFile = $myDb . ".sql";
    $bldCmd  = "cd $mdir;";
    $bldCmd .= "mysqldump -u$user -p$password --single-transaction --add-drop-table -R -c -q $myDb > $myFile;";
    $bldCmd .= "chmod 644 $myFile;";
    $bldCmd .= "chown root:root $myFile;";
    $bldCmd .= "gzip -9 $myFile";

    print "Backing up $myDb\n";
    print "Securing $myDb\n";
    $out = `$bldCmd`;
    $out = `chmod 700 $bpath/$byear`; // mode assumed; restrict access to the backup tree
    print "$out\n";
}
print "Backups are in $bpath\n";
The crontab settings are as follows
0 1 * * * /home/robbin/bin/mysql_backup.php
In addition, we need to transfer the backed-up data to other servers on a regular basis, so that a server crash does not lead to data loss; only timely backups keep the site safe. This is just a small piece of the author's operations experience; if you have a better backup strategy, you are welcome to share it.
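For that transfer step, a minimal sketch could push the dump directory to a remote machine with rsync over SSH from cron; the local path matches the $bpath used in the PHP script above, while the remote user, host, target directory, and the script name mysql_sync.sh are placeholders to replace with your own:
#!/bin/bash
# Mirror the local MySQL backup tree to a remote backup server over SSH.
rsync -az /home/robbin/backup/mysql/ backup@backup-host:/data/mysql-backup/
A matching crontab entry, run after the nightly dump has finished:
0 3 * * * /home/robbin/bin/mysql_sync.sh > /dev/null 2>&1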