As databases have become a standard part of website construction, most sites can no longer run without database support: the database stores not only the site's content but also the information submitted by users. This information is extremely valuable, and once it is lost it is almost impossible to recover.
Webmasters should therefore back up not only the HTML files, images, and program code on the server, but above all the database files.
The simplest form of database backup is to export the database manually with phpMyAdmin and download the file to a location of your choice. However, this is tedious for webmasters who need daily backups, so this article explains how to use the cron scheduler provided by the host together with a short shell script to back up the database files automatically every day and copy them to a specified directory.
Backing up the database with a shell script
First, create a file named "backup.sh" and enter the following script:
#!/bin/bash

# Set a value that we can use for a datestamp
DATE=`date +%Y-%m-%d`

# Our base backup directory
BASEBACKUP="/backup/daily"

for DATABASE in `cat /backup/db-list.txt`
do
    # This is where we throw our backups.
    FILEDIR="$BASEBACKUP/$DATABASE"

    # Test to see if our backup directory exists.
    # If not, create it.
    if [ ! -d $FILEDIR ]
    then
        mkdir -p $FILEDIR
    fi

    echo -n "Exporting database: $DATABASE"
    mysqldump --user=root --opt $DATABASE | gzip -c -9 > $FILEDIR/$DATABASE-$DATE.sql.gz
    echo "...... [ Done exporting to local backup, now copying for remote backup ]"

    # Keep a non-datestamped copy for the remote upload, overwriting the previous one.
    cp $FILEDIR/$DATABASE-$DATE.sql.gz /backup/uploads/$DATABASE.sql.gz
    echo "...... [ Done ]"
done

# AutoPrune our backups. This will find all files
# that are "MaxFileAge" days old and delete them.
MaxFileAge=4
find $BASEBACKUP -name '*.gz' -type f -mtime +$MaxFileAge -exec rm -f {} \;
In short, the script reads database names line by line from /backup/db-list.txt (a plain text file listing the databases to back up), dumps and compresses each one, and writes a datestamped file to the /backup/daily/DATABASENAME/ directory; it then copies a non-datestamped version of the same dump to /backup/uploads/, overwriting the previous copy.
To save disk space, the script also automatically deletes backup files that are more than four days old.
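For reference, /backup/db-list.txt is just a plain text file with one database name per line. The names below are placeholders for illustration, not from the original article:

wordpress
forum
shop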
Scheduled tasks (cron jobs)
Once the script is ready, use cron to run it automatically every day. First, schedule backup.sh:
10 4 * * * sh /backup.sh
This entry makes the server run the database backup at 4:10 every morning; the exact time is, of course, up to you.
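If your host gives you shell access rather than a control-panel cron editor, the same entry can be added directly to the crontab. A minimal sketch, using the /backup.sh path from the line above:

crontab -e                   # opens your crontab in an editor
10 4 * * * sh /backup.sh     # add this line, then save and exit
crontab -l                   # verify the new entry is listed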
Then add a second cron entry:
10 6 * * * /usr/local/bin/ncftpput -E -f /home/admin/ncftpputlogin / /backup/uploads/*
This entry runs a program called ncftpput at 6:10 every morning and uploads everything in /backup/uploads/ to the remote FTP server whose login details are stored in the /home/admin/ncftpputlogin file. If ncftp is not installed on your server, you can install it yourself; it is very simple.
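For reference, the file passed to ncftpput with -f uses a simple host/user/pass format. The contents of /home/admin/ncftpputlogin might look like the sketch below; the hostname, user, and password are placeholders, not real credentials:

host backup.example.com
user backupuser
pass yourpassword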
Summary
If your database is large, the backup can consume a lot of server resources while it runs, so schedule the automatic backup for the period with the fewest visitors, for example around 4-5 a.m. Also leave enough time between the backup job and the upload job so the dump can finish before it is transferred.
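As an optional tweak that is not part of the original script: if your tables use InnoDB, you can reduce locking during the dump by adding --single-transaction (and skipping table locks) on the mysqldump line in backup.sh, for example:

mysqldump --user=root --opt --skip-lock-tables --single-transaction $DATABASE | gzip -c -9 > $FILEDIR/$DATABASE-$DATE.sql.gz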
And because any single server can be hit by hardware failure or natural disaster, it is safer to keep copies on a different server or to download the backup files to a local machine as well.
This article originally appeared on the CentOS installation, configuration, and learning tutorial site; original link: http://www.centos.ws/centos/linux/901.html