I used to back up MySQL with EMS SQL Manager, exporting SQL script files or dbf files. To migrate the local database to a remote one, I would simply execute the script on the server.
However, I recently found that this GUI approach has a problem: once the number of rows passes a certain size, EMS SQL Manager hangs, and I don't know whether it is a bug in the software ...... At first I worked around it by splitting the large database file into smaller pieces and importing them one at a time.
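That manual workaround can be scripted with the standard `split` tool. This is only a sketch of the idea, and it assumes every statement in the dump ends on its own line, so cutting on line boundaries never breaks a statement (a real mysqldump file with extended inserts may not satisfy this). The file names here are made up for illustration.

```shell
#!/bin/sh
# Create a dummy dump for demonstration: 3000 one-line INSERT statements.
for i in $(seq 1 3000); do
  echo "INSERT INTO t VALUES ($i);"
done > big_dump.sql

# Split into pieces of at most 1000 lines each: part_aa, part_ab, part_ac.
# Each piece can then be imported separately without hanging the GUI tool.
split -l 1000 big_dump.sql part_

# Three pieces, together containing all 3000 original lines.
ls part_* | wc -l
```

Each `part_*` file is itself a valid SQL script, so the pieces can be imported one by one.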
Then I found out that a colleague uses the mysqldump tool that ships with MySQL. (Embarrassing, but I rarely work with databases.)
The steps are recorded here:
1. Enter the bin directory and run the following command:
mysqldump -hlocalhost -uroot -padmin local_db > a.sql
2. The a.sql file is generated in the bin directory. Continue with the command:
mysql -h110.110.110.110 -uroot -padmin remote_db < a.sql
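The two steps can also be combined into a single pipe, which skips the intermediate a.sql file. This is a sketch using the same example host, user, password, and database names as above; it obviously needs live MySQL servers on both ends to run, and piping a password on the command line is shown only for symmetry with the original commands.

```shell
#!/bin/sh
# Dump local_db and feed it straight into the remote server in one step.
# Host 110.110.110.110, user root, password admin, and the database
# names are the example values from the commands above.
mysqldump -hlocalhost -uroot -padmin local_db \
  | mysql -h110.110.110.110 -uroot -padmin remote_db
```

The trade-off is that without the intermediate file there is nothing to re-run if the network drops mid-transfer.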
The synchronization flow works like this:
1. back up/dump your local database at AM, and
2. ftp the SQL script to the server side with a bash script at AM,
3. load the SQL script on the server side at AM.
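The three steps above can be sketched as crontab entries. The times (02:00, 03:00, 04:00) and script paths are hypothetical placeholders, since the original post omits the exact hours; the point is only that each step runs as its own scheduled job.

```shell
# crontab on the LOCAL machine (times and paths are made up):
0 2 * * * /home/me/backup_db.sh      # 1. dump the local database
0 3 * * * /home/me/upload_dump.sh    # 2. ftp the dump to the server

# crontab on the SERVER:
0 4 * * * /home/me/load_dump.sh      # 3. load the dump into MySQL
```

Each job is spaced an hour apart here so the previous step has time to finish before the next one starts.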
It's not clear .. :)
Your question is a bit vague.
1. First back up the local database: use a cron job on unix/linux, or the Task Scheduler on windows, to start the backup bash or bat file at AM at a regular interval.
2. Backing up the database takes a considerable amount of time, so a second bash/bat file is scheduled to start later; it needs no more than five lines of ftp commands to upload the backed-up file to the server. This also takes some time.
3. On the server, a third bat/bash file starts later still and loads the uploaded file into the database.
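The "no more than five lines of ftp" upload script might look like the sketch below. The host, credentials, and paths are made-up placeholders, and plain ftp sends the password in cleartext; this only illustrates the shape of the batch script the answer describes.

```shell
#!/bin/sh
# upload_dump.sh -- sketch of step 2: push the dump with a short ftp batch.
# 110.110.110.110 / root / admin and both paths are illustrative only.
ftp -n 110.110.110.110 <<'EOF'
user root admin
binary
put /backups/a.sql /incoming/a.sql
bye
EOF
```

The `-n` flag suppresses auto-login so the `user` line in the here-document supplies the credentials; `binary` avoids line-ending mangling of the dump file in transit.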