Shell script to batch-import MySQL databases and batch-drop tables
Because of collaborative development on a project, a colleague sent me the project's SQL dump: a directory containing hundreds of databases, with each database in a separate SQL file. Each SQL file begins with:
CREATE DATABASE IF NOT EXISTS `xxxx`;
USE `xxxx`;
With MySQL Workbench 6 or later you can import a whole folder of SQL files directly. On my server, however, installing a GUI workbench just to batch-import these files is not an option.

So I wrote a shell script; change the host, username, and password in it to your own. dbname defaults to "sample", the sample database in MySQL. No data is actually imported into it here, because each SQL file I import already creates its own database. If instead you want to import multiple files into one database, set dbname to that database's name.
#!/bin/bash
# Author: rainbow <rainysia@gmail.com>
# Date 10:26:27

set -e
LC_ALL=C
LANG=C
unset TZ
TZBase=$(LC_ALL=C TZ=UTC0 date -R)
UTdate=$(LC_ALL=C TZ=UTC0 date -d "$TZBase")
TZdate=$(unset TZ; LANG=C date -d "$TZBase")

file_path="/home/db/test/"       # folder of SQL files to import
host="192.168.85.123"            # MySQL host to import into
username="dbroot"                # MySQL username
password='db1t#2w$3r@4#t'        # MySQL password (single-quoted so $ and # stay literal)
dbname="sample"                  # MySQL database name
now=$(date "+%s")                # start time for overall timing

mysql_source() {
    for file_name in $(ls -A "$1")
    do
        seg_start_time=$(date "+%s")
        if [ -f "$1$file_name" ]; then
            command="source $1$file_name"
            mysql -h${host} -u${username} -p${password} ${dbname} -e "$command"
            echo "source: \"$1$file_name\" is OK, it takes $(expr $(date "+%s") - ${seg_start_time}) seconds"
        fi
    done
    echo "All SQL is done! Total cost: $(expr $(date "+%s") - ${now}) seconds"
}

echo "Universal Time is now: $UTdate."
echo "Local time is now: $TZdate."
mysql_source $file_path
Test:

root@debian:/home/sh# ./mysql_source.sh
Universal Time is now: Fri Nov 14 03:10:49 UTC 2014.
Local time is now: Fri Nov 14 11:10:49 CST 2014.
source: "/home/db/test/hml2.sql" is OK, it takes 18 seconds
source: "/home/db/test/hml3.sql" is OK, it takes 19 seconds
source: "/home/db/test/hml4.sql" is OK, it takes 18 seconds
All SQL is done! Total cost: 55 seconds
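One caveat with the script above: looping over the output of `ls -A` breaks on file names that contain spaces. A safer variant loops over a glob instead. Here is a minimal, self-contained sketch of that pattern; the /tmp/sqldemo directory and its files are made-up fixtures for illustration only:

```shell
#!/bin/sh
# Demo: glob-based loop that survives spaces in file names.
# /tmp/sqldemo and the two .sql files are hypothetical test fixtures.
dir="/tmp/sqldemo/"
mkdir -p "$dir"
printf 'SELECT 1;\n' > "${dir}a.sql"
printf 'SELECT 2;\n' > "${dir}b c.sql"
for f in "${dir}"*.sql; do
    # Quoting "$f" keeps "b c.sql" as a single argument.
    [ -f "$f" ] && echo "would source: $f"
done
# prints:
# would source: /tmp/sqldemo/a.sql
# would source: /tmp/sqldemo/b c.sql
```

In the real script you would replace the echo with the mysql invocation; the rest of the logic is unchanged.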
Next, having confirmed that the data imports correctly, the colleague needed to reinstall an application and drop all the tables in one database. Because she lacked the privileges to drop the database itself, she could only drop tables one by one, and there were over 2000 of them.

The original idea was to use
SELECT CONCAT('DROP TABLE IF EXISTS `', table_name, '`;') FROM information_schema.tables WHERE table_schema = 'cs_china_1111';  -- your database name
to generate the DROP statements for all tables.
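The point of the query is to turn each row of information_schema.tables into a ready-to-run statement. As a rough pure-shell illustration of that transformation (the three table names are invented stand-ins for the query's results):

```shell
#!/bin/sh
# Mimic the CONCAT query: turn a list of table names into DROP statements.
# "users", "orders" and "sessions" are hypothetical table names.
printf '%s\n' users orders sessions |
while read -r t; do
    printf 'DROP TABLE IF EXISTS `%s`;\n' "$t"
done
# prints:
# DROP TABLE IF EXISTS `users`;
# DROP TABLE IF EXISTS `orders`;
# DROP TABLE IF EXISTS `sessions`;
```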
So I reused the shell script above, changing the path as needed. First, write the DROP TABLE statements into an SQL file.

Export the database structure. In a terminal, run mysqldump with dbroot as my database username and cs_china_1111 as the database to export; -d --add-drop-table means export only the structure (no data), with a DROP TABLE statement before each table:
# mysqldump -udbroot -p cs_china_1111 -d --add-drop-table > cs_china_1111.sql
Then filter it once with grep, because the exported SQL contains the DROP TABLE statements:
# find ./ -name "cs_china_1111.sql" | xargs grep "DROP TABLE IF" > cs_china_1111_drop_table.sql
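To see what this filtering step does, here is the same grep applied to a tiny synthetic dump (the file path and table names are made up for the demo; a real mysqldump output has the same DROP/CREATE structure):

```shell
#!/bin/sh
# Synthetic two-table dump standing in for cs_china_1111.sql.
cat > /tmp/demo_dump.sql <<'EOF'
DROP TABLE IF EXISTS `t1`;
CREATE TABLE `t1` (id INT);
DROP TABLE IF EXISTS `t2`;
CREATE TABLE `t2` (id INT);
EOF
# Keep only the DROP statements, as in the pipeline above.
grep "DROP TABLE IF" /tmp/demo_dump.sql > /tmp/demo_drop.sql
cat /tmp/demo_drop.sql
# prints:
# DROP TABLE IF EXISTS `t1`;
# DROP TABLE IF EXISTS `t2`;
```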
Finally, run the earlier shell script with dbname changed to the database whose tables are to be dropped, and file_path pointed at the directory containing cs_china_1111_drop_table.sql.