MySQL Import and Export


MySQL can export and import data in the following ways:

1) SELECT ... INTO OUTFILE

2) LOAD DATA

3) mysqldump

4) mysqlimport

5) mysql

I. Table-level export and import

The following clauses control how characters are handled:

1) FIELDS TERMINATED BY: defines the field separator; the default is tab.

2) [OPTIONALLY] ENCLOSED BY: defines the character used to quote field values; if OPTIONALLY is specified, only string-type fields are quoted.

3) ESCAPED BY: defines the escape character; the default is '\'.

4) LINES TERMINATED BY: defines the line terminator; on Linux the default is '\n'.

Text file format:

Numbers: may be written in scientific notation.

Strings: special characters inside a string must be prefixed with the escape character so they can be distinguished from the various separators.

Dates: represented as strings in the 2017-05-30 format.

Times: represented as strings in the 15:12:00 format.

Timestamps: treated as integers in the 20170520151200 format.

Null values: assuming '\' is the escape character, \N represents a NULL value in the exported file.
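To make the format concrete, here is a minimal sketch using a hypothetical table demo_fmt (the table, its values, and the output path are all assumptions, not from the original examples); the comments show how the exported line would look with the default settings:

```sql
-- Hypothetical table illustrating the value formats described above.
create table demo_fmt (d date, t time, s varchar(20), n int);
insert into demo_fmt values ('2017-05-30', '15:12:00', 'ab', null);
select * into outfile '/datas/demo_fmt.txt' from demo_fmt;
-- With the defaults (tab-separated fields, '\' as the escape character),
-- the file would contain one line of the form:
--   2017-05-30<TAB>15:12:00<TAB>ab<TAB>\N
-- i.e. the NULL column is written as the two characters \N, and any tab or
-- backslash inside a string value would be prefixed with '\'.
```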

1. Export table data using SELECT ... INTO OUTFILE

SELECT ... INTO OUTFILE performs a table-level export; the output file must not already exist.

For example:

mysql> select * into outfile '/datas/column_charset_00.sql' from column_charset;

mysql> select * into outfile '/datas/column_charset_01.sql' fields terminated by ',' from column_charset;

mysql> select * into outfile '/datas/column_charset_03.sql' fields terminated by ',' lines terminated by '\n' from column_charset;

mysql> select * into outfile '/datas/column_charset_04.sql' fields terminated by ',' optionally enclosed by '"' lines terminated by '\n' from column_charset;

2. Import data into a table using LOAD DATA

The SELECT ... INTO OUTFILE command exports table data to a file; LOAD DATA is its counterpart, importing data from a file into a table.

The LOCAL modifier: if the file to import resides on the server, use LOAD DATA INFILE ...; otherwise, use LOAD DATA LOCAL INFILE ... to import a file located on the client into the table.

When importing data with LOAD DATA, pay attention to the MySQL server's character set settings. In MySQL versions where LOAD DATA INFILE does not support specifying a character set, MySQL assumes the import file uses the character set given by character_set_database; if that parameter is not set, the character set given by character_set_server is used. You can change these settings with SET character_set_database or SET NAMES, or specify the character set directly in the LOAD DATA command.
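Before running LOAD DATA, the session character set can be aligned with the file in any of the three ways just described; a minimal sketch (utf8 is an assumed target character set, not a recommendation):

```sql
-- Option 1: point character_set_database at the file's character set
-- for this session only.
set character_set_database = utf8;

-- Option 2: set the client, connection, and results character sets in one step.
set names utf8;

-- Option 3: name the character set directly in the LOAD DATA statement,
-- as the examples below do with CHARACTER SET utf8.
```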

For example:

mysql> load data infile '/datas/column_charset_00.sql' into table column_charset;

-- Import data using the default character set given by character_set_database.

mysql> load data infile '/datas/column_charset_00.sql' into table column_charset character set utf8;

-- Import a tab-delimited file using the character set named in the command.

mysql> load data infile '/datas/column_charset_01.sql' into table column_charset character set utf8 fields terminated by ',';

-- Import a ','-delimited file into the table.

mysql> load data infile '/datas/column_charset_03.sql' into table column_charset character set utf8 fields terminated by ',' lines terminated by '\n';

-- Import a file with ','-delimited columns and '\n'-terminated lines.

mysql> load data infile '/datas/column_charset_04.sql' into table column_charset character set utf8 fields terminated by ',' enclosed by '"' lines terminated by '\n';

Or:

mysql> load data infile '/datas/column_charset_04.sql' into table column_charset character set utf8 fields terminated by ',' optionally enclosed by '"' lines terminated by '\n';

-- Import a file with ','-delimited columns, '\n'-terminated lines, and fields enclosed in '"'.

mysql> load data infile '/datas/column_charset_04.sql' into table column_charset character set utf8 fields terminated by ',' optionally enclosed by '"' lines terminated by '\n' (c1,c2,c3,c4);

-- Specify the order of the columns being imported.

mysql> load data infile '/datas/column_charset_04.sql' into table column_charset character set utf8 fields terminated by ',' optionally enclosed by '"' lines terminated by '\n' (c1,c2,c3,c4) set c1='dbking';

-- LOAD DATA can also set a column to a given value.

mysql> load data local infile '/datas/column_charset_04.sql' into table column_charset character set utf8 fields terminated by ',' optionally enclosed by '"' lines terminated by '\n' (c1,c2,c3,c4) set c1='dbking';

mysql> load data local infile '/datas/column_charset_04.sql' replace into table column_charset character set utf8 fields terminated by ',' optionally enclosed by '"' lines terminated by '\n' (c1,c2,c3,c4) set c1='chavin';

-- Import a client-side local file into the table.

LOAD DATA optimization:

Compared with inserting rows through ordinary SQL statements, LOAD DATA imports much faster, typically tens of thousands of records per second. For even higher speed, apply the following optimizations, described separately for the InnoDB and MyISAM storage engines.

For InnoDB, the recommended optimizations are:

1) Set innodb_buffer_pool_size to a larger value.

2) Set innodb_log_file_size to a larger value.

3) Ignore uniqueness checks on secondary indexes: SET unique_checks=0.

4) Ignore foreign key checks: SET foreign_key_checks=0.

5) Disable binary logging: SET sql_log_bin=0.

6) Import data in primary key order.

7) For InnoDB tables, SET autocommit=0 before importing.

8) Cut large files into multiple small files for import, e.g. with split.
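The session-level settings in the list above can be combined into a single fast-import session; a minimal sketch, reusing the file and table names from the earlier LOAD DATA examples (the buffer pool and log file sizes are server configuration and must be set in my.cnf, not here):

```sql
-- Relax per-row checks and logging for the duration of the import.
set unique_checks = 0;       -- skip uniqueness checks on secondary indexes
set foreign_key_checks = 0;  -- skip foreign key checks
set sql_log_bin = 0;         -- do not write the import to the binary log
set autocommit = 0;          -- commit once at the end, not per statement

load data infile '/datas/column_charset_00.sql' into table column_charset;
commit;

-- Restore the defaults afterwards.
set unique_checks = 1;
set foreign_key_checks = 1;
set sql_log_bin = 1;
set autocommit = 1;
```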

For MyISAM, the recommended optimizations are:

1) Set bulk_insert_buffer_size, myisam_sort_buffer_size, and key_buffer_size to larger values.

2) First disable keys (ALTER TABLE ... DISABLE KEYS), then import the data, then re-enable keys (ALTER TABLE ... ENABLE KEYS). When keys are re-enabled, the indexes are rebuilt in bulk, which is much more efficient than maintaining them row by row. Note that ALTER TABLE ... DISABLE KEYS disables only non-unique indexes; unique indexes and the primary key cannot be disabled unless you drop them manually.

3) Tab-separated files are easier for LOAD DATA to parse and import faster than other formats.

II. Exporting data using mysqldump

mysqldump generally exports SQL files, also called dump files; files exported by mysqldump can be imported with the mysql client or the mysqlimport tool.

For example:

Export the chavin database:

mysqldump -uroot -pmysql chavin > chavin00.sql

mysqldump -uroot -pmysql --complete-insert --force --insert-ignore --add-drop-database --hex-blob --databases chavin > chavin02.sql

Export selected tables from the chavin database:

mysqldump -uroot -pmysql chavin --tables column_charset column_collate > chavin01.sql

Export the chavin database with schema and data in separate files:

mysqldump -uroot -pmysql --tab=/datas/chavin00 chavin

Export the chavin database, schema and data separated, with data fields delimited by ',':

mysqldump -uroot -pmysql --tab=/datas/chavin01 --fields-terminated-by=',' --fields-enclosed-by='"' chavin

Export all databases:

mysqldump -uroot -pmysql --all-databases --add-drop-database > db00.sql

Export data in XML format:

mysqldump -uroot -pmysql --xml chavin > chavin03.xml

Export the database with compression:

mysqldump -uroot -pmysql --hex-blob chavin | gzip > chavin04.sql.gz

Export a full backup of all databases:

mysqldump -uroot -pmysql --flush-logs --master-data=2 --hex-blob -R -E -f --all-databases 2>>/datas/full-log | gzip > mysql-full.gz

Export Data Structures only:

mysqldump -uroot -pmysql -d --add-drop-table chavin > chavin11.sql

mysqldump -uroot -pmysql --no-data --add-drop-table chavin > chavin12.sql

III. Importing data exported by mysqldump

1. Use the mysql command-line client to import files exported by mysqldump.

For example:

Import the file chavin00.sql:

mysql -uroot -pmysql restore01 < chavin00.sql

To import a compressed export file:

gzip -dc chavin04.sql.gz | mysql -uroot -pmysql chavin08

Import a file while keeping the client, connection, and file character sets consistent:

mysql -uroot -pmysql --default-character-set=utf8 restore02 < chavin00.sql

2. The mysqlimport tool can also be used to import data.

3. Restore data using the source command

mysql> source /datas/chavin10.sql

IV. Exporting data using the mysql tool's batch mode

1. Export the column_charset table:

mysql -uroot -poracle -h192.168.108.128 -P3306 --batch --default-character-set=utf8 -e "select * from chavin.column_charset;" > output.txt

mysql -uroot -poracle -h192.168.108.128 -P3306 --default-character-set=utf8 --batch --execute="select * from column_charset;" chavin > output03.txt

mysql -uroot -poracle -h192.168.108.128 -P3306 --default-character-set=utf8 --batch -e "select * from column_charset;" chavin > output04.txt

2. Vertical display of query results

mysql -uroot -poracle -h192.168.108.128 -P3306 --default-character-set=utf8 --vertical --execute="select * from chavin.column_charset;" > output00.txt

3. Generate HTML format output

mysql -uroot -poracle -h192.168.108.128 -P3306 --default-character-set=utf8 --html --execute="select * from chavin.column_charset;" > output01.html

4. Generate output in XML format

mysql -uroot -poracle -h192.168.108.128 -P3306 --default-character-set=utf8 --xml --execute="select * from chavin.column_charset;" > output02.xml

V. Cutting data files with the operating system split tool

The split command cuts a file into pieces. The -l parameter specifies how many lines go into each piece; if it is not specified, the default is 1000 lines per piece.

# split -l <lines> output03.txt split/output_split_sub_

# ll split/

total 20

-rw-r--r-- 1 root root 880 Jan 05:39 output_split_sub_aa

-rw-r--r-- 1 root root 896 Jan 05:39 output_split_sub_ab

-rw-r--r-- 1 root root 896 Jan 05:39 output_split_sub_ac

-rw-r--r-- 1 root root 896 Jan 05:39 output_split_sub_ad

-rw-r--r-- 1 root root 05:39 output_split_sub_ae

After cutting a large file into small files, you can improve throughput by importing the pieces in parallel from multiple clients.
