Data import and Export in MySQL

Source: Internet
Author: User

This chapter was put together to popularize basic MySQL knowledge: people encountering it for the first time often do not know how to import or export data, or do not understand the options involved, so the details are covered here.

============================================================

Data import

1. Importing data with the mysqlimport command-line tool

When importing data with the mysqlimport command, the name of the data source file must match the name of the target table. If you do not want to rename the file, make a temporary copy of it under the right name, as shown below.

Create a text file users.txt with the following contents:

[screenshot: the contents of users.txt — three comma-separated records of id, name, and email]

Create a table users

[screenshot: the CREATE TABLE statement for the users table (id, name, email)]

Use mysqlimport to import the data from users.txt into the users table:

PS F:\> mysqlimport -u root -p123456 zz --default-character-set=gbk --fields-terminated-by=',' F:\users.txt
zz.users: Records: 3  Deleted: 0  Skipped: 0  Warnings: 0

----------------------------- Verifying ----------------------------------

mysql> select * from users\G
*************************** 1. row ***************************
   id: 1003
 name: Harry
email: [email protected]
*************************** 2. row ***************************
   id: 1001
 name: Zhang San
email: [email protected]
*************************** 3. row ***************************
   id: 1002
 name: John Doe
email: [email protected]

To specify the character that separates the columns, use the --fields-terminated-by option, for example:

[screenshot: mysqlimport run with --fields-terminated-by specifying the column delimiter]

If the delimiter itself appears in a column value, for example 1004#"Li Bai"#"[email protected]", enclose the values in quotes and specify the quoting character with --fields-enclosed-by:

PS F:\> mysqlimport -u root -p7758520 zz --fields-terminated-by='#' --fields-enclosed-by=\" F:\users.txt
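The interaction between --fields-terminated-by and --fields-enclosed-by can be illustrated outside MySQL with Python's csv module. This is only a sketch; the name and email value below are made up for the demonstration:

```python
import csv
import io

# A record using '#' as the field delimiter; the name itself contains a '#',
# so it is enclosed in double quotes (the --fields-enclosed-by character).
line = '1004#"Li#Bai"#"li@example"\n'

# csv.reader with delimiter='#' and quotechar='"' splits the record the same
# way mysqlimport would with the matching options: the quoted '#' is kept as
# data instead of starting a new field.
reader = csv.reader(io.StringIO(line), delimiter='#', quotechar='"')
row = next(reader)
print(row)  # ['1004', 'Li#Bai', 'li@example']
```

Without the quote character, the same line would split into four fields instead of three.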

If a single record spans multiple lines, use --lines-terminated-by=name to specify the row terminator:

PS F:\> mysqlimport -u root -p7758520 zz --fields-terminated-by='#' --fields-enclosed-by=\" --lines-terminated-by='xxx\n' F:\users.txt

2. Importing data with the LOAD DATA statement

The syntax of the LOAD DATA statement is as follows:

LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name'
    [REPLACE | IGNORE]
    INTO TABLE tbl_name
    [CHARACTER SET charset_name]
    [{FIELDS | COLUMNS}
        [TERMINATED BY 'string']
        [[OPTIONALLY] ENCLOSED BY 'char']
        [ESCAPED BY 'char']
    ]
    [LINES
        [STARTING BY 'string']
        [TERMINATED BY 'string']
    ]
    [IGNORE number {LINES | ROWS}]
    [(col_name_or_user_var, ...)]
    [SET col_name = expr, ...]

The syntax looks intimidating at first glance, but it is not that complicated; in most cases it is enough to remember LOAD DATA INFILE 'file_name' INTO TABLE tbl_name. Example:

First create a table sql_users by copying the structure of the users table above:

mysql> create table sql_users as select * from users;
Query OK, 1 row affected (0.06 sec)
Records: 1  Duplicates: 0  Warnings: 0
mysql> truncate table sql_users;
Query OK, 0 rows affected (0.00 sec)
mysql> select * from sql_users;
Empty set (0.00 sec)

The text file sql_users.txt:

1004#Li Bai#[email protected]
1005#Du Mu#[email protected]
1006#Du Fu#[email protected]
1007#Su Shi#[email protected]

Import the data using the LOAD DATA INFILE statement:

mysql> load data infile 'F:\sql_users.txt' into table sql_users fields terminated by '#';
Query OK, 4 rows affected (0.00 sec)
Records: 4  Deleted: 0  Skipped: 0  Warnings: 0
mysql> select * from sql_users;
+------+--------+-------------------+
| id   | name   | email             |
+------+--------+-------------------+
| 1004 | Li Bai | [email protected] |
| 1005 | Du Mu  | [email protected] |
| 1006 | Du Fu  | [email protected] |
| 1007 | Su Shi | [email protected] |
+------+--------+-------------------+
4 rows in set (0.00 sec)

If unrecognized strings turn up during import, the character set is usually the problem; use the CHARACTER SET option to resolve it:

mysql> load data infile 'F:\sql_users.txt' into table sql_users fields terminated by '#';
ERROR 1366 (HY000): Incorrect string value: '\xC0\xEE\xB0\xD7' for column 'name' at row 1
-------------------------------- character set mismatch -----------------------
mysql> load data infile 'F:\sql_users.txt' into table sql_users character set gbk fields terminated by '#';
Query OK, 4 rows affected (0.03 sec)
Records: 4  Deleted: 0  Skipped: 0  Warnings: 0
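The bytes in that error message are in fact the GBK encoding of the name 李白 (Li Bai). A quick Python sketch shows why a connection expecting UTF-8 rejects them, and why declaring GBK fixes it:

```python
# '李白' encoded as GBK produces exactly the bytes from the error message.
data = "李白".encode("gbk")
print(data)  # b'\xc0\xee\xb0\xd7'

# Interpreted as UTF-8 these bytes are invalid, which is why MySQL reports
# "Incorrect string value" when no character set is declared for the file.
try:
    data.decode("utf-8")
except UnicodeDecodeError as e:
    print("invalid as UTF-8:", e.reason)

# Decoding with the correct character set recovers the original text,
# just as adding CHARACTER SET gbk to LOAD DATA does.
print(data.decode("gbk"))  # 李白
```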

By default, the LOAD DATA INFILE command reads the file on the server. To import a file that lives on the client, specify LOCAL; MySQL will then read the data from the client, which is slower than reading on the server because the data must be transferred over the network.

mysql> load data local infile 'F:\sql_users.txt' into table sql_users fields terminated by '#';

To ignore records whose primary key duplicates an existing value, or to replace the duplicates, use the IGNORE or REPLACE option. Note that the IGNORE keyword appears twice in the LOAD DATA INFILE syntax: the first occurrence provides this feature, while the last one specifies how many leading lines to skip.

[screenshot: LOAD DATA INFILE run with the REPLACE and IGNORE options]

If you do not want to import the first n rows of the data file, use IGNORE n LINES to skip them:

[screenshot: LOAD DATA INFILE run with IGNORE n LINES skipping the leading rows]

If records in the data file begin with a common prefix that you do not want to import, use LINES STARTING BY to strip it; note, however, that any line not containing the prefix is ignored entirely:

[screenshot: LOAD DATA INFILE run with LINES STARTING BY stripping a record prefix]

To import an Excel file, first save it in CSV format so that the fields are separated by commas, then process it as above:

[screenshot: a CSV file saved from Excel being imported with FIELDS TERMINATED BY ',']

If column values contain special characters, such as the delimiter itself, handle them with ENCLOSED BY:

[screenshot: LOAD DATA INFILE run with ENCLOSED BY handling a delimiter inside a column value]

There is also the matter of line breaks when importing data. In the examples above, the query results after import look a bit off; you may have noticed.

On Windows, the newline in a text file is "\r\n", whereas on Linux it is "\n". If MySQL splits on "\n" alone, a stray "\r" is left attached to the last field of every record. The fix is to specify the newline explicitly with LINES TERMINATED BY:

mysql> LOAD DATA INFILE 'F:\stu.csv' INTO TABLE stu CHARACTER SET GBK FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;
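The stray carriage return is easy to reproduce in Python (a sketch with made-up records):

```python
# A Windows-style file: records end with "\r\n".
raw = "1004#Li Bai\r\n1005#Du Mu\r\n"

# Splitting on "\n" alone (what happens when the line terminator is not
# specified correctly) leaves "\r" glued to the last field of every record.
bad = [line.split("#") for line in raw.split("\n") if line]
print(bad)  # [['1004', 'Li Bai\r'], ['1005', 'Du Mu\r']]

# Splitting on the full "\r\n" terminator, like LINES TERMINATED BY '\r\n',
# yields clean fields.
good = [line.split("#") for line in raw.split("\r\n") if line]
print(good)  # [['1004', 'Li Bai'], ['1005', 'Du Mu']]
```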

If the table has more columns than the data file, the workaround is to specify the list of columns to import into the table, as follows:

[screenshot: LOAD DATA INFILE run with an explicit column list for the target table]

If the table has fewer columns than the data file, the workaround is to specify user variables to receive the extra column values, as follows:

[screenshot: LOAD DATA INFILE run with user variables absorbing the extra columns]

And if the table and the data file differ both in the number of columns and in some field types, how is that handled? Like this:

------------------ Text ----------------------

PS F:\> more .\stu.csv
School number,name,class,age,gender,school
4010404,Xiang,"A1012",20,male,School of Information
4010405,Showaiji,"A1013",22,female,School of Foreign Languages
4010406,Zhongxiaoxi,"A1014",24,male,School of Accounting
4010407,Zhong,"A1015",26,female,Business School

--------------------processing-------------------------

mysql> desc stu;    -- table structure
+--------+-------------+------+-----+---------+-------+
| Field  | Type        | Null | Key | Default | Extra |
+--------+-------------+------+-----+---------+-------+
| sno    | int(11)     | NO   | PRI | NULL    |       |
| sname  | varchar(30) | YES  |     | NULL    |       |
| class  | varchar(20) | YES  |     | NULL    |       |
| age    | int(11)     | YES  |     | NULL    |       |
| gender | tinyint(4)  | YES  |     | NULL    |       |
+--------+-------------+------+-----+---------+-------+
5 rows in set (0.01 sec)


mysql> LOAD DATA INFILE 'F:\stu.csv' INTO TABLE stu CHARACTER SET GBK FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (sno,sname,class,age,@gender,@x) SET gender=IF(@gender='male',1,0);
Query OK, 4 rows affected (0.09 sec)
Records: 4  Deleted: 0  Skipped: 0  Warnings: 0
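The (column, @variable) list plus the SET clause is essentially a per-row transformation. The same logic written out in Python, as a sketch using the rows from the CSV above:

```python
# Each CSV row has six fields; the table has only five columns.
# @gender is mapped to 1/0 and the last field (school) is discarded,
# mirroring (sno,sname,class,age,@gender,@x) SET gender=IF(@gender='male',1,0).
def transform(row):
    sno, sname, clazz, age, gender, _school = row
    return (int(sno), sname, clazz, int(age), 1 if gender == "male" else 0)

row = ["4010404", "Xiang", "A1012", "20", "male", "School of Information"]
print(transform(row))  # (4010404, 'Xiang', 'A1012', 20, 1)
```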


mysql> SELECT * FROM stu;
+---------+-------------+-------+------+--------+
| sno     | sname       | class | age  | gender |
+---------+-------------+-------+------+--------+
| 4010404 | Xiang       | A1012 |   20 |      1 |
| 4010405 | Showaiji    | A1013 |   22 |      0 |
| 4010406 | Zhongxiaoxi | A1014 |   24 |      1 |
| 4010407 | Zhong       | A1015 |   26 |      0 |
+---------+-------------+-------+------+--------+
4 rows in set (0.00 sec)

Data export

Exporting data is comparatively simple: a SELECT ... INTO OUTFILE statement is all it takes, for example:

mysql> SELECT * FROM stu INTO OUTFILE 'F:\stu_bak.txt' CHARACTER SET GBK FIELDS TERMINATED BY '##' LINES TERMINATED BY '\r\n';
Query OK, 4 rows affected (0.00 sec)

------------------------------- stu_bak.txt -----------------------

PS F:\> more .\stu_bak.txt
4010404##祝小贤##A1012##20##1
4010405##肖小杰##A1013##22##0
4010406##钟小喜##A1014##24##1
4010407##钟小惠##A1015##26##0
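The exported format is nothing more than fields joined by the field terminator and rows joined by the line terminator; a Python sketch using the first two rows from the table above:

```python
rows = [
    (4010404, "祝小贤", "A1012", 20, 1),
    (4010405, "肖小杰", "A1013", 22, 0),
]

# FIELDS TERMINATED BY '##' / LINES TERMINATED BY '\r\n' is equivalent to
# joining each row with '##' and terminating each line with '\r\n'.
dump = "".join("##".join(str(v) for v in r) + "\r\n" for r in rows)
print(dump)
```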


There is also SELECT ... INTO DUMPFILE. This statement likewise exports data to a file, but it cannot format the output with FIELDS or LINES clauses; it writes the raw data straight to the file. Since it can export only one record, it is of limited use.



