Data Import and Export in MySQL

QQ group: 288567073, no commercial ads, daily e-books + video sharing

Lychee FM mobile client: search for "Kick-off Talk Show" to subscribe to us

Video summary homepage: http://edu.51cto.com/lecturer/index/user_id-4626073.html


To help popularize the basics of MySQL, I wrote this chapter specifically because I found that people encountering these tools for the first time did not know where to start, or could not make sense of them. So let's go into the details here.

==================================================================

Data Import

1. Importing data with the mysqlimport command-line tool

When importing data with the mysqlimport command, the base name of the data file must match the name of the target table. If you do not want to rename the original file, copy it to a temporary file with the right name, as in the following example.

Create an example users.txt with the following content:

(Screenshot of users.txt; from the verification output below, its comma-separated contents are:)

1001,Zhang San,zhangsan@163.com
1002,Li Si,lisi@hotmail.com
1003,Wang Wu,wangwu@163.com


Create a table users

(Screenshot: the users table is created with columns id, name, and email, matching the file above.)

Use mysqlimport to import the data from users.txt into the users table:

PS F:\> mysqlimport -u root -p123456 zz --default-character-set=gbk --fields-terminated-by=',' f:\users.txt
zz.users: Records: 3  Deleted: 0  Skipped: 0  Warnings: 0

----------------------------- verify --------------------------------
mysql> select * from users \G
*************************** 1. row ***************************
   id: 1003
 name: Wang Wu
email: wangwu@163.com
*************************** 2. row ***************************
   id: 1001
 name: Zhang San
email: zhangsan@163.com
*************************** 3. row ***************************
   id: 1002
 name: Li Si
email: lisi@hotmail.com

The --fields-terminated-by option specifies the delimiter between columns.


If the delimiter itself appears inside a column value, for example 1004#"Li#Bai"#"libai@hotmail.com", enclose the values in quotes and tell mysqlimport about the quoting character with --fields-enclosed-by:

PS F:\> mysqlimport -u root -p7758520 zz  --fields-terminated-by='#' --fields-enclosed-by=\"  f:\users.txt

If the lines of the file end with something other than the default newline, use --lines-terminated-by to specify the line terminator.

PS F:\> mysqlimport -u root -p7758520 zz  --fields-terminated-by='#' --fields-enclosed-by=\"  --lines-terminated-by='xxx\n' f:\users.txt

2. Importing data with the LOAD DATA statement

The usage syntax of the Load Data statement is as follows:

LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name'
    [REPLACE | IGNORE]
    INTO TABLE tbl_name
    [CHARACTER SET charset_name]
    [{FIELDS | COLUMNS}
        [TERMINATED BY 'string']
        [[OPTIONALLY] ENCLOSED BY 'char']
        [ESCAPED BY 'char']
    ]
    [LINES
        [STARTING BY 'string']
        [TERMINATED BY 'string']
    ]
    [IGNORE number {LINES | ROWS}]
    [(col_name_or_user_var,...)]
    [SET col_name = expr,...]

The full syntax looks intimidating at first, but it is not as complicated as its length suggests. In most cases you only need to remember LOAD DATA INFILE 'file_name' INTO TABLE tbl_name. For example:

First, create a table sql_users with the same structure as the users table above.

mysql> create table sql_users as select * from users;
Query OK, 1 row affected (0.06 sec)
Records: 1  Duplicates: 0  Warnings: 0

mysql> truncate table sql_users;
Query OK, 0 rows affected (0.00 sec)

mysql> select * from sql_users;
Empty set (0.00 sec)

sql_users.txt:

1004#Li Bai#libai@hotmail.com
1005#Du Fu#dumu@hotmail.com
1006#dufu@hotmail.com
1007#Su Shi#sushi@hotmail.com

Use the LOAD DATA INFILE statement to import the data:

mysql> load data infile 'f:\sql_users.txt' into table sql_users fields terminated by '#';
Query OK, 4 rows affected (0.00 sec)
Records: 4  Deleted: 0  Skipped: 0  Warnings: 0

mysql> select * from sql_users;
+------+--------+-------------------+
| id   | name   | email             |
+------+--------+-------------------+
| 1004 | Li Bai | libai@hotmail.com |
| 1005 | Du Fu  | dumu@hotmail.com  |
| 1007 | Su Shi | sushi@hotmail.com |
+------+--------+-------------------+
4 rows in set (0.00 sec)

If an incorrect string value error occurs during import because the file's character set differs from the table's, use the CHARACTER SET option to solve the problem.

mysql> load data infile 'f:\sql_users.txt' into table sql_users fields terminated by '#';
ERROR 1366 (HY000): Incorrect string value: '\xC0\xEE\xB0\xD7' for column 'name' at row 1

-------------------------------- the character set is different ---------------------
mysql> load data infile 'f:\sql_users.txt' into table sql_users character set gbk fields terminated by '#';
Query OK, 4 rows affected (0.03 sec)
Records: 4  Deleted: 0  Skipped: 0  Warnings: 0

By default, LOAD DATA INFILE reads the file on the server. To load a file that lives on the client, specify LOCAL and MySQL will read it from the client instead. This is a little slower than reading the file on the server, because the data has to travel over the network.

mysql> load data local infile 'f:\sql_users.txt' into table sql_users  fields terminated by '#';

To IGNORE records whose primary key duplicates an existing row, or to REPLACE the existing rows with the duplicates, use the IGNORE or REPLACE option. Note that the LOAD DATA INFILE syntax contains two IGNORE keywords: the one before INTO TABLE handles duplicate keys, while the later IGNORE number LINES skips the first N lines of the file.
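As a sketch (reusing the sql_users table and file from above), REPLACE before INTO TABLE overwrites rows whose primary key already exists; writing IGNORE in the same position would keep the existing rows instead:

```sql
-- Rows in sql_users.txt whose id already exists in the table replace the old rows.
-- With IGNORE instead of REPLACE, the old rows would be kept and the file's skipped.
LOAD DATA INFILE 'f:\sql_users.txt'
  REPLACE INTO TABLE sql_users
  FIELDS TERMINATED BY '#';
```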


If you do not want to import the first N lines of the data file, use IGNORE N LINES.
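For example, to skip a one-line header at the top of the file (a sketch based on the sql_users example above):

```sql
-- The first line of the file is read but not loaded into the table.
LOAD DATA INFILE 'f:\sql_users.txt' INTO TABLE sql_users
  FIELDS TERMINATED BY '#'
  IGNORE 1 LINES;
```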


If each line of the data file starts with a prefix you do not want to import, use LINES STARTING BY. Be aware, however, that any line that does not contain the prefix at all will be skipped entirely.
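A sketch, assuming each data line is prefixed with the marker xxx:

```sql
-- Everything up to and including 'xxx' on each line is discarded before parsing;
-- lines containing no 'xxx' at all are skipped entirely.
LOAD DATA INFILE 'f:\sql_users.txt' INTO TABLE sql_users
  FIELDS TERMINATED BY '#'
  LINES STARTING BY 'xxx';
```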


To load data maintained in an Excel file, first save the sheet in CSV format, then import it with a comma as the field delimiter.
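A sketch of the import, assuming the sheet was saved as f:\stu.csv for the stu table used later in this chapter:

```sql
-- Excel's CSV output is comma-separated; the file is in GBK on a Chinese Windows system.
LOAD DATA INFILE 'f:\stu.csv' INTO TABLE stu
  CHARACTER SET gbk
  FIELDS TERMINATED BY ',';
```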


If column values in the data file contain special characters, such as the field delimiter itself, enclose them in quotes and handle them with ENCLOSED BY.
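A sketch, assuming values in sql_users.txt are wrapped in double quotes:

```sql
-- A quoted value such as "Li#Bai" may safely contain the '#' field delimiter.
LOAD DATA INFILE 'f:\sql_users.txt' INTO TABLE sql_users
  FIELDS TERMINATED BY '#' ENCLOSED BY '"';
```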


Mind the line terminator during import. In the preceding examples, the query results looked a little odd after importing, because of mismatched line endings: in Windows, lines in text files end with "\r\n", while on Linux they end with "\n". When this problem occurs, the solution is to specify the terminator with LINES TERMINATED BY.

mysql> LOAD DATA INFILE 'F:\stu.csv' INTO TABLE stu CHARACTER SET GBK FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;

If the number of columns in the table differs from the number in the data file, the solution is to specify which columns the file should populate, as follows:
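A sketch using the stu table from the example below, assuming the file supplies only the first three columns; unlisted columns fall back to their defaults:

```sql
-- Only sno, sname, and class are read from the file; age and gender get defaults.
LOAD DATA INFILE 'f:\stu.csv' INTO TABLE stu
  FIELDS TERMINATED BY ','
  (sno, sname, class);
```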


If the table has fewer columns than the data file, you can assign the extra column values to user variables so they are read but not stored.
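A sketch using the stu table from the example below, where the file carries one trailing column the table does not have; a user variable absorbs it:

```sql
-- @extra receives the surplus column from each line and is simply discarded.
LOAD DATA INFILE 'f:\stu.csv' INTO TABLE stu
  FIELDS TERMINATED BY ','
  (sno, sname, class, age, gender, @extra);
```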


What if the table and the data file differ both in the number of columns and in the type of some fields? This can be handled as follows:

------------------ the data file ----------------------

PS F:\> more .\stu.csv
Student ID,Name,Class,Age,Gender,School
4010404,Zhu Xiaoxian,"A1012",20,male,Information Institute
4010405,Xiao Xiaojie,"A1013",22,female,Foreign Languages Institute
4010406,Zhong Xiaoxi,"A1014",24,male,Accounting Institute
4010407,Zhong Xiaohui,"A1015",26,female,Business School

-------------------- processing -------------------------

mysql> desc stu;    -- table structure
+--------+-------------+------+-----+---------+-------+
| Field  | Type        | Null | Key | Default | Extra |
+--------+-------------+------+-----+---------+-------+
| sno    | int(11)     | NO   | PRI | NULL    |       |
| sname  | varchar(30) | YES  |     | NULL    |       |
| class  | varchar(20) | YES  |     | NULL    |       |
| age    | int(11)     | YES  |     | NULL    |       |
| gender | tinyint(4)  | YES  |     | NULL    |       |
+--------+-------------+------+-----+---------+-------+
5 rows in set (0.01 sec)


mysql> load data infile 'f:\stu.csv' into table stu character set gbk fields terminated by ',' enclosed by '"' lines terminated by '\r\n' ignore 1 lines (sno, sname, class, age, @gender, @x) set gender = if(@gender = 'male', 1, 0);
Query OK, 4 rows affected (0.09 sec)
Records: 4  Deleted: 0  Skipped: 0  Warnings: 0


mysql> select * from stu;
+---------+---------------+-------+------+--------+
| sno     | sname         | class | age  | gender |
+---------+---------------+-------+------+--------+
| 4010404 | Zhu Xiaoxian  | A1012 |   20 |      1 |
| 4010405 | Xiao Xiaojie  | A1013 |   22 |      0 |
| 4010406 | Zhong Xiaoxi  | A1014 |   24 |      1 |
| 4010407 | Zhong Xiaohui | A1015 |   26 |      0 |
+---------+---------------+-------+------+--------+
4 rows in set (0.00 sec)

Data Export

Data export is comparatively simple: just use the SELECT ... INTO OUTFILE statement, for example:

mysql> select * from stu into outfile 'F:\stu_bak.txt' character set gbk fields terminated by '#' lines terminated by '\r\n';
Query OK, 4 rows affected (0.00 sec)

--------------------------------- stu_bak.txt -----------------------
PS F:\> more .\stu_bak.txt
4010404#Zhu Xiaoxian#A1012#20#1
4010405#Xiao Xiaojie#A1013#22#0
4010406#Zhong Xiaoxi#A1014#24#1
4010407#Zhong Xiaohui#A1015#26#0


There is also a SELECT ... INTO DUMPFILE statement, which exports data to a file but does not accept formatting clauses such as FIELDS and LINES; it writes the raw data straight to the file. It can only output a single row, however, so it is of limited use.
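A sketch (the file name stu_dump.txt is illustrative); LIMIT 1 keeps the result to the single row DUMPFILE allows:

```sql
-- Writes one row verbatim, with no field or line delimiters added.
SELECT * FROM stu LIMIT 1 INTO DUMPFILE 'F:\stu_dump.txt';
```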

