Import and export of MySQL tables

Source: Internet
Author: User
Tags: types of tables

⒈ Exporting with SELECT ... INTO OUTFILE / importing with LOAD DATA INFILE
⑴ Create a new table and define its structure
Ⅰ Method One: within the same database
① copy both the table structure and the data to a new table
CREATE TABLE newtable SELECT * FROM oldtable;
② copy only the table structure to the new table (WHERE 1=2 matches no rows, so no data is copied)
CREATE TABLE newtable SELECT * FROM oldtable WHERE 1=2;
Ⅱ Method Two: works between databases, or within a single database
① copy only the table structure to the new table
Export:
mysqldump -uroot -pdbpasswd -d dbname [tablename] > createtable.sql
Import:
mysql> source /path/createtable.sql    or    mysql -u root -p dbname < createtable.sql
mysql> source /usr/yzx_loadtest/createbus_dt.sql;
Query OK, 0 rows affected (0.01 sec)
⑵ Export/import table data
Export data:
SELECT * FROM mytbl INTO OUTFILE '/tmp/mytbl2.txt' FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
SELECT * FROM bus_dt INTO OUTFILE '/usr/yzx_loadtest/bus_dtbackdata.txt' FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
Import data (general syntax):
LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name.txt'
    [REPLACE | IGNORE]
    INTO TABLE tbl_name
    [FIELDS [TERMINATED BY 'string'] [[OPTIONALLY] ENCLOSED BY 'char'] [ESCAPED BY 'char']]
    [LINES [STARTING BY 'string'] [TERMINATED BY 'string']]
LOAD DATA INFILE '/usr/yzx_loadtest/bus_dtbackdata.txt' INTO TABLE bus_dt FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
About NULL values:
A NULL value is written to the export file as \N, and is read back as NULL on import.
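To make the NULL handling concrete, here is a small sketch against the table used above (the table and path are taken from this article's examples; the exact file contents depend on your column types):

```sql
-- NULL round-trips as the two-character sequence \N, not as an empty string
SELECT * FROM bus_dt INTO OUTFILE '/tmp/null_demo.txt'
  FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
-- In /tmp/null_demo.txt a NULL column appears as \N (unquoted);
-- LOAD DATA INFILE with the same FIELDS/LINES clauses reads \N back as NULL.
```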

⑶ How to reduce import/export time for large tables


Ⅰ For MyISAM tables, large amounts of data can be imported quickly as follows:
ALTER TABLE tblname DISABLE KEYS;
-- load the data --
ALTER TABLE tblname ENABLE KEYS;
These two commands turn updating of the table's non-unique indexes off and back on. When importing a large amount of data into a non-empty MyISAM table, wrapping the load between them improves import efficiency. When importing into an empty MyISAM table, MySQL by default loads the data first and builds the indexes afterwards, so this setting is unnecessary.
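A minimal sketch of the whole sequence, using the table and file names from this article's examples (assumed here to be a MyISAM table):

```sql
-- Skip non-unique index maintenance during the bulk load
ALTER TABLE bus_dt DISABLE KEYS;
LOAD DATA INFILE '/usr/yzx_loadtest/bus_dtbackdata.txt' INTO TABLE bus_dt
  FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
-- Rebuild the non-unique indexes in a single pass
ALTER TABLE bus_dt ENABLE KEYS;
```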


Ⅱ For InnoDB tables, there are several ways to improve import efficiency:
① Because InnoDB tables are stored in primary-key order, arranging the imported data in primary-key order effectively speeds up the import. If an InnoDB table has no explicit primary key, an internal column is created as the primary key by default, so if you can define a primary key for the table yourself, you can exploit this ordering to improve import efficiency.
② Run SET unique_checks=0 before importing to turn off uniqueness checks, and SET unique_checks=1 after the import completes to restore them; this improves import efficiency.
③ If the application uses autocommit, it is recommended to run SET autocommit=0 before the import to turn autocommit off, and SET autocommit=1 after the import to turn it back on, which also improves import efficiency.
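Points ② and ③ can be combined around a single bulk load. A sketch, again reusing this article's table and file names (assumed here to be an InnoDB table):

```sql
SET unique_checks=0;   -- skip uniqueness checks during the load
SET autocommit=0;      -- commit once at the end instead of per statement
LOAD DATA INFILE '/usr/yzx_loadtest/bus_dtbackdata.txt' INTO TABLE bus_dt
  FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
COMMIT;
SET autocommit=1;      -- restore the original settings
SET unique_checks=1;
```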


Ⅲ For tables at the level of tens of millions or billions of rows, the data file needs to be split before importing.
Why? LOAD DATA INFILE must write to the binary log; with master-slave replication this causes replication lag, and if the import fails, rolling back one huge statement also takes time. Splitting into smaller files limits the cost of any single failure.
To check the number of lines in the data file:
[... yzx_loadtest]# wc -l bus_dtbackdata.txt
1021 bus_dtbackdata.txt
Splitting with split:
[... yzx_loadtest]# split -l 500 bus_dtbackdata.txt bus_dtbackdata.txt001
[... yzx_loadtest]# ls -l
total 48
-rw-rw-rw- 1 mysql mysql 17247 Apr  9 11:45 bus_dtbackdata.txt
-rw-r--r-- 1 root  root   7907 Apr  9 13:40 bus_dtbackdata.txt001aa
-rw-r--r-- 1 root  root   8939 Apr  9 13:40 bus_dtbackdata.txt001ab
-rw-r--r-- 1 root  root    401 Apr  9 13:40 bus_dtbackdata.txt001ac
-rw-r--r-- 1 root  root   1640 Apr  9 11:50 createbus_dt.sql

After splitting, import each piece:
LOAD DATA INFILE '/usr/yzx_loadtest/bus_dtbackdata.txt001aa' INTO TABLE bus_dt FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
LOAD DATA INFILE '/usr/yzx_loadtest/bus_dtbackdata.txt001ab' INTO TABLE bus_dt FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
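The splitting step can be sketched end-to-end with ordinary shell tools. The file here is a generated stand-in for the real 1021-line data file, and the chunk size of 500 lines is illustrative:

```shell
# Stand-in for bus_dtbackdata.txt: a file with 1021 lines
seq 1 1021 > sample.txt

# Split into 500-line chunks named sample.txt001aa, sample.txt001ab, ...
split -l 500 sample.txt sample.txt001

ls sample.txt001*

# Verify the chunks add back up to the original line count
cat sample.txt001* | wc -l
```

Each chunk can then be fed to its own LOAD DATA INFILE statement, so a failure part-way through only costs one chunk's worth of work.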

Note: you must ensure that the importing database uses the same character set as the exporting database:
SHOW VARIABLES LIKE 'char%';
SET ... = gbk;
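For example, if both sides use gbk, the client, connection, and results character sets can be set in one statement (adjust the character set name to match your databases):

```sql
SHOW VARIABLES LIKE 'char%';
SET NAMES gbk;  -- sets character_set_client/connection/results together
```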

This article is from the "Wind Water" blog, please be sure to keep this source http://linuxybird.blog.51cto.com/5689151/1638053

