Slow MySQL Data Import Solution

Source: Internet
Author: User
Tags: mysql, import

In MySQL:
mysql> use test;
mysql> set names utf8;
mysql> source D:/ceshi.sql
The data import is quite slow.

"You can save the EXCEL file as a csv file and use the load data method. This is faster than insert"

This method has not been tried
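
For reference, a minimal sketch of what the LOAD DATA approach might look like, assuming the Excel sheet was saved as D:/ceshi.csv with a header row and is being loaded into a hypothetical table t1 (the file path, table name, and CSV layout are all assumptions):

mysql> load data local infile 'D:/ceshi.csv' into table t1
       character set utf8
       fields terminated by ',' optionally enclosed by '"'
       lines terminated by '\r\n'
       ignore 1 lines;

The ignore 1 lines clause skips the header row; drop it if the CSV has none, and note that LOCAL requires local_infile to be enabled.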

Change the innodb_flush_log_at_trx_commit parameter to 0 and then restart the database; the import runs much faster than before.

This works.
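
If you try this, a minimal sketch of the change, assuming the server reads its options from my.cnf (my.ini on Windows), under the [mysqld] section:

innodb_flush_log_at_trx_commit = 0

After restarting, confirm the value:

mysql> show variables like 'innodb_flush_log_at_trx_commit';

Set it back to 1 once the import is done if you need full crash durability, since with 0 the log is flushed to disk only about once per second.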


Export (Backup): mysqldump -u username -p databasename > exportfilename

Import (Restore): Method 1: mysql -u username -p databasename < exportfilename

Method 2: enter the MySQL console, run use databasename, and then run: source importfilename
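
For example, with a hypothetical database testdb and dump file testdb.sql:

mysqldump -u root -p testdb > testdb.sql        (backup)
mysql -u root -p testdb < testdb.sql            (restore, method 1)
mysql> use testdb;                              (restore, method 2)
mysql> source /path/to/testdb.sql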

Importing data is very slow
The database was exported from JQ1 and imported into JQ2 (the exported data file is about 90 MB). Both methods above were tried, but they either failed or were extremely slow (estimated to take 1 or 2 days to finish).

Solution:

Check the MySQL parameters on JQ2:
show variables like 'max_allowed_packet';
show variables like 'net_buffer_length';
The two results are 1047552 and 16384, respectively.

Export the data from JQ1:
mysqldump -uroot -pXXX databasename --skip-opt --create-options --set-charset --default-character-set=gbk -e --max_allowed_packet=1047552 --net_buffer_length=16384 > exported-file-path-and-filename

Note: max_allowed_packet and net_buffer_length must not be larger than the values configured on the target database; otherwise, an error may occur.

-e uses the multi-row INSERT syntax, where each INSERT statement includes several VALUES lists;
--max_allowed_packet=XXX sets the maximum size of the buffer for client/server communication;
--net_buffer_length=XXX sets the TCP/IP and socket communication buffer size; rows are written in statements up to net_buffer_length bytes long.

On the server side, net_buffer_length specifies the initial size of the buffer that holds SQL statements sent by clients. If an incoming statement is larger than the buffer, the buffer is automatically enlarged, up to max_allowed_packet.
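
For illustration, with -e the dump file contains multi-row statements such as the following (table and values are made up), rather than one INSERT per row:

insert into t1 values (1,'a'),(2,'b'),(3,'c');

mysqldump keeps each such statement within roughly net_buffer_length bytes, which is why that value must not exceed the target server's limits.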

Import the exported data into JQ2:
./mysql -uroot -pXXX --default-character-set=gbk databasename < exported-file-path-and-filename
The import runs much faster than executing the SQL statements one at a time.
