Php+mysql BULK INSERT Data related issues

Source: Internet
Author: User
Tags: bulk insert, mysql, command line
I wrote some code to bulk-import data.
The process reads rows from one table and inserts them into a new table.
On the first attempt, I read all the rows and inserted them in one go. The result was that the amount of data was too large and the script failed.

On the second attempt, I changed the code to import 150 rows at a time: each time the user clicks "Continue", the next 150 rows are imported. Repeating this, I found that an error usually occurs somewhere between 450 and 600 rows. But if I wait a few seconds after importing 300 rows and then click "Continue", the import works fine.

Could the experts here suggest the best way to bulk-import a large amount of data from one table into another?

------Solution--------------------
Find max_execution_time in the php.ini file and change it to a larger value.
Estimate how long your import takes and set an appropriate value.
I did this a while ago, and I set it to 30000.
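The same limit can also be raised from inside the script, without editing php.ini. A minimal sketch using standard PHP functions (the 30000 figure is just the value mentioned above):

```php
<?php
// Remove the PHP execution time limit for this script only.
set_time_limit(0);

// Or, equivalently, give the script a large fixed time budget (seconds).
ini_set('max_execution_time', '30000');

echo ini_get('max_execution_time'), "\n";
```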
------Solution--------------------
Typically, you export with the MySQL command-line tools and then import the same way (mysqldump, mysqlimport, source ...).
Or run the export/import through phpMyAdmin (when uploading large files, watch the relevant php.ini settings: memory_limit, upload_max_filesize ...).
------Solution--------------------
Connect to MySQL, and close the connection promptly after each insert operation finishes.
The problem may be related to the number of open MySQL connections.
------Solution--------------------
Learn!!
------Solution--------------------
Learn

------Solution--------------------
This situation shouldn't occur; your code is probably just not configured correctly.
Generally, you set the PHP timeout to 0
with set_time_limit(0);
and then insert each row as you read it.
But inside the loop, remember to control memory use.
Also, if the records contain large text fields, it is better to generate a SQL file and then import it with source.
I have used a read-and-insert loop on as many as a million records with basically no problem, done in one pass. With only a few hundred rows like yours, it certainly shouldn't fail.
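The batched copy described in this thread can be sketched as follows. This is only an illustration, assuming a mysqli connection and hypothetical tables `old_table` / `new_table` with columns `id` and `name`; the helper builds one multi-row INSERT per batch so each round trip carries many rows (real code should use prepared statements rather than `addslashes` for escaping):

```php
<?php
// Build a single multi-row INSERT statement for one batch of rows.
// $columns and $rows are plain arrays; escaping here is demo-only.
function build_insert_sql(string $table, array $columns, array $rows): string
{
    $tuples = [];
    foreach ($rows as $row) {
        $vals = array_map(
            fn($v) => "'" . addslashes((string)$v) . "'",
            $row
        );
        $tuples[] = '(' . implode(', ', $vals) . ')';
    }
    return sprintf(
        'INSERT INTO %s (%s) VALUES %s',
        $table,
        implode(', ', $columns),
        implode(', ', $tuples)
    );
}

/*
// Driver loop (requires a live MySQL connection; table and credential
// names are placeholders):
$db = new mysqli('localhost', 'user', 'pass', 'test');
set_time_limit(0);                       // no PHP timeout, as suggested above
$offset = 0;
$batch  = 150;                           // batch size from the question
while (true) {
    $res  = $db->query("SELECT id, name FROM old_table LIMIT $offset, $batch");
    $rows = $res->fetch_all(MYSQLI_NUM);
    $res->free();                        // keep memory flat inside the loop
    if (!$rows) {
        break;
    }
    $db->query(build_insert_sql('new_table', ['id', 'name'], $rows));
    $offset += $batch;
}
$db->close();                            // exit promptly when done
*/
```

One multi-row INSERT per batch is far cheaper than one INSERT per row, and freeing each result set keeps memory use roughly constant no matter how many batches run.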
------Solution--------------------
Use MySQL Enterprise Edition Manager, the one that supports large data volumes.