Asynchronous PHP import of large amounts of data

I need to import Excel data into a data table, but the data volume is large: at most, tens of thousands of records have to be imported at a time. The imported data is not written to the database directly; it goes through an interface, and that interface can process at most 1,000 records per call, which takes a long time. My idea is to turn this into a task-based import: each import creates an import task, all tasks go into a queue, and the queue is processed in sequence. What are the best methods and suggestions?
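For reference, the task-creation half of that idea might look roughly like the following PHP sketch. The import_tasks table, its columns, and the connection details are assumptions, not something from the thread.

```php
<?php
// Hypothetical enqueue step: save the uploaded Excel file, then register an import task.
// The import_tasks table, its columns, and the DSN are assumptions for illustration.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=app;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

function createImportTask(PDO $pdo, string $uploadedFilePath): int
{
    // Status starts as "pending"; a background worker picks the task up later.
    $stmt = $pdo->prepare(
        'INSERT INTO import_tasks (file_path, status, created_at) VALUES (?, ?, NOW())'
    );
    $stmt->execute([$uploadedFilePath, 'pending']);
    return (int) $pdo->lastInsertId();
}

// Usage: called from the upload controller right after the Excel file is saved to disk.
$taskId = createImportTask($pdo, '/data/uploads/customers.xlsx');
echo "Import task #{$taskId} queued\n";
```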

Reply content:

You have a good idea. Unfortunately, PHP is not well suited to this kind of work; we suggest using Java or another language to implement it.

PHP simply submits the task parameters to the database.

A Java program running in the background then polls the database every millisecond or so to check whether there are any pending tasks and processes them.
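The reply suggests a Java poller; the same loop can be sketched in PHP as well. The import_tasks table and the processTask() helper below are hypothetical placeholders.

```php
<?php
// Long-running worker (started from the CLI, e.g. `php worker.php`).
// The import_tasks table and processTask() are hypothetical; the reply above suggests Java,
// this is simply the same polling loop expressed in PHP.
set_time_limit(0);
$pdo = new PDO('mysql:host=127.0.0.1;dbname=app;charset=utf8mb4', 'user', 'pass');

function processTask(array $task): void
{
    // Placeholder: read the Excel file at $task['file_path'] and push its rows
    // to the import interface in batches of 1000.
}

while (true) {
    // Grab the oldest pending task, if any.
    $task = $pdo->query(
        "SELECT id, file_path FROM import_tasks WHERE status = 'pending' ORDER BY id LIMIT 1"
    )->fetch(PDO::FETCH_ASSOC);

    if ($task) {
        $pdo->prepare("UPDATE import_tasks SET status = 'running' WHERE id = ?")
            ->execute([$task['id']]);
        processTask($task);
        $pdo->prepare("UPDATE import_tasks SET status = 'done' WHERE id = ?")
            ->execute([$task['id']]);
    } else {
        usleep(200 * 1000); // nothing to do: sleep 200 ms instead of hammering the database
    }
}
```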

PHP's maximum execution time is 30 seconds by default; you can add set_time_limit(99999999); to lift it.
A few hundred thousand records should finish in less than a minute.
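Roughly what that single-request approach could look like, keeping the set_time_limit call from the reply. The interface endpoint, the JSON payload shape, and the already-parsed rows array are assumptions.

```php
<?php
// Single-request batching: lift the time limit, then feed the interface 1000 records per call.
// The endpoint URL, payload shape, and the already-parsed $rows array are assumptions.
set_time_limit(99999999);
ini_set('memory_limit', '512M');   // tens of thousands of rows also need memory headroom

$rows = []; // assume the Excel rows have already been parsed into this array

foreach (array_chunk($rows, 1000) as $i => $batch) {
    $ch = curl_init('https://internal.example.com/api/import'); // assumed interface endpoint
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode(['records' => $batch]),
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $response = curl_exec($ch);
    if ($response === false) {
        error_log("Batch {$i} failed: " . curl_error($ch)); // real code would retry here
    }
    curl_close($ch);
}
```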

Crontab plus a task list (a queue table) for background asynchronous tasks is well suited to this scenario.
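A rough sketch of the crontab-driven variant. Unlike the daemon sketch above, each run handles at most one task and exits; the lock file, table name, and paths are placeholders.

```php
<?php
// run_import_task.php -- triggered by cron rather than running as a daemon, e.g.:
//   * * * * * php /var/www/scripts/run_import_task.php >> /var/log/import.log 2>&1
// Each run claims at most one pending task; the lock file prevents overlapping runs.
// Table names and paths are placeholders.
set_time_limit(0);
$pdo = new PDO('mysql:host=127.0.0.1;dbname=app;charset=utf8mb4', 'user', 'pass');

$lock = fopen('/tmp/import_task.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // a previous cron run is still working
}

$task = $pdo->query(
    "SELECT id, file_path FROM import_tasks WHERE status = 'pending' ORDER BY id LIMIT 1"
)->fetch(PDO::FETCH_ASSOC);

if ($task) {
    $pdo->prepare("UPDATE import_tasks SET status = 'running' WHERE id = ?")->execute([$task['id']]);
    // ... read $task['file_path'] and push its rows to the interface in 1000-row batches ...
    $pdo->prepare("UPDATE import_tasks SET status = 'done' WHERE id = ?")->execute([$task['id']]);
}

flock($lock, LOCK_UN);
```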

  1. MySQL LOAD DATA INFILE: quickly bulk-load the data into a temporary table.

  2. A PHP daemon (or a crontab scheduled task) then pushes the data from that table to the interface in batches; see the sketch after this list.
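A hedged sketch of those two steps, assuming a CSV exported from the Excel file, a staging table named import_tmp, and local_infile enabled on both the MySQL server and client.

```php
<?php
// Step 1: bulk-load a CSV exported from the Excel file into a staging table (import_tmp).
// Requires local_infile to be enabled; table names and paths are assumptions.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=app;charset=utf8mb4', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
]);
$sql = <<<SQL
LOAD DATA LOCAL INFILE '/data/uploads/customers.csv'
INTO TABLE import_tmp
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\\n'
IGNORE 1 LINES
(name, phone, email)
SQL;
$pdo->exec($sql);

// Step 2: page through the staging table and forward each 1000-row page to the interface.
function sendBatchToInterface(array $batch): void
{
    // Hypothetical HTTP call to the 1000-record import interface (see the cURL sketch above).
}

$offset = 0;
do {
    $batch = $pdo->query(sprintf(
        'SELECT name, phone, email FROM import_tmp ORDER BY id LIMIT 1000 OFFSET %d',
        $offset
    ))->fetchAll(PDO::FETCH_ASSOC);

    if ($batch) {
        sendBatchToInterface($batch);
    }
    $offset += 1000;
} while (count($batch) === 1000);
```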

Use PHP to concatenate the data into SQL and import it directly into MySQL in the background with the source command.
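A minimal sketch of that concatenation approach: write multi-row INSERT statements to an .sql file that can later be imported in the background. The table, columns, and file paths are assumptions.

```php
<?php
// Build a single .sql file of multi-row INSERTs from the parsed rows, then import it in the
// background (e.g. `mysql app < /tmp/import.sql`, or `source /tmp/import.sql` inside mysql).
// Table name, columns, and the $rows layout are assumptions.
$rows = [
    ['Alice', '13800000000', 'alice@example.com'],
    ['Bob',   '13900000000', 'bob@example.com'],
    // ... tens of thousands more rows parsed from the Excel file
];

$fh = fopen('/tmp/import.sql', 'w');
foreach (array_chunk($rows, 1000) as $batch) {            // 1000 rows per INSERT statement
    $values = array_map(function (array $r) {
        // addslashes is a rough escape for a sketch; real code should use proper escaping
        // (mysqli_real_escape_string) or prepared statements.
        return sprintf("('%s','%s','%s')", addslashes($r[0]), addslashes($r[1]), addslashes($r[2]));
    }, $batch);
    fwrite($fh, 'INSERT INTO customers (name, phone, email) VALUES ' . implode(',', $values) . ";\n");
}
fclose($fh);
```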

It is best to import the data in segments rather than all at once.
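One way to do the segmenting at the reading stage is PhpSpreadsheet's chunked-read filter, assuming that library is available; the file path, chunk size, and row count below are placeholders.

```php
<?php
// Chunked reading with PhpSpreadsheet's read filter, so the whole workbook is never in memory
// at once. Assumes `composer require phpoffice/phpspreadsheet`; paths and sizes are placeholders.
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\IOFactory;
use PhpOffice\PhpSpreadsheet\Reader\IReadFilter;

class ChunkReadFilter implements IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows(int $startRow, int $chunkSize): void
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($columnAddress, $row, $worksheetName = ''): bool
    {
        // Read the header row plus whatever falls inside the current chunk.
        return $row === 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$inputFile = '/data/uploads/customers.xlsx'; // placeholder path
$chunkSize = 1000;                           // matches the interface's per-call limit
$totalRows = 50000;                          // placeholder; real code would detect this

$reader = IOFactory::createReader('Xlsx');
$filter = new ChunkReadFilter();
$reader->setReadFilter($filter);

for ($startRow = 2; $startRow <= $totalRows; $startRow += $chunkSize) {
    $filter->setRows($startRow, $chunkSize);
    $spreadsheet = $reader->load($inputFile);          // only the current chunk is parsed
    $rows = $spreadsheet->getActiveSheet()->toArray(); // may contain empty filtered rows
    // ... drop empty rows and the header, then send this segment to the interface ...
    $spreadsheet->disconnectWorksheets();
    unset($spreadsheet);
}
```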
