I need to import Excel data into a database table, but the data volume is large: up to tens of thousands of rows per import. The imported data is not written to the database directly; it goes through an interface that can process at most 1000 records per call, so an import takes a long time. My idea is to make this a task-based import: each import creates an import task, all tasks go into a queue, and the queue is processed in sequence. What are the best methods and suggestions?
Reply content:
Your idea is sound. Unfortunately, PHP is not well suited to this kind of long-running background work; we suggest using Java or another language to implement it:
PHP submits the task parameters to the database.
A Java process in the background polls the database at a fixed interval to check whether there are pending tasks.
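The PHP side of that split can be very small: just record the task and let the background worker pick it up. A minimal sketch, assuming a hypothetical `import_tasks` table (the schema in the comment is illustrative, not from the thread):

```php
<?php
// Assumed table, for illustration only:
//   CREATE TABLE import_tasks (
//     id INT AUTO_INCREMENT PRIMARY KEY,
//     file_path VARCHAR(255) NOT NULL,
//     status ENUM('pending','running','done','failed') DEFAULT 'pending',
//     created_at DATETIME NOT NULL
//   );

// Insert one import task row; the background worker (Java or PHP)
// polls this table for rows with status = 'pending'.
function create_import_task(PDO $db, string $filePath): int {
    $stmt = $db->prepare(
        'INSERT INTO import_tasks (file_path, status, created_at)
         VALUES (?, \'pending\', NOW())'
    );
    $stmt->execute([$filePath]);
    return (int) $db->lastInsertId();
}
```

The web request returns as soon as the row is inserted, so the user is never blocked on the slow interface calls.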
PHP's max execution time is 30 seconds by default; you can extend it with set_time_limit(99999999);
A few hundred thousand rows should finish importing in under a minute.
Crontab + a task table for background asynchronous jobs is a good fit for this scenario.
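One way to wire that up, sketched under the same assumed `import_tasks` table as above: cron runs a small PHP CLI script every minute, and the script claims one pending task inside a transaction so overlapping cron runs cannot grab the same task twice. The crontab line and all names are illustrative.

```php
<?php
// Hypothetical worker script (worker.php), scheduled via crontab:
//   * * * * * /usr/bin/php /path/to/worker.php >> /var/log/import.log 2>&1

// Atomically claim the oldest pending task, or return null if none exist.
// SELECT ... FOR UPDATE locks the row so a second concurrent worker
// skips it instead of processing it twice.
function claim_next_task(PDO $db): ?array {
    $db->beginTransaction();
    $row = $db->query(
        'SELECT * FROM import_tasks WHERE status = \'pending\'
         ORDER BY id LIMIT 1 FOR UPDATE'
    )->fetch(PDO::FETCH_ASSOC);
    if ($row === false) {
        $db->rollBack();
        return null;
    }
    $db->prepare('UPDATE import_tasks SET status = \'running\' WHERE id = ?')
       ->execute([$row['id']]);
    $db->commit();
    return $row;
}
```

After claiming a task, the worker parses the file, pushes the rows to the interface, and marks the task done or failed.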
MySQL LOAD DATA INFILE: quickly import the data into a temporary staging table.
A PHP daemon then synchronizes the data to the interface in batches (or a crontab scheduled task does the same).
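The daemon's main loop can be sketched as below. The two callables stand in for whatever fetches staged rows and whatever posts a batch to the interface; both are hypothetical placeholders, and the 1000-row chunk size is the per-call limit stated in the question.

```php
<?php
// Long-running daemon loop: fetch staged rows, push them to the
// interface in batches of at most 1000, then sleep and repeat.
function run_daemon(PDO $db, callable $fetchStagedRows, callable $postBatch): void {
    set_time_limit(0);  // a daemon must not hit PHP's 30-second limit
    while (true) {
        $rows = $fetchStagedRows($db);
        if (empty($rows)) {
            sleep(5);   // nothing to do; back off briefly
            continue;
        }
        // The interface accepts at most 1000 records per call.
        foreach (array_chunk($rows, 1000) as $batch) {
            $postBatch($batch);
        }
    }
}
```

Run it with `nohup php daemon.php &` or under a supervisor so it restarts if it dies.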
Or have PHP concatenate the data into an SQL file and import it directly in the mysql client with source.
It is best to import the data in segments rather than all at once.
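Segmenting is a one-liner in PHP: split the parsed rows into chunks no larger than the interface's 1000-record limit and send one chunk per call.

```php
<?php
// Stand-in for the rows parsed out of the Excel file.
$rows = range(1, 25000);

// Split into batches of at most 1000 rows each; here, 25 full batches.
$batches = array_chunk($rows, 1000);

echo count($batches), " batches\n";          // 25 batches
echo count($batches[0]), " rows in first\n"; // 1000 rows in first
```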