I am using PHP to export CSV files: I query the data in batches, then a foreach loop writes the file one row at a time, and after the writing finishes a download method is called to download the file. My current problem is that when there is too much data, the PHP request ends while the file is still being written, so the download method never runs and the file cannot be downloaded. Is there any way to solve this problem?
Reply content:
With this volume of data, a single PHP request will usually time out.
We recommend having the PHP script process only a small batch of rows (say, 10) per request, appending them to the CSV file and returning the result quickly, so each individual request stays short.
Then write a JavaScript program that calls the PHP script via AJAX in a loop until all the data has been processed, and then triggers the page's download action. This way the front-end page can also display the processing progress in real time; see the sketch below.
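A minimal sketch of that client-side loop, assuming a hypothetical export_batch.php endpoint that appends one batch per call and returns JSON such as { done: false, processed: 120, total: 5000 }, plus a download.php that streams the finished file (both names are illustrative, not from the question):

    // Call the batch endpoint repeatedly until the server reports completion.
    async function exportCsv() {
      const progress = document.getElementById('progress');
      let state = { done: false, processed: 0, total: 0 };
      while (!state.done) {
        // Hypothetical endpoint: appends the next batch to the CSV server-side.
        const res = await fetch(`export_batch.php?offset=${state.processed}`);
        state = await res.json();
        // Show real-time progress on the page.
        progress.textContent = `${state.processed} / ${state.total} rows written`;
      }
      // Every batch has been appended; now trigger the browser download.
      window.location.href = 'download.php';
    }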
Because CSV is essentially plain text, you can also have JavaScript issue multiple requests to PHP, fetch the data in batches as strings, and finally merge the batches in JavaScript to generate the file on the client side (the download link can be in data: URL form); see the sketch below.
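A sketch of this client-side merge, assuming a hypothetical export_chunk.php that returns one batch of CSV rows as plain text and an empty body once the data is exhausted. The reply suggests a data: URL; a Blob URL is used here instead because it behaves the same way as a download link while avoiding data-URL size limits:

    // Fetch CSV text in batches, then assemble the file in the browser.
    async function buildCsvClientSide() {
      const chunks = [];
      for (let page = 0; ; page++) {
        // Hypothetical endpoint: returns one batch of CSV rows as text.
        const res = await fetch(`export_chunk.php?page=${page}`);
        const text = await res.text();
        if (!text) break; // empty body: server has no more rows
        chunks.push(text);
      }
      // Merge the batches and offer them as a downloadable link.
      const blob = new Blob(chunks, { type: 'text/csv' });
      const link = document.createElement('a');
      link.href = URL.createObjectURL(blob);
      link.download = 'export.csv';
      link.click();
    }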
While the file has not been completely generated, keep the download button disabled; for example:
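An illustrative way to wire that up, reusing the exportCsv sketch above and assuming a button with id download-btn (a hypothetical name):

    // Start the export on click and keep the button disabled while
    // the file is still being generated on the server.
    const btn = document.getElementById('download-btn');
    btn.addEventListener('click', async () => {
      btn.disabled = true;
      await exportCsv(); // from the sketch above
      btn.disabled = false;
    });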