Handling Problems When Importing Big Data Using PHP + AJAX

Let me walk through the problems I ran into when importing large data sets with PHP + AJAX, in the order I hit them.

Question 1: According to my original plan, I would upload the file first and then read it. The problem is that when the file is large, the upload is slow, so all the customer sees is a long waiting state with no feedback, which is not user-friendly.

Solution: Here is what I did (if anyone has a better approach, please share). First I upload the file and save it to a dedicated folder, call it "import", then return the file name. This confirms the file was uploaded successfully, and once the name comes back I can use JS to show the customer a prompt. Then AJAX asks PHP to read the file and insert the data into the database. But that is where the next problem appears.
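A minimal sketch of that upload step (the naming scheme and JSON response shape are my assumptions, not from the original; DATA_PATH and DS are the constants used by the reading function shown later):

<?php
// upload.php - sketch of step one: save the file, return its name.
// DATA_PATH and DS are assumed to be defined elsewhere in the project.
if (!empty($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $name = uniqid('import_', true) . '.csv';        // hypothetical naming scheme
    $dest = DATA_PATH . DS . 'import' . DS . $name;  // the dedicated "import" folder
    if (move_uploaded_file($_FILES['file']['tmp_name'], $dest)) {
        // JS can show a prompt on success, then start the AJAX reads.
        echo json_encode(array('status' => 'ok', 'file' => $name));
        exit;
    }
}
echo json_encode(array('status' => 'error'));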

Question 2: When I use AJAX to ask PHP to read the file and insert the data into the database, the AJAX request always breaks off at the one-minute mark. I first assumed this was PHP's maximum execution time, max_execution_time, so I raised it to 300 seconds. Since that didn't help, I suspected the maximum request-parsing time, max_input_time. I added an ini_set() call in the code and used ini_get() to check max_input_time: the ini_set() had no effect and the value was still 60 seconds. I searched a lot online but still don't know why; if you do, please reply. Thanks in advance from a newbie. Left with no choice, I could only modify the php.ini configuration on the server. The manager said modifications weren't allowed, so I secretly changed it for testing and eventually got it changed. Even after that, the test still failed: execution still timed out at one minute. I am genuinely puzzled and don't know why. Please advise.
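For reference, here is roughly what that runtime attempt looks like. One plausible explanation (from the PHP manual, not the original post): max_input_time is PHP_INI_PERDIR, so it can only be set in php.ini or .htaccess and ini_set() silently fails for it, while max_execution_time is PHP_INI_ALL and can be changed at runtime.

// Sketch of the runtime attempt described above.
ini_set('max_execution_time', 300);  // PHP_INI_ALL: this change takes effect
ini_set('max_input_time', 300);      // PHP_INI_PERDIR: ini_set() cannot change it at runtime
var_dump(ini_get('max_input_time')); // still reports the old value, e.g. "60"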

That approach was a dead end; even a 5 MB file could only be read in batches. So the next step was to rework the code to read in batches, like this: the page issues an AJAX request, PHP reads 2,000 rows, processes those 2,000 rows, and inserts them into the database (a handy batch line-reading function is introduced below). Each AJAX call returns a status flag plus the number of rows read this time, and then the next read is requested, until the final read finishes.

Another problem turned up along the way, while checking each row for duplicates. It went like this: I looped over the content I had read, then checked whether each row already existed. When I tested whether $count was greater than 0, i.e. the row already existed, I used continue to jump to the next iteration. But when importing 10,000 rows, the server always reported an internal server error at around row 8,000. Frustrating. In the end I could only restructure it with if/else. Still puzzled.

A small tip: when inserting into the database, do not insert rows one by one. It is much faster to batch them into a single multi-row statement, e.g. INSERT INTO aaa (xx, xxx) VALUES ('a1', 'b1'), ('a2', 'b2'); a sketch follows.
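A hedged sketch of that batched-insert tip, assuming PDO with prepared statements (the original doesn't say which database layer was used; table aaa and columns xx, xxx are the placeholders from the text):

// Build one multi-row INSERT instead of one statement per row.
function insertBatch(PDO $pdo, array $rows) {
    if (empty($rows)) {
        return;
    }
    $placeholders = array();
    $values = array();
    foreach ($rows as $row) {
        $placeholders[] = '(?, ?)';
        $values[] = $row['xx'];
        $values[] = $row['xxx'];
    }
    $sql = 'INSERT INTO aaa (xx, xxx) VALUES ' . implode(', ', $placeholders);
    $pdo->prepare($sql)->execute($values);
}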

For the batch line-reading function itself, I recommend the SplFileObject class. If you have any questions, please kindly point them out.

The code is as follows:


function getFileLines($filename, $startLine, $endLine, $method = 'rb') {
    $content = array();
    $filename = DATA_PATH . DS . 'import' . DS . $filename;
    $count = $endLine - $startLine;
    $fp = new SplFileObject($filename, $method);
    $fp->seek($startLine); // jump to row N; seek() counts lines from 0
    for ($i = 0; $i <= $count; ++$i) {
        $content[] = $fp->current(); // current() returns the content of the current line
        $fp->next();                 // move to the next line
    }
    return array_filter($content);   // array_filter() drops false, null, and '' entries
}
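And a hypothetical sketch of the endpoint each AJAX call would hit, tying the pieces together (the parameter names and JSON shape are my assumptions, not from the original):

// read.php - each AJAX request reads the next batch of 2000 lines.
$start = isset($_POST['start']) ? (int) $_POST['start'] : 0; // offset returned by the previous call
$batch = 2000;
$lines = getFileLines($_POST['file'], $start, $start + $batch - 1);
// ... de-duplicate and insert $lines into the database here ...
echo json_encode(array(
    'status' => 'ok',
    'read'   => count($lines),          // rows read this pass
    'start'  => $start + $batch,        // offset for the next request
    'done'   => count($lines) < $batch, // a short batch means the file is finished
));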
