Curl fetches data from an API: a while loop increments the page number, receiving and processing 100 records per page. There are tens of thousands of records in total, and the server always times out. How can this be solved?
I have already tried max_execution_time / set_time_limit.
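For reference, a minimal sketch of the kind of loop being described; the endpoint URL, the "data" array in the response, and the end-of-data check are assumptions, not the real API:

<?php
// Minimal sketch of the paging loop described above. The endpoint and the
// response shape (a "data" array shorter than 100 on the last page) are
// assumptions.
set_time_limit(0);   // lift max_execution_time for this run

$page = 1;
do {
    $ch = curl_init("https://api.example.com/items?page={$page}&per_page=100");
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 30,   // per-request timeout for curl itself
    ]);
    $body = curl_exec($ch);
    curl_close($ch);

    $data = json_decode((string) $body, true);
    $rows = $data['data'] ?? [];
    foreach ($rows as $row) {
        // ... process one record ...
    }
    $page++;
} while (count($rows) === 100);   // a short page means the last page was reached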
Reply content:
If the server can't take it, just slow it down: sleep between requests ~
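For example, a tiny sketch of pausing between page requests (the pause length and $totalPages are arbitrary/assumed):

<?php
// Inside the paging loop: pause between requests so neither the API
// nor your own server gets hammered continuously.
foreach (range(1, $totalPages) as $page) {   // $totalPages assumed known
    // ... curl fetch and processing for page $page ...
    sleep(1);               // or usleep(200000) for a 0.2 s pause
}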
If you are using a spider to scrape page content, it is recommended to fetch through proxy IPs; otherwise, once the other side notices how frequently you crawl, they may simply ban your IP.
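A hedged sketch of sending the request through a proxy; the proxy host and port below are placeholders from a hypothetical proxy pool:

<?php
// Sketch: route the request through a proxy IP so the target site does
// not see every hit coming from the same address. Host/port are placeholders.
$ch = curl_init("https://api.example.com/items?page=1&per_page=100");
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_PROXY          => "203.0.113.10:8080",  // pick from your proxy pool
    CURLOPT_PROXYTYPE      => CURLPROXY_HTTP,
    CURLOPT_TIMEOUT        => 30,
]);
$body = curl_exec($ch);
curl_close($ch);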
If it's an internal interface, then as @ChanneW said, sleep between requests and just let the job take a bit longer.
There is indeed such a problem. The best solution is to build the collector around a database: pages collected successfully get status 1, pages not yet collected keep status 0, and a scheduled script then works through the pending pages round-robin.
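A minimal sketch of that approach, assuming a hypothetical pages table with page_no and status columns (0 = pending, 1 = collected), placeholder database credentials, and a cron job that runs this script every few minutes:

<?php
// Sketch: cron-driven collector. Each run grabs a small batch of pending
// pages, fetches them, and flips status 0 -> 1 on success. Table name,
// columns, credentials, and the API URL are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=collector', 'user', 'pass');

$pending = $pdo->query(
    "SELECT page_no FROM pages WHERE status = 0 ORDER BY page_no LIMIT 20"
)->fetchAll(PDO::FETCH_COLUMN);

foreach ($pending as $pageNo) {
    $ch = curl_init("https://api.example.com/items?page={$pageNo}&per_page=100");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $body = curl_exec($ch);
    $ok   = ($body !== false && curl_errno($ch) === 0);
    curl_close($ch);

    if ($ok) {
        // ... store the 100 records ...
        $stmt = $pdo->prepare("UPDATE pages SET status = 1 WHERE page_no = ?");
        $stmt->execute([$pageNo]);
    }
    sleep(1);   // be gentle between requests
}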
Analyze the cause of the failure. If there is heavy traffic during working hours, stagger the collection to a quieter period, for example at night or in the early morning. Hope this helps.
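For example, a crontab entry (the script and log paths are placeholders) that runs the collector only at 02:00:

# Run the collector at 02:00 every day, when traffic is low
0 2 * * * php /path/to/collect.php >> /var/log/collect.log 2>&1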
To prevent the timeout, you can also keep refreshing and jumping: handle one batch per request, then redirect and request the next batch.
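A sketch of that self-redirect approach, assuming the script is named collect.php and the API returns a "data" array: each web request handles one page, then redirects to itself with the next page number, so no single request gets near max_execution_time.

<?php
// Sketch: process one page per HTTP request, then redirect to this same
// script with the next page number. Script name, endpoint, and response
// shape are assumptions.
$page = isset($_GET['page']) ? (int) $_GET['page'] : 1;

$ch = curl_init("https://api.example.com/items?page={$page}&per_page=100");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = json_decode((string) curl_exec($ch), true);
curl_close($ch);

$rows = $data['data'] ?? [];
foreach ($rows as $row) {
    // ... process one record ...
}

if (count($rows) === 100) {
    // More pages remain: jump to this same script with the next page.
    header("Location: collect.php?page=" . ($page + 1));
    exit;
}
echo "All pages collected.";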