I'm using curl to fetch data from an API, paging through it in a while loop (page++) at 100 records per page. There are about a hundred thousand records in total, and the server keeps timing out. How can I fix this?
max_execution_time set_time_limit
Reply content:
The other server can't take that pace — slow yourself down and sleep() a bit between requests ~
If you're crawling page content like a spider, I'd recommend routing the program through proxy IPs; otherwise the other side will notice the frequent crawling from one IP and ban you, and then you're in trouble.
If it's your own internal API, then as @ChanneW said, sleep longer between requests and crawl slowly.
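The "sleep and crawl slowly" advice above can be sketched as a throttled pagination loop. This is a minimal Python illustration (the original question is presumably PHP, whose code is not shown); `fetch_page` and `process` are hypothetical callables standing in for the actual API call and processing step:

```python
import time

def crawl_all(fetch_page, process, delay=1.0):
    """Fetch pages one by one until an empty page comes back,
    sleeping between requests to throttle load on the remote server.

    fetch_page(page) -> list of records (hypothetical API call)
    process(records) -> handles one page of records
    Returns the total number of records processed.
    """
    page = 1
    total = 0
    while True:
        items = fetch_page(page)
        if not items:          # empty page: no more data
            break
        process(items)
        total += len(items)
        page += 1
        time.sleep(delay)      # the throttle: give the server a breather
    return total
```

The key design point is that the delay bounds your request rate regardless of how fast the API responds, so you look less like an attack and are less likely to hit rate limits or bans.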
This problem does come up. The best solution is to build the collector around a database: each record starts in state 0, and is flipped to state 1 once it has been collected successfully; then a scheduled (cron) script polls for state-0 records and collects them batch by batch.
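The state-flag approach above can be sketched like this. It is a hedged Python/SQLite illustration, not the poster's actual code; the `articles` table, its `state` column, and the `fetch` callable are all assumptions. Because each invocation only handles one small batch, no single run needs to outlive the script execution time limit:

```python
import sqlite3

def collect_batch(conn, fetch, batch_size=100):
    """Pick up to batch_size pending rows (state = 0), process each,
    and flip its state to 1 on success. A cron job calls this
    repeatedly until it returns 0, meaning nothing is left to do.
    """
    rows = conn.execute(
        "SELECT id FROM articles WHERE state = 0 LIMIT ?", (batch_size,)
    ).fetchall()
    for (article_id,) in rows:
        fetch(article_id)  # hypothetical: the actual collection work
        conn.execute(
            "UPDATE articles SET state = 1 WHERE id = ?", (article_id,)
        )
    conn.commit()          # persist progress so a crash loses at most one batch
    return len(rows)
```

Since progress lives in the database rather than in the running process, the job survives timeouts, crashes, and restarts: the next cron tick simply picks up where the last one stopped.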
Also look into why it fails: if traffic is heavier during working hours, that may be the cause, so stagger the job to off-peak periods, e.g. run the collection at night or in the early morning. Hope that helps.
To dodge the timeout, you can also keep redirecting/refreshing to a fresh request: process one chunk per request, then redirect to re-issue the request for the next chunk.
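The redirect-and-resume trick above can be sketched as follows. This is a hedged Python simulation of the idea (in PHP it would typically be a `header('Location: job.php?page=N')` redirect, though that exact script name is an assumption); each "request" processes a single page and then tells the client where to continue, so every individual request finishes well within the timeout:

```python
def handle_request(page, total_pages, process_page):
    """Simulate one short HTTP request in a redirect chain: process a
    single page, then return the redirect URL for the next page, or
    None when the job is finished.

    process_page(page) is a hypothetical callable doing the per-page work.
    """
    process_page(page)
    if page >= total_pages:
        return None                  # done: no further redirect
    return f"/job?page={page + 1}"   # redirect target for the next request
```

The client (or the browser following the refresh) walks the chain one page at a time, so the 100,000-record job is split into a thousand small requests instead of one long-running one.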