The $excel_contents that the loop outputs like this is what I want to collect.
It works fine when I collect 50 pages or fewer, but once the page count grows, the acquisition times out. Could the experts here tell me how to deal with this???
------Solution--------------------
Store each $url in a database.
Read each $value back from the database in turn and run the acquisition on it; whenever a new URL turns up, repeat the first step.
------Solution--------------------
Don't fixate on avoiding a database: the essence of the problem is splitting a single large task into multiple smaller tasks.
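The queue-based approach above can be sketched as follows. This is a minimal illustration in Python (the original question is about PHP, but the idea is language-independent), using SQLite as the database; `fetch_page` is a hypothetical placeholder for the real acquisition code, and all table and function names are assumptions, not part of the original post.

```python
import sqlite3

# Sketch of the two-step solution: URLs are deposited into a database
# table, then read back and processed one at a time, so each run handles
# only a small batch and a timeout loses at most one page.

def init_queue(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS url_queue ("
                 "url TEXT PRIMARY KEY, done INTEGER DEFAULT 0)")

def enqueue(conn, url):
    # Step 1: deposit the URL; duplicates already queued are ignored.
    conn.execute("INSERT OR IGNORE INTO url_queue (url) VALUES (?)", (url,))

def next_pending(conn):
    row = conn.execute(
        "SELECT url FROM url_queue WHERE done = 0 LIMIT 1").fetchone()
    return row[0] if row else None

def process_batch(conn, fetch_page, limit=50):
    # Step 2: read queued URLs one by one and acquire each page; any new
    # URLs the page yields go back into the queue (repeat step 1).
    processed = 0
    while processed < limit:
        url = next_pending(conn)
        if url is None:
            break
        content, new_urls = fetch_page(url)
        for u in new_urls:
            enqueue(conn, u)
        conn.execute("UPDATE url_queue SET done = 1 WHERE url = ?", (url,))
        processed += 1
    return processed
```

Because the queue survives between runs, `process_batch` can be called repeatedly (e.g. from a cron job) with a small `limit`, which is exactly the "split one task into many tasks" idea: no single run has to stay under the script timeout for the whole crawl.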