PHP's cURL functions can perform a variety of transfer operations, such as simulating a browser sending GET and POST requests. Because the PHP language itself does not support multithreading, a crawler written with a single cURL handle is not very efficient. This is where the curl_multi functions come in: they let one script fetch multiple URL addresses concurrently. Since curl_multi is that powerful, can it be used to write a concurrent multi-connection file downloader? Of course it can; here is my code.
Code 1: write the fetched page source directly to a file
<?php
// Pages to crawl concurrently.
$urls = array(
    'http://www.sina.com.cn/',
    'http://www.sohu.com/',
    'http://www.163.com/'
);

// The fetched page source is appended to this file.
$save_to = '/test.txt';
$st = fopen($save_to, 'a');

$mh = curl_multi_init();
foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)');
    curl_setopt($conn[$i], CURLOPT_HEADER, 0);
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 60);
    // Write the fetched content straight to the open file handle.
    curl_setopt($conn[$i], CURLOPT_FILE, $st);
    curl_multi_add_handle($mh, $conn[$i]);
} // initialization

// Run all the transfers until none are active.
do {
    curl_multi_exec($mh, $active);
} while ($active); // execute

// Clean up the handles.
foreach ($urls as $i => $url) {
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
}
curl_multi_close($mh);
fclose($st);
?>
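One note on the do/while loop above: it calls curl_multi_exec() in a tight spin, which burns CPU while the transfers are still in flight. A gentler variant blocks on curl_multi_select() until a socket actually has activity. The following is only a sketch of that execution phase, assuming $mh is the curl_multi handle from Code 1 with all of its easy handles already attached:

<?php
// Sketch: the execution phase only, waiting on socket activity
// instead of busy-looping. Assumes $mh is set up as in Code 1.
$active = null;
do {
    $status = curl_multi_exec($mh, $active);
} while ($status === CURLM_CALL_MULTI_PERFORM);

while ($active && $status === CURLM_OK) {
    // Block for up to one second until any transfer is ready.
    if (curl_multi_select($mh, 1.0) === -1) {
        usleep(100000); // select failed; back off briefly and retry
    }
    do {
        $status = curl_multi_exec($mh, $active);
    } while ($status === CURLM_CALL_MULTI_PERFORM);
}
?>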
Code 2: put the fetched page source into a variable first, then write it to the file
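A minimal sketch of this second approach, assuming the same URLs and output file as Code 1: the key difference is CURLOPT_RETURNTRANSFER, which makes each handle's content available through curl_multi_getcontent() instead of streaming it straight to a file handle, so everything can be collected into a variable and written out in one pass.

<?php
// Sketch: fetch each page into a variable first, then append
// everything to the output file at the end.
$urls = array(
    'http://www.sina.com.cn/',
    'http://www.sohu.com/',
    'http://www.163.com/'
);
$save_to = '/test.txt';

$mh = curl_multi_init();
$conn = array();
foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)');
    curl_setopt($conn[$i], CURLOPT_HEADER, 0);
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 60);
    // Return the content so curl_multi_getcontent() can read it,
    // rather than writing it to a stream as in Code 1.
    curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $conn[$i]);
}

// Run all the transfers until none are active.
$active = null;
do {
    curl_multi_exec($mh, $active);
} while ($active);

// Collect every response into one variable, then clean up.
$data = '';
foreach ($urls as $i => $url) {
    $data .= curl_multi_getcontent($conn[$i]);
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
}
curl_multi_close($mh);

// Append the collected page source to the file in a single write.
file_put_contents($save_to, $data, FILE_APPEND);
?>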
The above is the entire content of this article; I hope you enjoyed it.