PHP: fetching multiple webpages concurrently with cURL (curl_multi)

Source: Internet
Author: User


PHP's cURL functions handle a wide range of file-transfer tasks, such as simulating a browser sending GET and POST requests. Because PHP does not support multithreading, a crawler that fetches one URL at a time is inefficient; in that situation you can use the curl_multi functions to request several URLs concurrently. Since curl_multi is that capable, can it also be used to write a concurrent downloader? Of course. Two examples follow.
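For contrast, this is what a single blocking GET with plain cURL looks like (a minimal sketch; the URL is just a placeholder):

```php
<?php
// Minimal single-request GET with plain cURL: blocking, one URL at a time.
$ch = curl_init('http://www.example.com/');  // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body as a string
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)');
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 60);
$html = curl_exec($ch);  // blocks until this one transfer finishes
if ($html === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
}
curl_close($ch);
?>
```

Fetching N pages this way takes the sum of the individual transfer times, which is exactly why curl_multi pays off for a crawler.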

Code 1: write the fetched pages directly to a file

<?php
// URLs of the pages to crawl
$urls = array(
    'http://www.sina.com.cn/',
    'http://www.sohu.com/',
    'http://www.163.com/'
);
$save_to = '/test.txt';     // file that receives the fetched pages
$st = fopen($save_to, 'a'); // the original used an invalid empty mode; 'a' appends
$mh = curl_multi_init();

foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)');
    curl_setopt($conn[$i], CURLOPT_HEADER, 0);
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 60);
    curl_setopt($conn[$i], CURLOPT_FILE, $st); // write the fetched page directly to the file
    curl_multi_add_handle($mh, $conn[$i]);
}

// run the handles until every transfer finishes
do {
    curl_multi_exec($mh, $active);
} while ($active);

// clean up
foreach ($urls as $i => $url) {
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
}
curl_multi_close($mh);
fclose($st);
?>

Code 2: put the fetched pages into variables, then write them to a file

<?php
$urls = array(
    'http://www.sina.com.cn/',
    'http://www.sohu.com/',
    'http://www.163.com/'
);
$save_to = '/test.txt';     // file that receives the fetched pages
$st = fopen($save_to, 'a');
$mh = curl_multi_init();

foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)');
    curl_setopt($conn[$i], CURLOPT_HEADER, 0);
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 60);
    // return each page as a string instead of sending it to output
    curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $conn[$i]);
}

do {
    curl_multi_exec($mh, $active);
} while ($active);

// collect each page into a variable, then write it to the file
// (you could just as well save it to a database instead)
foreach ($urls as $i => $url) {
    $data = curl_multi_getcontent($conn[$i]);
    fwrite($st, $data);
}

foreach ($urls as $i => $url) {
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
}
curl_multi_close($mh);
fclose($st);
?>
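One caveat applies to both examples: the `do { curl_multi_exec(...); } while ($active);` loop busy-waits, spinning the CPU while transfers are in flight. A common refinement (an addition of mine, not from the original article) blocks on curl_multi_select() between calls:

```php
<?php
// Drop-in replacement for the busy-wait loop: block on socket activity
// instead of spinning. $mh is an already-populated curl_multi handle.
do {
    $status = curl_multi_exec($mh, $active);
} while ($status === CURLM_CALL_MULTI_PERFORM);

while ($active && $status === CURLM_OK) {
    // wait up to 1 second for activity on any of the transfers
    if (curl_multi_select($mh, 1.0) === -1) {
        usleep(100000); // select failed; back off briefly (can happen on some systems)
    }
    do {
        $status = curl_multi_exec($mh, $active);
    } while ($status === CURLM_CALL_MULTI_PERFORM);
}
?>
```

The behavior is the same as the simple loop, but the process sleeps until libcurl reports progress rather than polling in a tight loop.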

That is all for this article; I hope you find it useful.
