PHP Uses cURL to Fetch Web Pages with Multiple Threads - PHP Tutorial


PHP's cURL functions can perform all kinds of file transfer operations, such as simulating a browser to send GET and POST requests. Because PHP itself does not support multithreading, writing a crawler with single requests is inefficient, so the cURL multi functions are often used to access several URLs concurrently. Since the cURL multi functions are so powerful, can they also be used to download files concurrently? Of course; the code is provided below.
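Before the multi-handle examples, here is a minimal single-request sketch using the basic cURL functions; the URL http://www.example.com/ is only a placeholder, not one of the sites used later:

<?php
$ch = curl_init('http://www.example.com/');      // placeholder URL for illustration
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the response as a string
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 60);    // connection timeout in seconds
$html = curl_exec($ch);                          // perform the GET request
curl_close($ch);
?>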

Code 1: write the fetched page source directly to a file


<?php
$urls = array(
    'http://www.sina.com.cn/',
    'http://www.sohu.com/',
    'http://www.163.com/'
); // set the URLs of the pages to be fetched

$save_to = '/test.txt';     // the fetched page source is written to this file
$st = fopen($save_to, "a"); // open the file for appending
$mh = curl_multi_init();

foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)");
    curl_setopt($conn[$i], CURLOPT_HEADER, 0);
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 60);
    curl_setopt($conn[$i], CURLOPT_FILE, $st); // write the fetched page source directly to the file
    curl_multi_add_handle($mh, $conn[$i]);
} // initialization

do {
    curl_multi_exec($mh, $active);
} while ($active); // execute

foreach ($urls as $i => $url) {
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
} // clean up

curl_multi_close($mh);
fclose($st);
?>
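Note that the do/while loop above simply keeps calling curl_multi_exec() until no transfers remain active, which can keep one CPU core busy. A common refinement, not part of the original example, is to wait on curl_multi_select() between calls, roughly like this:

do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // block until one of the handles has activity, or a timeout elapses
    }
} while ($active && $status == CURLM_OK);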

Code 2: store the fetched page source in a variable first, then write it to a file


<?php
$urls = array(
    'http://www.sina.com.cn/',
    'http://www.sohu.com/',
    'http://www.163.com/'
);

$save_to = '/test.txt';     // the fetched page source is written to this file
$st = fopen($save_to, "a"); // open the file for appending
$mh = curl_multi_init();

foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)");
    curl_setopt($conn[$i], CURLOPT_HEADER, 0);
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 60);
    curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, true); // return the fetched page source as a string instead of sending it to the output
    curl_multi_add_handle($mh, $conn[$i]);
}

do {
    curl_multi_exec($mh, $active);
} while ($active);

foreach ($urls as $i => $url) {
    $data = curl_multi_getcontent($conn[$i]); // get the fetched page source as a string
    fwrite($st, $data); // write the string to the file; you could also store it elsewhere, such as in a database
} // collect the data and write it to the file

foreach ($urls as $i => $url) {
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
}

curl_multi_close($mh);
fclose($st);
?>
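As a small refinement of Code 2 (a sketch that is not part of the original example), the loop that calls curl_multi_getcontent() could check each transfer's HTTP status with curl_getinfo() before writing the result:

foreach ($urls as $i => $url) {
    $code = curl_getinfo($conn[$i], CURLINFO_HTTP_CODE); // HTTP status of this transfer
    $data = curl_multi_getcontent($conn[$i]);            // fetched page source as a string
    if ($code == 200 && $data !== '') {
        fwrite($st, $data);                              // only write successful responses
    } else {
        echo "Failed to fetch $url (HTTP $code)\n";
    }
}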

That is all the content of this article. I hope you find it useful.
