At bottom, curl is a command-line tool (backed by a library) for downloading remote files and transferring data, and beyond that it can simulate GET/POST form submissions. It can also upload files and crawl pages, and it supports FTP/FTPS, HTTP/HTTPS, and many other protocols — in general, if a server speaks one of these protocols, curl can fetch from it.
Today I tried using curl to crawl the pictures off a boring image site. On to the code.
```php
<?php
$url = 'http://wuliaoo.com/tuhua/';
$ch  = curl_init();

curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // have curl_exec() return the response as a string
curl_setopt($ch, CURLOPT_SSLVERSION, CURL_SSLVERSION_TLSv1); // set the curl SSL option to TLS

$data = curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);
// var_dump(htmlentities($data)); // dump the page source; without htmlentities() the page would render as HTML

preg_match_all('/<img[^>]*src=(\'|\")(.*?)\\1[^>]*>/i', $data, $array); // match all img tags
$path = './aaimage/'; // this directory was created in advance

foreach ($array[2] as $k => $v) {
    if (fopen($array[2][$k], 'r')) { // skip image URLs that cannot be opened
        ob_clean(); // empty the PHP output buffer; not strictly necessary, added to be safe
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $v);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
        $file = curl_exec($ch);
        curl_close($ch);

        $filename = pathinfo($v, PATHINFO_BASENAME); // file name plus extension

        $resource = fopen($path . $filename, 'a');
        fwrite($resource, $file);
        fclose($resource);
    }
}
```
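The regex works for simple markup, but regexes break easily on real-world HTML. As an alternative, the same src extraction could be done with PHP's built-in DOMDocument — a minimal sketch (the helper name and the sample HTML are mine, not from the original post):

```php
<?php
// Sketch: extract all <img> src attributes from an HTML string with DOMDocument
// instead of a regex. Returns the src values in document order.
function extract_img_srcs($html) {
    $doc = new DOMDocument();
    // Suppress warnings that real-world, non-well-formed HTML would trigger
    @$doc->loadHTML($html);

    $srcs = [];
    foreach ($doc->getElementsByTagName('img') as $img) {
        $srcs[] = $img->getAttribute('src');
    }
    return $srcs;
}
```

The DOM approach also handles attribute orderings and quoting styles that the regex might miss.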
In fact, this code can be encapsulated nicely. You can also swap the page fetch over to file_get_contents(), which likewise retrieves the contents of a web page.
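One thing worth noting if you go the file_get_contents() route: it accepts a stream context, so you can still set a timeout, roughly mirroring CURLOPT_CONNECTTIMEOUT above. A sketch (the 30-second value just mirrors the curl code; the actual fetch is commented out because it hits the network):

```php
<?php
// Sketch: a stream context gives file_get_contents() a request timeout,
// similar in spirit to CURLOPT_CONNECTTIMEOUT in the curl version.
$context = stream_context_create([
    'http' => [
        'method'  => 'GET',
        'timeout' => 30, // seconds
    ],
]);

// $data = file_get_contents('http://wuliaoo.com/tuhua/', false, $context);
```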
```php
<?php
function down_pic($url = null) {
    $data = file_get_contents($url);
    preg_match_all('/<img[^>]*src=(\'|\")(.*?)\\1[^>]*>/i', $data, $array);
    $path = './aaimage/';

    foreach ($array[2] as $k => $v) {
        if (fopen($array[2][$k], 'r')) {
            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $v);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
            $file = curl_exec($ch);
            curl_close($ch);

            $filename = pathinfo($v, PATHINFO_BASENAME);

            $resource = fopen($path . $filename, 'a');
            fwrite($resource, $file);
            fclose($resource);
        }
    }
}

// Fetch the images from the first 5 pages of the site
for ($i = 1; $i <= 5; $i++) {
    $url = 'http://wuliaoo.com/tuhua/page/' . $i;
    down_pic($url);
}
```
The number of pages is not fixed; it varies from site to site. On this one, the first page can be reached with or without the page parameter, and the second page just appends the page number to the URL (as in the loop above, http://wuliaoo.com/tuhua/page/2). So a for loop that varies that trailing parameter is all it takes. Of course, there are many places where this code could be optimized. Corrections welcome.
That's all.
PHP curl web-page image scraping