Share PHP Source: Batch Crawl Remote Web Page Images and Save Them Locally


If you work on mirroring sites, you have probably hit pages where copyright protection or encryption defeats even WebZip, leaving you wondering how to grab the page images and background images. Sometimes Firefox comes to mind: that browser is almost like a powerful exploit, since no matter how an article protects its copyright or blocks right-clicking, Firefox is unaffected.

But as a developer who loves PHP, I would rather do it myself. So I wrote the small PHP crawler below to fetch images remotely. It reads a CSS file and grabs the background images referenced in the CSS code; in fact, it was written specifically for capturing the images in CSS.

<?php
header("Content-Type: text/html; charset=utf-8");
error_reporting(E_ERROR | E_WARNING);
// Global configuration
$fromFile   = "aaa.css";             // the CSS file to crawl
$savePath   = "ttttttttt";           // save path
$fromDomain = "http://www.xxx.com/"; // the domain to crawl
// Read the CSS file and isolate all image URLs
$str    = file_get_contents($fromFile);
$strArr = explode("url(", $str);
$i      = 0;
foreach ($strArr as $val) {
    $val1 = explode(")", $val);
    if (strpos($val1[0], 'jpg') || strpos($val1[0], 'png') || strpos($val1[0], 'gif')) {
        $imgUrl[$i++] = $val1[0];
    }
}
// PS: this could be done with a regex instead (see the sketch after this block)
// Start crawling
foreach ($imgUrl as $url) {
    if ($url == '') continue;
    $filename = $savePath . $url;
    $url      = $fromDomain . $url;
    getImage($url, $filename);
}
function getImage($url, $filename) {
    ob_start();
    $context = stream_context_create(array(
        'http' => array('follow_location' => false) // don't follow redirects
    ));
    // make sure the fopen wrappers in php.ini have been activated
    readfile($url, false, $context);
    $img = ob_get_contents();
    ob_end_clean();
    $fp2 = @fopen($filename, "a");
    fwrite($fp2, $img);
    fclose($fp2);
    echo $filename . " OK√<br/>";
}
?>
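As the comment in the script notes, the explode()-based parsing could be replaced with a regular expression. A minimal sketch of that alternative (my addition, not the author's code; the pattern is one reasonable way to match url(...) references in CSS):

// Regex alternative for extracting image URLs from the CSS string $str.
// The pattern below is an assumption of typical url(...) syntax.
preg_match_all('/url\(\s*[\'"]?([^\'")]+\.(?:jpg|png|gif))[\'"]?\s*\)/i', $str, $matches);
$imgUrl = $matches[1]; // captured paths, e.g. "images/bg.png"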

Then, if nothing goes wrong, you will find that the folder you specified is full of images, haha.
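One caveat before running it: the script fetches images with readfile() over HTTP, which only works when remote fopen wrappers are enabled. A small runtime guard you could add (my own addition, not from the original):

// Hypothetical guard: readfile() on a URL requires allow_url_fopen = On in php.ini.
if (!ini_get('allow_url_fopen')) {
    die('allow_url_fopen is disabled in php.ini; the script cannot fetch remote URLs.');
}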

PS: PHP function to fetch a remote image and download it to local disk

Here is a function that uses PHP to fetch a remote image and save the download locally:

/**
 * Purpose: download a remote image and save it to local disk
 * Parameters: file URL, save directory, save filename, download method
 * When the save filename is empty, the remote file's name is used
 */
function getImage($url, $save_dir = '', $filename = '', $type = 0) {
    if (trim($url) == '') {
        return array('file_name' => '', 'save_path' => '', 'error' => 1);
    }
    if (trim($save_dir) == '') {
        $save_dir = './';
    }
    if (trim($filename) == '') { // build the save filename
        $ext = strrchr($url, '.');
        if ($ext != '.gif' && $ext != '.jpg') {
            return array('file_name' => '', 'save_path' => '', 'error' => 3);
        }
        $filename = time() . $ext;
    }
    if (0 !== strrpos($save_dir, '/')) {
        $save_dir .= '/';
    }
    // Create the save directory
    if (!file_exists($save_dir) && !mkdir($save_dir, 0777, true)) {
        return array('file_name' => '', 'save_path' => '', 'error' => 5);
    }
    // Fetch the remote file with the chosen method
    if ($type) {
        $ch = curl_init();
        $timeout = 5;
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
        $img = curl_exec($ch);
        curl_close($ch);
    } else {
        ob_start();
        readfile($url);
        $img = ob_get_contents();
        ob_end_clean();
    }
    // $size = strlen($img); // file size
    $fp2 = @fopen($save_dir . $filename, 'a');
    fwrite($fp2, $img);
    fclose($fp2);
    unset($img, $url);
    return array('file_name' => $filename, 'save_path' => $save_dir . $filename, 'error' => 0);
}
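A quick usage sketch (the URL and directory here are hypothetical placeholders, not from the original; per the function's return values, error codes 1, 3, and 5 mean empty URL, unsupported extension, and directory creation failure, while 0 means success):

// Hypothetical call: fetch a .jpg via cURL ($type = 1) into ./images/.
$result = getImage('http://www.xxx.com/images/logo.jpg', './images/', '', 1);
if ($result['error'] === 0) {
    echo 'Saved to ' . $result['save_path'];
} else {
    echo 'Download failed, error code ' . $result['error'];
}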

That is the PHP source code for batch crawling remote web page images and saving them locally. I hope you find it useful.
