PHP example: bulk-downloading images from a specified path

Source: Internet
Author: User
Tags: readfile
I know the path where the server saves the sliced images. How can I download them to my local machine in bulk? There are 850 folders, and the file names are numbered from 000 up to 014; a file should be downloaded if it exists and skipped otherwise.

The following code can only download images from a single, specified page (hoping someone more experienced can point me in the right direction, thanks!):

<?php$url = "http://site.com/"; $content =file_get_contents ($url); $reg = "//";p Reg_match_all ($reg, $content, $ matches); $path = './imgdownload '; if (!file_exists ($path)) {mkdir ($path, 0777);}    for ($i = 0; $i < count ($matches [1]), $i + +) {/*explode $url _arr[$i] = explode ('/', $matches [1][$i]);    $last = count ($url _arr[$i])-1;    *///STRRCHR $filename = STRRCHR ($matches [1][$i], '/');    Downimage ($matches [1][$i], $path. $filename); Downimage ($matches [1][$i], $path. ' /'. $url _arr[$i] [$last]);}    function Downimage ($url, $filename = "") {if ($url = = "") return false;        if ($filename = = "") {$ext =strrchr ($url, ".");        if ($ext! = ". gif" && $ext! = ". jpg" && $ext! = ". png" && $ext! = "JPEG") return false;    $filename =date ("Ymdhis"). $ext;    } ob_start ();    Make file, output from URL goes to buffer ReadFile ($url);  File_get_contents ($url); This method is not possible!!!    Only with ReadFile $img = Ob_get_contents ();    Ob_end_clean (); $fp = @fopen ($filename, "a");//append fwrite ($fp, $img);    Fclose ($FP); return $filename;}

You can use the curl_multi_* family of functions to request remote addresses in bulk: put all of the page URLs into an array and fire off, say, 20 concurrent requests at a time.
I would suggest restructuring the program into two passes: first, batch-fetch the pages and extract the image URLs; second, batch-download the images themselves.
Both passes can be handled concurrently with the curl_multi_* functions, as in the sketch below.
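A minimal sketch of that idea, fetching a batch of URLs concurrently and returning their bodies keyed by URL. The helper name fetch_batch and the batch size of 20 are assumptions, not code from the original answer:

function fetch_batch(array $urls, $concurrency = 20)
{
    $results = [];
    foreach (array_chunk($urls, $concurrency) as $chunk) {
        $mh = curl_multi_init();
        $handles = [];
        foreach ($chunk as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
            curl_multi_add_handle($mh, $ch);
            $handles[$url] = $ch;
        }
        // drive all handles in this batch until every transfer has finished
        do {
            $status = curl_multi_exec($mh, $running);
            if ($running) {
                curl_multi_select($mh);                      // wait for activity instead of busy-looping
            }
        } while ($running && $status == CURLM_OK);
        foreach ($handles as $url => $ch) {
            $results[$url] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }
    return $results;
}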

The code is pretty rough and has a few mistakes, so I have rewritten a copy for you.

You can call addPage multiple times in a loop (see the usage sketch after the code below).

If you already have the image URLs, you can also call download directly:

// the loop's upper bound was lost in the original post; $n stands in for it,
// and DOWNLOAD_PATH is presumably a constant defined elsewhere
for ($i = 0; $i < $n; ++$i)
    download('http://cdn.image.com/static/' . $i . '.png', DOWNLOAD_PATH . $i . '.png');

This is still somewhat inefficient; consider using the curl_multi_* family of functions mentioned above.

<?php
// NOTE: the opening of this script was cut off when the post was scraped;
// the wrapper below ("main" and its parameters) is a reconstruction of the missing part.
function main($pages, $path)
{
    $urls = [];
    foreach ($pages as $page) {
        addPage($page, $urls);
    }
    foreach ($urls as $url => $referer) {
        // echo $url, "\n";
        $filename = pathinfo($url, PATHINFO_FILENAME) . '.' .
                    pathinfo($url, PATHINFO_EXTENSION);
        download($url, $path . $filename, $referer);
    }
    // error_reporting(1);
}

function addPage($page, &$urls)
{
    $cur = extractPage($page);
    for ($i = 0, $n = count($cur); $i < $n; ++$i) {
        $j = $cur[$i];
        if (!isset($urls[$j])) $urls[$j] = $page;   // remember which page each image came from
    }
}

function extractPage($page, $reg = '//')   // the default image-matching pattern was stripped from the post
{
    $content = file_get_contents($page);
    preg_match_all($reg, $content, $matches);
    return $matches[1];
}

function download($url, $file, $referer = '')
{
    $url = abs_url($url, $referer);
    echo $url, "\n";
    $opts = ['http' => [
        'method' => 'GET',
        'header' => "Accept-Language: en\r\n" .
                    "Cookie: \r\n" .
                    "Referer: " . $url . "\r\n"
    ]];
    $context = stream_context_create($opts);
    file_put_contents($file, file_get_contents($url, false, $context));
    return $file;
}

function abs_url($url, $referer)
{
    $com = parse_url($url);
    if (!isset($com['scheme'])) $com['scheme'] = 'http';
    if ($referer !== '') {
        $ref = parse_url($referer);
        if (!isset($com['host'])) {
            if (isset($ref['host'])) {
                $com['host'] = $ref['host'];
            }
            if (!isset($ref['path'])) $ref['path'] = '/';
            if (isset($com['path'][0])) {
                // relative path: prefix it with the referer's path
                if ($com['path'][0] !== '/') $com['path'] = $ref['path'] . $com['path'];
            } else if (isset($ref['host'])) {
                $com['path'] = $ref['path'];
            }
        }
    } else {
        if (!isset($com['path'])) $com['path'] = '';
    }
    return unparse_url($com);
}

function unparse_url($com)
{
    return (isset($com['host']) ? ($com['scheme'] . '://' . $com['host']) : '')
         . $com['path']
         . (isset($com['query']) ? '?' . $com['query'] : '');
}
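A possible way to drive the rewritten code above. The page URLs and download path are placeholders, and main is the reconstructed wrapper named in the code; addPage and download are as defined there:

// build the URL list from several pages (addPage is called in a loop inside main), then download
main(['http://site.com/page1.html', 'http://site.com/page2.html'], './imgdownload/');

// or, when the image URLs are already known, call download directly
download('http://cdn.image.com/static/1.png', './imgdownload/1.png');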