PHP batch download of images: found this code online, but it doesn't work well

function project_statistics() {
    echo "11111";
    $url = "http://www.xxx.com:81/upload/image";
    $content = file_get_contents($url);
    // NOTE: the pattern was lost from the original post; a regex like the one
    // below, which captures <img src="..."> URLs, matches how $matches[1] is used later
    $reg = '/<img[^>]+src=["\']?([^"\'\s>]+)["\']?/i';
    preg_match_all($reg, $content, $matches);

    $path = '../download/img';
    if (!file_exists($path)) {
        mkdir($path, 0777);
    }

    function downImage($url, $filename = "") {
        if ($url == "") return false;
        if ($filename == "") {
            $ext = strrchr($url, ".");
            if ($ext != ".gif" && $ext != ".jpg" && $ext != ".png" && $ext != ".jpeg") return false;
            $filename = date("YmdHis") . $ext;
        }
        ob_start();      // send the output of readfile() into the buffer
        readfile($url);  // file_get_contents($url) did not work for me here; only readfile did
        $img = ob_get_contents();
        ob_end_clean();
        $fp = @fopen($filename, "a");  // append
        fwrite($fp, $img);
        fclose($fp);
        return $filename;
    }

    for ($i = 0; $i < count($matches[1]); $i++) {
        /* explode
        $url_arr[$i] = explode('/', $matches[1][$i]);
        $last = count($url_arr[$i]) - 1;
        */
        // strrchr
        $filename = strrchr($matches[1][$i], '/');
        downImage($matches[1][$i], $path . $filename);
        // downImage($matches[1][$i], $path . '/' . $url_arr[$i][$last]);
    }
}

Thanks for the advice. What I want is to download all the images that have been uploaded to the server to the local machine. The code above still needs to be modified for that.


Reply to discussion (solution)

The code you posted collects and downloads all of the image tags on the target web page.

If you want to download the images that have been uploaded to the server, you can use the downImage function from the code above and pass it the URL of each server-side image.

Can you give an example? Haha

Yes, the image URLs.
Loop over all the URLs, then use file_get_contents on each one; that will do it.

For example:
$ Url = "http://www.abc.com/1.jpg ';
$ C = file_get_contents ($ url );
File_put_contents (basename ($ url), $ c, true );
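
As a sketch of that loop, assuming a hypothetical list of image URLs and a local target directory (neither taken from this thread):

// hypothetical list of image URLs, e.g. collected from $matches[1] above
$imageUrls = array(
    "http://www.abc.com/1.jpg",
    "http://www.abc.com/2.jpg",
);

$saveDir = '../download/img';          // local directory to save into
if (!file_exists($saveDir)) {
    mkdir($saveDir, 0777, true);       // create it (recursively) if missing
}

foreach ($imageUrls as $url) {
    $data = file_get_contents($url);   // fetch the remote image
    if ($data !== false) {
        // save it under its original file name
        file_put_contents($saveDir . '/' . basename($url), $data);
    }
}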




I wrote it that way and refreshed the browser, but the page is blank. Do I need to call some other function to actually download the file? Thank you.
Another question: if there are multiple images, will it put all of their addresses into a download list, the way Thunder (Xunlei) does?

I wrote:
$ Url = "http://www.hrb.com: 81/upload/image/1.jpg ";
$ C = file_get_contents ($ url );
File_put_contents (basename ($ url), $ c, true );

Die ();

Attach the real image address:

But on the local machine, why can't I find 1.jpg anywhere?

$url = "https://ss0.bdstatic.com/5a21bjqh_Q23odCf/static/superplus/img/logo_white_ee663702.png";$c = file_get_contents($url);file_put_contents(basename($url), $c, true); echo '';



I'm a little confused. There is no upload/image directory at the same level as my local script.
Where do I find the file? Will it automatically create a folder on the client's C: or D: drive and put 1.jpg in it?






It is saved on the machine where your PHP script runs, in the same directory as the PHP file itself.
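
If it helps to confirm where the file actually lands, a quick diagnostic along these lines (not from the original thread) can print the absolute path:

$url  = "http://www.hrb.com:81/upload/image/1.jpg";  // URL from the earlier post
$c    = file_get_contents($url);
$file = basename($url);                              // relative name: "1.jpg"
file_put_contents($file, $c);

// With a relative file name, the file is written to the current working
// directory, which is normally the directory of the executing PHP script.
echo realpath($file);                                // prints the absolute path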



Seeing where the file ends up, this feels a bit like just moving a file around. Can I download it to a different directory on my own computer, if these images are on a remote web server?

For example, to download it to the desktop, do I just change a parameter?




PHP runs on whichever machine is the web server, so it can only write to locations that that server machine can reach.

If the server (Apache) is your own local machine, you can write to your desktop, provided the Apache process has permission to write to the desktop folder.
Then just put the full path into the file_put_contents call:
file_put_contents('/desktop/' . basename($url), $c, true);
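
A minimal sketch of saving into a specific directory, assuming a destination path that the PHP/Apache process can write to (the path below is only an illustration, not from this thread):

$url     = "http://www.hrb.com:81/upload/image/1.jpg"; // URL from the earlier post
$destDir = 'C:/Users/Example/Desktop';                 // hypothetical desktop path on the server machine

$c = file_get_contents($url);
if ($c !== false && is_writable($destDir)) {
    // write the image into the chosen directory under its original name
    file_put_contents($destDir . '/' . basename($url), $c);
}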




So in other words, the methods above are not really suitable for letting a user download images or files in batches, right?

Instead, the whole folder should be compressed and the archive's address given to users to download. Is that right?

If it is for the client side to download, then packaging everything as a zip is better.
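
A rough sketch of that approach using PHP's built-in ZipArchive class (the directory and archive names below are assumptions, not from this thread):

$srcDir  = '../download/img';        // folder holding the downloaded images (assumed)
$zipFile = '../download/images.zip'; // archive to offer for download (assumed)

$zip = new ZipArchive();
if ($zip->open($zipFile, ZipArchive::CREATE | ZipArchive::OVERWRITE) === true) {
    // add every regular file in the source directory to the archive
    foreach (glob($srcDir . '/*') as $file) {
        if (is_file($file)) {
            $zip->addFile($file, basename($file));
        }
    }
    $zip->close();
    // then give the user a link to $zipFile so they can download everything at once
}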


Thanks
