PHP download/capture remote images to local

Source: Internet
Author: User
Tags: php, download, save file

/**
 * Download a remote image to the local filesystem.
 *
 * @param string $url      Remote file address
 * @param string $fileName File name to save under (when empty, a random name
 *                         is generated; otherwise the original file name is kept)
 * @param string $dirName  Directory to save the file in (a year-month
 *                         sub-directory is appended automatically)
 * @param array  $fileType Allowed file extensions
 * @param int    $type     How the remote file is fetched (1 = cURL, 0 = readfile)
 * @return array|false     The file name and save directory, or false on failure
 * @author blog.snsgou.com
 */
function download_image($url, $fileName = '', $dirName = '', $fileType = array('jpg', 'gif', 'png'), $type = 1) {
    if ($url == '' || $dirName == '') {
        return false;
    }

    // Original file name of the remote file
    $defaultFileName = basename($url);

    // Check the file extension against the allowed types
    $suffix = strtolower(substr(strrchr($url, '.'), 1));
    if (!in_array($suffix, $fileType)) {
        return false;
    }

    // File name to save under: random when $fileName is empty, original otherwise
    $fileName = $fileName == '' ? time() . rand(0, 9) . '.' . $suffix : $defaultFileName;

    // Fetch the remote file, either via cURL or via readfile()
    if ($type) {
        $ch = curl_init();
        $timeout = 30;
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
        $file = curl_exec($ch);
        curl_close($ch);
    } else {
        ob_start();
        readfile($url);
        $file = ob_get_contents();
        ob_end_clean();
    }

    // Build the save path; a year-month sub-directory is appended automatically
    //$dirName = $dirName . '/' . date('Y', time()) . '/' . date('m', time()) . '/' . date('d', time());
    $dirName = $dirName . '/' . date('Ym', time());
    if (!file_exists($dirName)) {
        mkdir($dirName, 0777, true);
    }

    // Write the file to disk
    $res = fopen($dirName . '/' . $fileName, 'w');
    fwrite($res, $file);
    fclose($res);

    return array(
        'fileName' => $fileName,
        'saveDir'  => $dirName
    );
}
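A quick usage sketch (the URL and target directory below are illustrative placeholders, not from the original post): passing an empty file name makes the function generate a random name from the current timestamp.

// Usage sketch: the URL and directory are placeholders.
$res = download_image('http://example.com/photos/demo.jpg', '', '/var/www/uploads');

if ($res !== false) {
    // With an empty $fileName, the name is time() plus one random digit,
    // e.g. fileName => '17123456789.jpg', saveDir => '/var/www/uploads/202404'
    echo 'Saved ' . $res['fileName'] . ' in ' . $res['saveDir'];
}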

Practical experience:

Some of the images on this blog were hotlinked directly from another site. A few days ago, presumably because that site enabled hotlink protection, the images stopped displaying. There was no way around it but to download them in bulk with PHP and batch-replace the image addresses in the blog posts:

/**
 * Batch-download the images in the blog posts to the local server.
 */
public function index() {
    global $_G;

    $blogModel = model('blog', 'blog');
    $list = $blogModel->order('gid desc')->limit()->findPage();

    $page = GET_GPC('page') ? GET_GPC('page') : 1;
    $totalPages = $list['totalPages'];
    $page = $page + 1;
    if ($page > $totalPages) {
        die('Update complete!');
    }

    foreach ($list['data'] as $val) {
        $content = $val['content'];
        $excerpt = $val['excerpt'];
        $_G['isContentUpdate'] = $_G['isExcerptUpdate'] = false;

        /* Content */
        $content = preg_replace_callback("/src=\"(http:\/\/images\.cnblogs\.com\/cnblogs_com[^\"]+)\"/", function ($matches) {
            global $_G;
            $_G['isContentUpdate'] = true;
            // Download the remote image to the local server
            $res = download_image($matches[1], 'old', 'd:/php/xampp/htdocs/emlog/data/upload');
            // Return the URL of the downloaded image
            return 'src="/data/upload/' . date('Ym', time()) . '/' . $res['fileName'] . '"';
        }, $content);

        /* Excerpt */
        $excerpt = preg_replace_callback("/src=\"(http:\/\/images\.cnblogs\.com\/cnblogs_com[^\"]+)\"/", function ($matches) {
            global $_G;
            $_G['isExcerptUpdate'] = true;
            // Download the remote image to the local server
            $res = download_image($matches[1], 'old', 'd:/php/xampp/htdocs/emlog/data/upload');
            // Return the URL of the downloaded image
            return 'src="/data/upload/' . date('Ym', time()) . '/' . $res['fileName'] . '"';
        }, $excerpt);

        /* Update the database */
        $where = array('gid' => $val['gid']);
        $data = array();
        if ($_G['isContentUpdate']) {
            $data['content'] = $content;
        }
        if ($_G['isExcerptUpdate']) {
            $data['excerpt'] = $excerpt;
        }
        if ($data) {
            $blogModel->where($where)->save($data);
        }
    }

    /* Move on to the next page */
    $url = url('blog/main/index', array('page' => $page));
    $msg = 'Updating ' . $page . '/' . $totalPages;
    redirect($url, 2, $msg);
}
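Stripped of the framework calls (model(), GET_GPC(), url(), and redirect() belong to the blog system above), the core rewrite step is just preg_replace_callback plus download_image. A minimal framework-free sketch, reusing the same regex and illustrative XAMPP upload path, with one extra guard for failed downloads:

// Framework-free sketch of the rewrite step; $html stands for any post body.
$html = '<p><img src="http://images.cnblogs.com/cnblogs_com/demo/pic.png"></p>';

$html = preg_replace_callback(
    '/src="(http:\/\/images\.cnblogs\.com\/cnblogs_com[^"]+)"/',
    function ($matches) {
        // Keep the original src if the download fails (e.g. disallowed extension)
        $res = download_image($matches[1], 'old', 'd:/php/xampp/htdocs/emlog/data/upload');
        if ($res === false) {
            return $matches[0];
        }
        return 'src="/data/upload/' . date('Ym') . '/' . $res['fileName'] . '"';
    },
    $html
);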
