The {code...} code is as above.
The source image can be opened directly, but the copy downloaded to the local device by the code above is corrupted.
I tried adding the file type to the header, and the PHP file encoding is already UTF-8, but neither helped.
Adding ob_flush() alongside ob_clean() didn't help either.
Switching to the fopen function also produces a corrupt file.
Thank you!
Supplement: this one is also corrupt: http://segmentfault.com/q/1010000000156959.
Reply content:
The reason is very simple: the image is served gzip-compressed.
Use file_get_contents("compress.zlib://" . $url);
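To see why the wrapper fixes the problem, here is a minimal, self-contained sketch: the compress.zlib:// stream wrapper transparently inflates gzip data on read. A local temp file stands in for the gzip-encoded response body, and the example.com URL in the trailing comment is a placeholder, not from the thread.

```php
<?php
// Demonstration (local, no network): compress.zlib:// inflates gzip data on read.
$raw = 'raw image bytes';
$tmp = tempnam(sys_get_temp_dir(), 'gz');
file_put_contents($tmp, gzencode($raw));   // stand-in for a gzip-encoded response body

// Reading through the wrapper yields the original, uncompressed bytes.
$data = file_get_contents('compress.zlib://' . $tmp);
var_dump($data === $raw);                  // bool(true)

// The same idea applies to a remote image (URL is a placeholder):
//   $img = file_get_contents('compress.zlib://http://example.com/1b776066fa782b78.jpg');
unlink($tmp);
```

If the server only compresses conditionally, a plain file_get_contents() of the same URL may return the still-compressed bytes, which is exactly the "opens remotely, corrupt locally" symptom described in the question.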
$ch = curl_init('http://example.com/1b776066fa782b78.jpg');
$fp = fopen('/my/folder/1b776066fa782b78.jpg', 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
header("Content-Type: image/your_image_type");
Most likely the remote server is returning data that file_get_contents() does not handle correctly.
You can test with cURL instead.
Write a user-defined function:
function curl_get_contents($url)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    $output = curl_exec($ch);
    curl_close($ch);
    return $output;
}
Use curl_get_contents instead of file_get_contents.
Open the downloaded image in a hex editor and inspect the file header.
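A concrete way to act on this advice, as a sketch (the helper name sniff_file_type is mine, not from the thread): compare the first bytes of the downloaded file against well-known magic numbers. A file starting with 1F 8B is gzip, meaning the compressed response body was saved verbatim.

```php
<?php
// Sketch: identify a downloaded file's real type from its magic bytes.
// 1F 8B       -> gzip (compressed response body saved as-is)
// FF D8 FF    -> JPEG
// 89 50 4E 47 -> PNG
function sniff_file_type($path)
{
    $head = file_get_contents($path, false, null, 0, 4);
    if (strncmp($head, "\x1f\x8b", 2) === 0)     return 'gzip';
    if (strncmp($head, "\xff\xd8\xff", 3) === 0) return 'jpeg';
    if (strncmp($head, "\x89PNG", 4) === 0)      return 'png';
    return 'unknown';
}
```

If this reports gzip for the "corrupt" download, the file is not damaged at all, merely still compressed.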
Casperjs: the approach in the answer above is too complicated for this question, and if the other side ever starts checking request headers, it will break again.
Thanks for the answers. I have tested every snippet and the downloaded file is still corrupt locally.
It is probably some server-side issue or a problem with my local virtual machine.
By default that website leaves all external links open, so hotlink protection should not be the problem. I will try another server.
In short, thanks for all three answers; I have accepted the first one. Thank you!
// Download and save the image
function save_image($inPath, $outPath)
{
    // Download the image from the remote server
    $imgUrl = $inPath;

    // Verify that the URL starts with "http"
    if (strpos($imgUrl, "http") !== 0) {
        return false;
    }

    // Get the response headers and check for a dead link
    $heads = get_headers($imgUrl);
    if (!(stristr($heads[0], "200") && stristr($heads[0], "OK"))) {
        return false;
    }

    $in  = fopen($inPath, "rb");
    $out = fopen($outPath, "wb");
    $re  = false;
    while ($chunk = fread($in, 8192)) {
        $re = fwrite($out, $chunk, 8192);
    }
    fclose($in);
    fclose($out);

    if ($re) {
        var_dump('success');
        return true;
    } else {
        var_dump('false');
        return false;
    }
}
// Create a folder recursively
function mkdirs($dir, $mode = 0777)
{
    if (is_dir($dir) || @mkdir($dir, $mode)) {
        return true;
    }
    if (!mkdirs(dirname($dir), $mode)) {
        return false;
    }
    return @mkdir($dir, $mode);
}
// Store the image
function download_img($url, $dir_prefix)
{
    $path    = explode('/', $url);
    $domain  = 'http://' . $path[2];
    $newpath = $dir_prefix . explode($domain, $url)[1];

    // Create the target directory
    $d   = explode('/', $newpath);
    $dir = explode($d[count($d) - 1], $newpath)[0];
    mkdirs($dir);

    $filename = $newpath;
    if (file_exists($filename)) {
        echo "The file $filename exists";
    } else {
        $img = save_image($url, $newpath);
    }
    return $domain;
}
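For illustration only (the URL and directory below are made-up placeholders), this is how download_img() derives the local save path from the URL. No network access is involved; the array dereference on the explode() result requires PHP 5.4+.

```php
<?php
// Sketch of download_img()'s path computation, using placeholder values.
$url        = 'http://example.com/images/a.jpg';
$dir_prefix = './cache';

$path    = explode('/', $url);                       // ['http:', '', 'example.com', 'images', 'a.jpg']
$domain  = 'http://' . $path[2];                     // 'http://example.com'
$newpath = $dir_prefix . explode($domain, $url)[1];  // './cache/images/a.jpg'
```

So the directory structure of the remote URL is mirrored under $dir_prefix, which is why the helper mkdirs() is needed before writing.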