I used PHP to develop a small module that checks whether a website is running normally. My idea is to construct an HTTP request and check the status code it returns: if it is 200, the site is reachable.
I use the curl functions to obtain the status code.
However, I found a problem. When I run the check only once, it correctly returns 200, and when the website is down it returns 404. But when I check multiple websites in a loop, the working sites return 0 instead of 200.
Has anyone used curl this way? Can you tell me what is wrong?
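A common cause of a 0 status code in a loop is that the request itself never completed (DNS failure, timeout, or a stale handle), since curl reports 0 when there is no HTTP response at all. A minimal sketch of the idea described above, using a fresh handle per URL and reading the code with curl_getinfo(); the URL list and timeout values here are only assumptions for illustration:

```php
<?php
// Returns the HTTP status code for a URL, or 0 if the request never completed.
function http_status($url) {
    $ch = curl_init($url);                        // fresh handle per request
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);  // don't echo the body
    curl_setopt($ch, CURLOPT_NOBODY, 1);          // headers are enough for a check
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);  // follow redirects
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // assumed timeout values
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE); // 0 means "no response received"
    curl_close($ch);
    return $code;
}

foreach (array('http://www.baidu.com', 'http://www.taobao.com') as $url) {
    echo $url, ' => ', http_status($url), "\n";
}
```

Creating and closing the handle inside the helper avoids any state left over from the previous request affecting the next one.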
Reply to discussion (solution)
We recommend that you paste your code for analysis.
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://bbs.csdn.net/topics/390781797");
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 1);
$html = curl_exec($ch);
curl_close($ch);
list($header, $body) = explode("\r\n\r\n", $html, 2);
var_dump(http_parse_headers($header));
http_parse_headers() needs to be installed separately (it is part of the PECL http extension). Refer to:
http://stackoverflow.com/questions/6368574/how-to-get-the-functionality-of-http-parse-headers-without-pecl
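If installing the PECL extension is not an option, the Stack Overflow thread above describes userland replacements. A simplified sketch of one (note this is an illustration, not the PECL implementation: it ignores folded headers and duplicate header names):

```php
<?php
// Simplified userland stand-in for http_parse_headers() (PECL http extension).
// Splits "Name: value" lines; does not handle folded or repeated headers.
function http_parse_headers_fallback($raw) {
    $headers = array();
    foreach (explode("\r\n", $raw) as $i => $line) {
        if ($i === 0) {
            $headers['Status-Line'] = $line;            // e.g. "HTTP/1.1 200 OK"
        } elseif (strpos($line, ':') !== false) {
            list($name, $value) = explode(':', $line, 2);
            $headers[trim($name)] = trim($value);
        }
    }
    return $headers;
}

$raw = "HTTP/1.1 200 OK\r\nContent-Type: text/html; charset=UTF-8";
print_r(http_parse_headers_fallback($raw));
```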
<?php
$ch = curl_init();
$list1 = array('www.baidu.com', 'www.taobao.com', 'www.alipay.com', 'localhost/kk/', 'www.yy.net');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 1);
for ($i = 0; $i < count($list1); $i++) {
    curl_setopt($ch, CURLOPT_URL, $list1[$i]);
    $html = curl_exec($ch);
    echo $list1[$i], ': ', curl_getinfo($ch, CURLINFO_HTTP_CODE), "\n";
}
curl_close($ch);
How about this? I want to perform batch detection, but it seems very slow; it takes several seconds before I get any feedback.
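Checking the sites one after another means the total time is the sum of every response time. A sketch of checking them in parallel with PHP's curl_multi interface instead (the URL list and timeout are assumed values):

```php
<?php
// Sketch: probe several sites in parallel with curl_multi instead of serially.
$urls = array('http://www.baidu.com', 'http://www.taobao.com', 'http://www.alipay.com');
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_NOBODY, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // assumed timeout
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until every one has finished.
do {
    curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);                  // wait for activity, don't busy-loop
    }
} while ($running > 0);

foreach ($handles as $url => $ch) {
    echo $url, ' => ', curl_getinfo($ch, CURLINFO_HTTP_CODE), "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

With this approach the total time is roughly that of the slowest site rather than the sum of all of them.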
We recommend that you paste your code for analysis.
Thank you for reminding me.
You can also use the get_headers function, as shown in the DEMO below.
<?php
header('Content-Type: text/html; charset=UTF-8');
$result = @get_headers('http://www.qqhaowan.com');
if ($result) {
    print_r($result);
    if (strpos($result[0], '200') !== false) {
        echo 'Website accessible!';
    } else {
        echo 'Website cannot be accessed!';
    }
} else {
    echo 'The target URL cannot be opened!';
}
Hey, I'll try it.
It seems that the for loop won't work for me, though.
Tested; it can detect sites in a loop.
<?php
$sites = array('http://www.baidu.com', 'http://www.qqhaowan.com');
foreach ($sites as $site) {
    echo $site, ': ', checksite($site) ? 'accessible' : 'not accessible', '<br>';
}

function checksite($url) {
    $result = @get_headers($url);
    if ($result) {
        if (strstr($result[0], '200') != '') {
            return true;
        }
    }
    return false;
}
?>
Yes, that works. Could the earlier failures be because the timeout was too short for a response to arrive?
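If short timeouts are the issue, they can be lengthened explicitly. A sketch of the two knobs involved (the timeout values below are assumptions, not recommendations): get_headers() honors the default_socket_timeout ini setting, while curl takes per-handle options:

```php
<?php
// get_headers() uses PHP's stream timeout; raise it so slow sites
// are not misreported as down (assumed value of 15 seconds).
ini_set('default_socket_timeout', 15);
$result = @get_headers('http://www.qqhaowan.com');

// With curl, timeouts are set per handle instead:
$ch = curl_init('http://www.qqhaowan.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_NOBODY, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // time allowed to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 15);        // total time allowed for the request
curl_exec($ch);
echo curl_getinfo($ch, CURLINFO_HTTP_CODE), "\n"; // 0 still means "no response"
curl_close($ch);
```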