A summary of six ways to fetch remote URLs in PHP

Source: Internet
Author: User
Tags: curl, fread, http post, save file, strlen, net domain
Example code 1: getting content with file_get_contents
The code is as follows:

<?php
$url = 'http://www.baidu.com/';
$html = file_get_contents($url);
// PHP fills $http_response_header with the response headers automatically
print_r($http_response_header);
ec($html);   // ec(), printhr() and printarr() are output helpers defined at the end of this article
printhr();
printarr($http_response_header);
printhr();
?>


Example code 2: Open the URL with fopen and get the content
The code is as follows:

<?php
$fp = fopen($url, 'r');   // $url as defined in example 1
printarr(stream_get_meta_data($fp));
printhr();
$result = '';
while (!feof($fp)) {
    $result .= fgets($fp, 1024);
}
echo "URL body: $result";
printhr();
fclose($fp);
?>

Example code 3: Using the file_get_contents function to fetch a URL by POST
The code is as follows:

<?php
$data = array('foo' => 'bar');
$data = http_build_query($data);

$opts = array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n" .
                     "Content-Length: " . strlen($data) . "\r\n",
        'content' => $data,
    ),
);
$context = stream_context_create($opts);
$html = file_get_contents('http://localhost/e/admin/test.html', false, $context);
echo $html;
?>

Example code 4: Open the URL with the fsockopen function and get the complete response, headers and body
The code is as follows:

<?php
function get_url($url, $cookie = false) {
    $url = parse_url($url);
    $query = $url['path'] . (isset($url['query']) ? '?' . $url['query'] : '');
    ec("Query: " . $query);
    $fp = fsockopen($url['host'], isset($url['port']) ? $url['port'] : 80, $errno, $errstr, 30);
    if (!$fp) {
        return false;
    } else {
        $request  = "GET $query HTTP/1.1\r\n";
        $request .= "Host: {$url['host']}\r\n";
        $request .= "Connection: close\r\n";
        if ($cookie) $request .= "Cookie: $cookie\r\n";
        $request .= "\r\n";
        fwrite($fp, $request);
        $result = '';
        while (!@feof($fp)) {
            $result .= @fgets($fp, 1024);
        }
        fclose($fp);
        return $result;
    }
}

// Get the HTML body of the URL, with the response headers removed
function GetUrlHTML($url, $cookie = false) {
    $rowdata = get_url($url, $cookie);
    if ($rowdata) {
        $body = stristr($rowdata, "\r\n\r\n");
        $body = substr($body, 4, strlen($body));
        return $body;
    }
    return false;
}
?>
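
A quick usage sketch for the pair above (the target URL is just an illustrative placeholder):

<?php
// Print only the page body, with the response headers stripped off.
echo GetUrlHTML('http://www.baidu.com/');
?>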

Example code 5: Open the URL with the fsockopen function and POST to get the complete response, headers and body
The code is as follows:

<?php
function http_post($URL, $data, $cookie, $referrer = "") {
    // parse the given URL
    $URL_info = parse_url($URL);

    // build the referrer
    if ($referrer == "") // if none is given, use a default value
        $referrer = "111";

    // build the query string from $data
    $values = array();
    foreach ($data as $key => $value)
        $values[] = "$key=" . urlencode($value);
    $data_string = implode("&", $values);

    // find out which port is needed - if none is given, use the standard one (=80)
    if (!isset($URL_info["port"]))
        $URL_info["port"] = 80;

    // build the POST request (HTTP requires \r\n line endings):
    $request  = "POST " . $URL_info["path"] . " HTTP/1.1\r\n";
    $request .= "Host: " . $URL_info["host"] . "\r\n";
    $request .= "Referer: $referrer\r\n";
    $request .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $request .= "Content-Length: " . strlen($data_string) . "\r\n";
    $request .= "Connection: close\r\n";
    $request .= "Cookie: $cookie\r\n";
    $request .= "\r\n";
    $request .= $data_string . "\r\n";

    $fp = fsockopen($URL_info["host"], $URL_info["port"]);
    fputs($fp, $request);
    $result = '';
    while (!feof($fp)) {
        $result .= fgets($fp, 1024);
    }
    fclose($fp);
    return $result;
}
printhr();
?>

Example code 6: Using the cURL library. Before using it, check php.ini to confirm that the curl extension has been enabled.
The code is as follows:

<?php
$ch = curl_init();
$timeout = 5;
curl_setopt($ch, CURLOPT_URL, 'http://www.baidu.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$file_contents = curl_exec($ch);
curl_close($ch);
echo $file_contents;
?>
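
You can also test for the extension at runtime instead of inspecting php.ini by hand; a minimal sketch using the standard extension_loaded() function:

<?php
// Runtime check for the curl extension (an alternative to reading php.ini).
if (!extension_loaded('curl')) {
    die('The curl extension is not enabled.');
}
?>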

About the cURL library:
cURL official website: http://curl.haxx.se/
cURL is a tool for transferring files using URL syntax. It supports FTP, FTPS, HTTP, HTTPS, SCP, SFTP, TFTP, TELNET, DICT, FILE, and LDAP. cURL also supports SSL certificates, HTTP POST, HTTP PUT, FTP uploads, HTTP form-based uploads, proxies, cookies, user+password authentication, resumed file transfers, HTTP proxy tunneling, Kerberos, and many other useful tricks.
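
Since cURL supports HTTP POST, the form submission from example code 3 can also be written with cURL. A minimal sketch, reusing the same placeholder URL and form field from example 3:

<?php
// An HTTP POST via cURL; the URL and form field are placeholders from example 3.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/e/admin/test.html');
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('foo' => 'bar')));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$html = curl_exec($ch);
curl_close($ch);
echo $html;
?>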
The printarr() helper used in the examples above is defined as follows:

<?php
function printarr(array $arr)
{
    echo "<br> Row field count: " . count($arr) . " <br>";
    foreach ($arr as $key => $value) {
        echo "$key = $value <br>";
    }
}
?>
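
The examples also call ec() and printhr(), which the original never defines; a minimal sketch of what they presumably look like (assumed implementations, not from the source):

<?php
// Assumed helpers: ec() echoes a value, printhr() prints a horizontal rule.
function ec($str)
{
    echo $str;
}
function printhr()
{
    echo "<hr>";
}
?>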

======================================================
PHP code to crawl a remote Web site's data
Many programming enthusiasts run into the same question: how do you crawl another site's HTML code the way a search engine does, and then extract the useful data from it? Here are some simple examples.

Ⅰ. Example of crawling a remote page title:
The code is as follows:

<?php
/*
+-------------------------------------------------------------
+ Code to crawl a page title; copy this fragment into a .php file and run it.
+-------------------------------------------------------------
*/

error_reporting(7);
$file = fopen("http://www.jb51.net/", "r");
if (!$file) {
    echo "<font color=red>Unable to open remote file.</font>\n";
    exit;
}
while (!feof($file)) {
    $line = fgets($file, 1024);
    // the original used eregi(); preg_match() with the /i flag works on modern PHP
    if (preg_match("/<title>(.*)<\/title>/i", $line, $out)) {
        $title = $out[1];
        echo $title;
        break;
    }
}
fclose($file);

// End
?>

Ⅱ. Example of fetching the HTML code of a remote Web page:

The code is as follows:

<?php
/*
+----------------
+ Dnsing Sprider
+----------------
*/

// the original omitted the port; fsockopen() needs it before $errno/$errstr
$fp = fsockopen("www.dnsing.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno) <br/>\n";
} else {
    $out  = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.dnsing.com\r\n";
    $out .= "Connection: close\r\n\r\n";
    fputs($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
// End
?>
Copy either of the two fragments above and run it to see the effect. They only fetch the raw page data; shaping that data into something useful varies from case to case, so experiment with them yourself.

===============================

The more interesting functions here are get_content_by_socket(), get_url(), get_content_url(), and get_content_object(); they may give you some ideas.
<?php

// Fetch all content URLs and save them to a file
function get_index($save_file, $prefix = "index_") {
    $count = 68;
    $i = 1;
    if (file_exists($save_file)) @unlink($save_file);
    $fp = fopen($save_file, "a+") or die("Open " . $save_file . " failed");
    while ($i < $count) {
        $url = $prefix . $i . ".htm";
        echo "Get " . $url . "...";
        // NB: get_content_url() below expects ($host_url, $file_contents);
        // this call from the original passes only the page contents
        $url_str = get_content_url(get_url($url));
        echo "OK\n";
        fwrite($fp, $url_str);
        ++$i;
    }
    fclose($fp);
}

// Fetch the target multimedia objects
function get_object($url_file, $save_file, $split = "||:**:--|") {
    if (!file_exists($url_file)) die($url_file . " does not exist");
    $file_arr = file($url_file);
    if (!is_array($file_arr) || empty($file_arr)) die($url_file . " has no content");
    $url_arr = array_unique($file_arr);
    if (file_exists($save_file)) @unlink($save_file);
    $fp = fopen($save_file, "a+") or die("Open save file " . $save_file . " failed");
    foreach ($url_arr as $url) {
        if (empty($url)) continue;
        echo "Get " . $url . "...";
        $html_str = get_url($url);
        // debug output left in the original; note that exit stops after the first URL
        echo $html_str;
        echo $url;
        exit;
        $obj_str = get_content_object($html_str);
        echo "OK\n";
        fwrite($fp, $obj_str);
    }
    fclose($fp);
}

// Traverse a directory and process each file's contents
function get_dir($save_file, $dir) {
    $dp = opendir($dir);
    if (file_exists($save_file)) @unlink($save_file);
    $fp = fopen($save_file, "a+") or die("Open save file " . $save_file . " failed");
    while (($file = readdir($dp)) !== false) {
        if ($file != "." && $file != "..") {
            echo "Read file " . $file . "...";
            $file_content = file_get_contents($dir . $file);
            $obj_str = get_content_object($file_content);
            echo "OK\n";
            fwrite($fp, $obj_str);
        }
    }
    fclose($fp);
}


// Fetch the content of the specified URL
function get_url($url) {
    $reg = '/^http:\/\/[^\/].+$/';
    if (!preg_match($reg, $url)) die($url . " is invalid");
    $fp = fopen($url, "r") or die("Open url: " . $url . " failed.");
    $content = '';
    while ($fc = fread($fp, 8192)) {
        $content .= $fc;
    }
    fclose($fp);
    if (empty($content)) {
        die("Get url: " . $url . " content failed.");
    }
    return $content;
}

// Use a raw socket to fetch the specified page
function get_content_by_socket($url, $host) {
    // the original omitted the port; 80 is assumed here
    $fp = fsockopen($host, 80) or die("Open " . $url . " failed");
    $header  = "GET /" . $url . " HTTP/1.1\r\n";
    $header .= "Accept: */*\r\n";
    $header .= "Accept-Language: zh-cn\r\n";
    $header .= "Accept-Encoding: gzip, deflate\r\n";
    $header .= "User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; Maxthon; InfoPath.1; .NET CLR 2.0.50727)\r\n";
    $header .= "Host: " . $host . "\r\n";
    $header .= "Cookie: cnzz02=2; rtime=1; ltime=1148456424859; cnzz_eid=56601755-\r\n";
    // the original sent conflicting keep-alive and close headers; close is kept so the loop terminates
    $header .= "Connection: close\r\n\r\n";

    fwrite($fp, $header);
    $contents = '';
    while (!feof($fp)) {
        $contents .= fgets($fp, 8192);
    }
    fclose($fp);
    return $contents;
}


// Extract matching URLs from the given content
function get_content_url($host_url, $file_contents) {

    // alternative filter patterns left in the original:
    // $reg = '/^(#|javascript.*?|ftp:\/\/.+|http:\/\/.+|.*?href.*?|play.*?|index.*?|.*?asp)+$/i';
    // $reg = '/^(down.*?\.html|\d+_\d+\.htm.*?)$/i';
    $rex = "/([hH][rR][eE][fF])\s*=\s*['\"]*([^>'\"\s]+)[\"'>]*\s*/i";
    $reg = '/^(down.*?\.html)$/i';
    preg_match_all($rex, $file_contents, $r);
    $result = ""; // array();
    foreach ($r as $c) {
        if (is_array($c)) {
            foreach ($c as $d) {
                if (preg_match($reg, $d)) { $result .= $host_url . $d . "\n"; }
            }
        }
    }
    return $result;
}

// Extract the multimedia file references from the given content
function get_content_object($str, $split = "||:**:--|") {
    $regx = "/href\s*=\s*['\"]*([^>'\"\s]+)[\"'>]*\s*(<b>.*?<\/b>)/i";
    preg_match_all($regx, $str, $result);

    if (count($result) == 3) {
        $result[2] = str_replace("<b>Multimedia: ", "", $result[2]);
        $result[2] = str_replace("</b>", "", $result[2]);
        $result = $result[1][0] . $split . $result[2][0] . "\n";
    }
    return $result;
}

?>
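
A usage sketch for the socket-based fetcher above (the path and host are placeholders):

<?php
// Fetch /page.htm from www.dnsing.com via the raw-socket helper; headers are included in the output.
echo get_content_by_socket("page.htm", "www.dnsing.com");
?>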

======================================================

A PHP function for fetching remote Web page content when the same domain name maps to multiple IPs

file_get_contents simply reads everything in one call, with all the work encapsulated.
fopen offers some encapsulation too, but it requires you to loop and collect all the data yourself.
fsockopen is raw socket work.
If you just need to read an HTML page, file_get_contents is the better choice.
If your company reaches the Internet through a firewall, a plain file_get_contents call generally won't work. You can still write the HTTP request straight to the proxy with socket operations, but that is more cumbersome.
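
For instance, file_get_contents can be pointed at a proxy through a stream context; a minimal sketch (the proxy address 192.168.0.1:8080 is a placeholder):

<?php
// Fetch through an HTTP proxy using a stream context; the proxy address is a placeholder.
$context = stream_context_create(array(
    'http' => array(
        'proxy'           => 'tcp://192.168.0.1:8080', // your proxy host:port
        'request_fulluri' => true,                     // proxies expect the full URL in the request line
    ),
));
$html = file_get_contents('http://www.baidu.com/', false, $context);
echo $html;
?>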
If you are sure the file is small, either approach is fine: fopen, or join('', file($file)). For example, if you only handle files under 1 KB, file_get_contents is the best fit.

If the file is large, or you cannot be sure of its size, file streams are better. There is no noticeable difference between fopen on a 1 KB file and fopen on a 1 GB file: long content simply takes longer to read, rather than killing the script.
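
As a sketch of the streaming approach, here is a fopen loop reading a possibly large response in fixed-size chunks instead of buffering it all at once (the URL is a placeholder):

<?php
// Stream a remote file in 8 KB chunks to keep memory usage flat.
$fp = fopen('http://www.baidu.com/', 'r');
if ($fp) {
    while (!feof($fp)) {
        $chunk = fread($fp, 8192);
        // process or save $chunk here instead of accumulating the whole body
        echo $chunk;
    }
    fclose($fp);
}
?>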

----------------------------------------------------
Http://www.phpcake.cn/archives/tag/fsockopen
PHP can fetch remote Web page content in several ways, for example with the file_get_contents and fopen functions.

<?php
echo file_get_contents("http://img.jb51.net/abc.php");
?>
However, with DNS round-robin or other load balancing, the same domain name may map to multiple servers and multiple IPs. Suppose img.jb51.net is resolved by DNS to the three IPs 72.249.146.213, 72.249.146.214, and 72.249.146.215; each time a user accesses img.jb51.net, the system picks one of the servers according to the load-balancing algorithm.
While working on a video project last week, I ran into a requirement: we needed to access a PHP interface script (abc.php) on each server in turn and query that server's transfer status.

Accessing http://img.jb51.net/abc.php directly with file_get_contents will not do, because it may repeatedly hit the same server.

Accessing http://72.249.146.213/abc.php, http://72.249.146.214/abc.php, and http://72.249.146.215/abc.php in turn does not work either, since the Web server on each of the three machines is configured with multiple virtual hosts.

Setting up a local hosts entry will not work either, because hosts cannot map the same domain name to multiple IPs.

The only way is through PHP and the HTTP protocol directly: connect to a specific IP, and send the img.jb51.net domain name in the Host header. So I wrote the following PHP function:
The code is as follows:

<?php

/************************
 * Purpose:  when the same domain name maps to multiple IPs, fetch the remote
 *           page content from one specific server
 * Created:  2008-12-09
 * Author:   Zhang Yi (img.jb51.net)
 * Parameters:
 *   $ip   the server's IP address
 *   $host the server's host name
 *   $url  the URL path on the server (excluding the domain name)
 * Returns:
 *   the remote page content on success
 *   false on failure
 ************************/
function HttpVisit($ip, $host, $url)
{
    $errstr = '';
    $errno = '';
    // the original omitted the port; 80 is assumed here
    $fp = fsockopen($ip, 80, $errno, $errstr, 90);
    if (!$fp)
    {
        return false;
    }
    else
    {
        $out  = "GET {$url} HTTP/1.1\r\n";
        $out .= "Host: {$host}\r\n";
        $out .= "Connection: close\r\n\r\n";
        fputs($fp, $out);

        $response = '';
        while ($line = fread($fp, 4096)) {
            $response .= $line;
        }
        fclose($fp);

        // strip the response headers
        $pos = strpos($response, "\r\n\r\n");
        $response = substr($response, $pos + 4);

        return $response;
    }
}

// How to call it:
$server_info1 = HttpVisit("72.249.146.213", "img.jb51.net", "/abc.php");
$server_info2 = HttpVisit("72.249.146.214", "img.jb51.net", "/abc.php");
$server_info3 = HttpVisit("72.249.146.215", "img.jb51.net", "/abc.php");
?>
