News thief code: a PHP thief program

<?php
/**
 * $Id$
 *
 * Filename: referer.php
 * Author  : ear fart
 * Email   : lianxiwoo@gmail.com | hotmail.com
 * Create  : 20060831
 * LastMod : 2006
 * Usage   :
 * International Village debut
 */
// ini_set('display_errors', 1);
// error_reporting(E_ALL ^ E_NOTICE);
// header("Content-type: text/html; charset=UTF-8");
set_time_limit(5);

$referer = 'http://php.club.goodoon.com';
$data = 'Hello PHPX!';
$host = "www.phpx.com";
$path = "/happy/index.php";
$port = '80';
$user = 'Food fart';
$pswd = 'fart';
// {{{ Method 1: fsockopen
/*
// Once the formhash value is handled you can log in; that is not done here.
// If you have used Discuz, give it a try yourself.
$path = "/happy/logging.php?action=login";
$data = "formhash=&referer=$referer&loginmode=normal&cookietime=2592000&loginfield=username&username=$user&password=$pswd&questionid=0";
*/
$q = '';
$q .= "GET $path HTTP/1.1\r\n";
$q .= "Host: $host\r\n";
$q .= "Referer: $referer\r\n"; // This is the key line: the spoofed referer.
$q .= "Content-type: application/x-www-form-urlencoded\r\n";
$q .= "Content-length: " . strlen($data) . "\r\n";
$q .= "Accept: */*\r\n";
$q .= "Connection: close\r\n";
$q .= "\r\n";
$q .= $data;

$fp = fsockopen($host, $port);
fputs($fp, $q);
$r = '';
while (!feof($fp)) {
    $r .= fgets($fp);
}
fclose($fp);
// echo $r;
//}}}
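// The raw response in $r still includes the status line and headers before the body.
// The following split is my own addition (a minimal sketch, not part of the original
// post); it assumes the whole response fits in memory:
$parts = explode("\r\n\r\n", $r, 2);
$headers = $parts[0];
$body = isset($parts[1]) ? $parts[1] : '';
// echo $body;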
// {{{ Method 2: the curl functions (by the way, curl is really good and worth the effort to learn ^_^)
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.phpx.com/happy/logging.php?action=login");
curl_setopt($ch, CURLOPT_REFERER, $referer); // You could just as well claim you came from Google.
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
// In the line above you could of course claim to be Baidu; passing the browser's own value
// through, as here, is already enough for the thief function. If you build your own spider,
// this is where you would fake CURLOPT_USERAGENT.
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // have curl_exec() return the response so $r holds it
curl_setopt($ch, CURLOPT_POST, 1); // send it as a POST
curl_setopt($ch, CURLOPT_POSTFIELDS, array('username' => $user, 'password' => $pswd)); // the POST fields
// Of course you could log in this way, but Discuz still has the formhash problem;
// a site without that kind of safeguard might let you log in directly
// (see the formhash sketch after this block). I will leave it at that.
$r = curl_exec($ch);
curl_close($ch);
//}}}
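// The formhash mentioned above would have to be scraped from the login form first.
// A hedged sketch of how that could look -- the regular expression and the hidden-field
// name are my assumptions based on typical Discuz markup, not taken from the original post:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.phpx.com/happy/logging.php?action=login");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$loginPage = curl_exec($ch);
curl_close($ch);
if (preg_match('/name="formhash"\s+value="([^"]*)"/i', $loginPage, $m)) {
    $formhash = $m[1]; // would then be added to the POST fields above
}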
// {{{ Method 3: the stream functions
$opts = array(
    "http" => array(
        'method' => "GET",
        'header' => "Referer: http://php.club.goodoon.comrnAccept: */*\r\nAccept-language: zh-cn\r\nCookie: username=ear fart\r\n"
        /* the username value may need to be urlencoded here */
    )
);
$context = stream_context_create($opts);
// You can also use stream_context_set_option() to set more options.
$fp = fopen("http://www.phpx.com/happy/index.php", "r", false, $context);
fpassthru($fp);
fclose($fp);
//}}}
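// The same stream context also works with file_get_contents(), which is often more
// convenient than fopen()/fpassthru() when you want the page in a variable rather than
// streamed straight to output. A minimal sketch, reusing $context from above:
$page = file_get_contents("http://www.phpx.com/happy/index.php", false, $context);
// echo $page;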
/**
In conclusion, nothing much is actually accomplished here.
One thing should be clear, though:
the above shows how to disguise the HTTP referer with PHP, the language we all love.
So the referer is not something to be trusted completely.
These are the methods I know of; I hope they bring you a cool breeze in the hot summer. When they do, shout it out! Pai_^
And this is about more than the referer: as you can see, we can disguise quite a few things,
so putting together something like a thief program is easy, heh.
Of course a thief program still has more work to do: once the page content has been fetched it has to be parsed,
which is where the preg_* functions come in (a small taste follows after the script).
I always focus on what is useful, so later I will write about the beauty of the preg_* regular-expression functions ^_^
If you like this, vote for it! Come on! Come on, boys! Come on, phpx!
One more thing: on Linux, wget may be the best candidate for crawling pages. Give it a try; it is powerful, as everyone knows.
*/
?>
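
As a small taste of the preg_* parsing mentioned above, here is a minimal sketch of pulling the <title> out of a fetched page. The URL, pattern, and variable names are my own illustration, not part of the original script:

<?php
// Grab a page and extract its <title> -- the kind of parsing a thief program
// does on the content it fetches.
$html = file_get_contents("http://www.phpx.com/happy/index.php");
if (preg_match('/<title>(.*?)<\/title>/is', $html, $m)) {
    echo trim($m[1]);
}
?>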