The following PHP code records visits to a website by the crawlers of Baidu, Google, Bing (MSN), Yahoo, Soso, Sogou, and Yodao, appending each hit to a log file.
The code is as follows:
    <?php
    // http://www.tongqiong.com

    // Return the name of the search-engine bot matched in the
    // User-Agent string, or false if none matched.
    function get_naps_bot()
    {
        $useragent = strtolower($_SERVER['HTTP_USER_AGENT']);

        // Use !== false: strpos() returns 0 when the needle is at
        // the start of the string, and 0 == false would be a miss.
        if (strpos($useragent, 'googlebot') !== false) {
            return 'Googlebot';
        }
        if (strpos($useragent, 'baiduspider') !== false) {
            return 'Baiduspider';
        }
        if (strpos($useragent, 'msnbot') !== false) {
            return 'MSNbot';
        }
        if (strpos($useragent, 'slurp') !== false) {
            return 'Yahoo Slurp';
        }
        if (strpos($useragent, 'sosospider') !== false) {
            return 'Sosospider';
        }
        if (strpos($useragent, 'sogou spider') !== false) {
            return 'Sogou spider';
        }
        if (strpos($useragent, 'yodaobot') !== false) {
            return 'YodaoBot';
        }
        return false;
    }

    $searchbot = get_naps_bot();
    if ($searchbot) {
        $date         = date("Y-m-d G:i:s");
        $tlc_thispage = addslashes($_SERVER['HTTP_USER_AGENT']);
        $url          = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
        $file         = "www.tongqiong.com.txt";

        // Open in append mode so each crawler visit adds a line
        // instead of overwriting the log.
        $data = fopen($file, "a");
        fwrite($data, "Time: $date robot: $searchbot agent: $tlc_thispage URL: $url\n");
        fclose($data);
    }
    // http://www.tongqiong.com