Edit /usr/local/apache/conf/extra/http_vhost.conf and configure logging in the added virtual host as follows:
SetEnvIfNoCase User-Agent Baiduspider Baidu_robot # Baidu access log
SetEnvIfNoCase User-Agent Googlebot Google_robot # Google access log
SetEnvIfNoCase User-Agent 360Spider 360_robot
SetEnvIfNoCase User-Agent Iaskspider Xinglang_robot
SetEnvIfNoCase User-Agent Sogou Sogou_robot
SetEnvIfNoCase User-Agent Yodaobot Wangyi_robot
CustomLog "|/usr/local/apache2/bin/rotatelogs -l /usr/local/apache2/logs/cn.site_%y%m%d.log 86400" combined env=!image-request
CustomLog "|/usr/local/apache2/bin/rotatelogs -l /usr/local/apache2/logs/cn.google_%y%m%d.log 86400" combined env=Google_robot
CustomLog "|/usr/local/apache2/bin/rotatelogs -l /usr/local/apache2/logs/cn.baidu_%y%m%d.log 86400" combined env=Baidu_robot
CustomLog "|/usr/local/apache2/bin/rotatelogs -l /usr/local/apache2/logs/cn.360_%y%m%d.log 86400" combined env=360_robot
CustomLog "|/usr/local/apache2/bin/rotatelogs -l /usr/local/apache2/logs/cn.xinglang_%y%m%d.log 86400" combined env=Xinglang_robot
CustomLog "|/usr/local/apache2/bin/rotatelogs -l /usr/local/apache2/logs/cn.sougou_%y%m%d.log 86400" combined env=Sogou_robot
CustomLog "|/usr/local/apache2/bin/rotatelogs -l /usr/local/apache2/logs/cn.wangyi_%y%m%d.log 86400" combined env=Wangyi_robot
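Note that env=!image-request on the main site log excludes any request tagged with the image-request variable, which must be set elsewhere in the configuration. A minimal sketch of such a rule (the file-extension list here is an assumption, not part of the original article):

```apache
# Hypothetical: tag requests for static image files so the main
# site log can skip them via env=!image-request
SetEnvIfNoCase Request_URI "\.(gif|jpe?g|png|bmp|ico)$" image-request
```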
Apache will then rotate the logs daily, and each search engine crawler's requests will be recorded in its own access log file.
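To sanity-check which tag a given User-Agent string would receive, you can reproduce the case-insensitive substring match that SetEnvIfNoCase performs with grep -i (a standalone sketch for illustration only; Apache does the real tagging at request time):

```shell
# SetEnvIfNoCase User-Agent Baiduspider Baidu_robot matches any
# User-Agent containing "baiduspider", ignoring case.
ua="Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"
if printf '%s\n' "$ua" | grep -qi "baiduspider"; then
    echo "matched: this request would be tagged Baidu_robot"
fi
```

Against a live server, you could likewise send a request with a spoofed crawler User-Agent (for example, curl -A "Baiduspider" followed by your site URL) and confirm the hit lands in the per-crawler log rather than the main site log.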
This article is from the "11083647" blog; please be sure to keep this source: http://11093647.blog.51cto.com/11083647/1745341
Configure Apache logs to record the access records of different search engine crawlers separately