Settings in nginx.conf:

log_format main '$server_name $remote_addr - $remote_user [$time_local] "$request" '
                '$status $body_bytes_sent "$http_referer" "$http_user_agent" '
                '"$http_x_forwarded_for" $upstream_addr $request_time $upstream_response_time';
access_log /logs/nginx/access.log main;
error_log /logs/nginx/error.log;

These directives can be placed in the http {} block or in a server {} block; where you put them determines their scope, so they are configured differently.
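For example, a minimal sketch of how the placement affects scope (the server name and log path here are illustrative, not from the original configuration): log_format is declared once in the http {} block, and each server {} block can point access_log at its own file using that format.

http {
    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status $body_bytes_sent "$http_referer"';

    server {
        listen 80;
        server_name example.com;   # illustrative virtual host
        # overrides any http-level access_log for this host only
        access_log /logs/nginx/example.access.log main;
    }
}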
Description
Requests for favicon.ico flood the Nginx error_log with entries that bury the messages we actually need to see, so we turn that logging off with log_not_found off.
Implementation:
Put the following configuration into the server {} block so that no log entry is written when favicon.ico is not present:
location = /favicon.ico {
    log_not_found off;
    access_log off;
}
Objective
Have Nginx format its log as JSON, ship it with Logstash straight to Elasticsearch, and then display and analyze it through the Kibana GUI.
Important: write the Nginx log in JSON format. Nginx's default space-delimited log would force Logstash to do regular-expression matching, which costs far too much CPU. Also configure the firewall on the Elasticsearch machine so that only the specified Logstash machines are allowed to connect.
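A minimal sketch of such a JSON log format (this is not the original author's exact format; the field selection and log path are illustrative, and the escape=json parameter requires nginx 1.11.8 or newer, later than the 1.4.7 baseline mentioned below):

log_format json_log escape=json
    '{"time_local":"$time_local",'
    '"remote_addr":"$remote_addr",'
    '"request":"$request",'
    '"status":"$status",'
    '"body_bytes_sent":"$body_bytes_sent",'
    '"http_referer":"$http_referer",'
    '"http_user_agent":"$http_user_agent",'
    '"request_time":"$request_time"}';
access_log /logs/nginx/access_json.log json_log;

With a log like this, Logstash can parse each line as JSON instead of running grok regular expressions over it.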
1. Write the shell script for log cutting and customize the directory.
# vi /data/nginx/cut_nginx_log.sh
Enter the code:
#!/bin/bash
# This script runs at 00:00
function cutAccess()
{
    dir=$1
    newdir="${dir}/$(date -d "yesterday" +"%Y")/$(date -d "yesterday" +"%m")"
    suffix=$(date -d "yesterday" +"%Y%m%d")
    mkdir -p $newdir
    # assumed completion: the original snippet is truncated at this mv line
    mv ${dir}/access.log ${newdir}/access_${suffix}.log
}
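A hedged sketch of how a function like this is usually wired up (the log directory, pid file path, and crontab line below are assumptions, not taken from the original script): after the access log is moved away, nginx must be told to reopen it, typically with the USR1 signal, and the script itself is run from cron at midnight.

# hypothetical invocation; adjust the directory and pid path to your installation
cutAccess "/data/nginx/logs"
kill -USR1 $(cat /data/nginx/logs/nginx.pid)   # ask the nginx master process to reopen access.log

# example crontab entry to run the script every night at 00:00
# 0 0 * * * /bin/bash /data/nginx/cut_nginx_log.sh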
Nginx log configuration and organization

Environment introduction:
OS: CentOS 6.x
Nginx version: 1.4.7 or later

Printing Nginx logs is essential for troubleshooting errors. Because Nginx is largely driven by modules, see the log module reference: http://nginx.org/en/docs/http/ngx_http_log_module.html
This script is an nginx log-cutting script. The variable Nginx_path is the nginx installation directory and Log_path is the nginx log directory; you only need to modify these two variables and then add the script to a scheduled task.

#!/bin/bash
Nginx_path=/opt/nginx_web
Log_path=/opt/nginx_web/l
We often run into all kinds of nginx error log entries, and the cause of a problem can usually be worked out from them, but rarely in a systematic way. Here I record a fairly systematic explanation of nginx's error.log messages that I found online, so it is easy to refer back to later.
AWStats is a Perl-based web log analysis tool, while JAWStats is a PHP-based statistics suite designed to provide a more elegant graphical interface for AWStats. 1. Installing and configuring AWStats is simple, but you must first confirm that the Perl environment on your server is ready. Considering that the website traffic is not large, the daily nginx log
AWStats analyzes the nginx log file, saves the generated results (as TXT files) in the /var/www/AWStats directory, and uses Apache to display the generated results.
The nginx domain name is www.a.com:80.
LogFile="/usr/local/mybin/nginx/logs/access.log"    # path to the nginx access log that AWStats will analyze
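A minimal sketch of how the statistics are then generated (the AWStats install path, the config file name, and the DirData setting are assumptions based on a default AWStats layout, not taken from the original article):

# in the site's AWStats configuration (e.g. awstats.www.a.com.conf), assumed settings:
# LogFile="/usr/local/mybin/nginx/logs/access.log"
# DirData="/var/www/AWStats"      # the TXT result files end up here

# update the statistics for www.a.com from the nginx access log
perl /usr/local/awstats/wwwroot/cgi-bin/awstats.pl -config=www.a.com -update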
})?|-)" %{HOST:domain} %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{QS:referrer} %{QS:useragent} "(%{IP:x_forwarder_for}|-)"
Because this is a test environment, I have Logstash read the Nginx log file directly to collect the Nginx log, and I only read the Nginx access log.
After seeing the browser statistics done by @Shenzhou, a guru in our group, I wanted to practice the same thing myself, so I stayed in all day to do it (still had to eat, of course).

Train of thought:
First, write an automatic timed task that runs a script at 23:55 every night; the script cuts the log and converts it into the required JSON data.
Then expose an access interface that returns the parsed JSON data, such as browser model and system model.
Finally, draw a pie chart from the interface data.

Timed task
The following is an nginx log cut script, cut by day.
$(date +%d) gets the day of the month; for example, on May 18 it returns 18.
In this way the log files cycle monthly: only one month of logs is kept, and there is no need to worry about cleaning up expired logs.
Schedule the script for execution in crontab, for example as in the sketch below.
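A minimal sketch of the day-of-month rotation described above (the log directory, pid file path, and script name are assumptions, not the original author's code):

#!/bin/bash
# rotate the nginx access log into a file named after the day of the month,
# so after one month the oldest file is simply overwritten
log_dir="/data/nginx/logs"                  # assumed log directory
day=$(date +%d)                             # e.g. 18 on May 18
mv ${log_dir}/access.log ${log_dir}/access_${day}.log
kill -USR1 $(cat ${log_dir}/nginx.pid)      # tell nginx to reopen access.log

# example crontab entry: run every night at 23:55
# 55 23 * * * /bin/bash /data/nginx/cut_by_day.sh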
Because the Nginx logs were not split before, whenever I wanted to read them I would find log files of more than 10 GB or even hundreds of GB, so I wanted to use Python to write an nginx log splitter.
1. Write the script cut_nginx_log.sh, place it in the nginx/sbin/ directory, and make it executable.

#!/bin/bash
# This script runs at 00:00
# directory where the logs are placed
Logs_path="/var/www/logs"
# nginx directory
Nginx_path="/var/www/nginx"
# create the date directory
mkdir -p ${Logs_path}/$(date -d "yesterday" +"%y.%m")/
# move nginx/logs/access.log
Get a basic understanding of a few commands first.
The script is based on the log format above; if your log format is different, you will need to adjust the field numbers passed to awk.
Analyzing the UserAgent in the log
The code is as follows
cat access_20130704.log | awk -F '"' '{print $
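A complete, hedged version of this kind of pipeline (the file name is from the line above; the field number assumes a combined-style format in which the user agent is the 6th field when the line is split on double quotes, so adjust it for your own log_format):

# count the 20 most common user agents in the access log
cat access_20130704.log | awk -F '"' '{print $6}' | sort | uniq -c | sort -rn | head -20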
1. Write a cut script. Enter the vhost directory:
cd /usr/local/nginx/conf/vhosts
2. Open the script file:
vim /usr/local/sbin/nginx_logrotate.sh
(log files are placed under this directory)
Add the following content:

#!/bin/bash
d=`date -d "-1 day" +%F`
[ -d /tmp/nginx_log ] || mkdir /tmp/nginx_log
mv /tmp/access.log /tmp/nginx_log/$d.log
# assumed completion: the original line is cut off after "/etc/init.d/ng"
/etc/init.d/nginx reload