The grok-patterns file ships with a set of named regular-expression building blocks for log parsing, including patterns for Apache access logs (which also work for Nginx logs). The Nginx log analysis is configured as follows.

1. Configure the Nginx log format:

```
log_format main '$remote_addr [$time_local] "$request" $status $body_bytes_sent "$http_referer" "$request_time"';
access_log /var/log/nginx/access.log main;
```

This trims the Nginx log down to the fields we actually need. Be careful to follow the agreed log format exactly: while debugging, a few stray spaces in my Nginx log format cost me half a day of failed Logstash parsing.

2. Configure Logstash:

```
input {
  file {
    path => "/var/log/nginx/*.log"
  }
}
filter {
  if [path] =~ "access" {
    mutate {
      replace => { "type" => "nginx_access" }
    }
    grok {
      match => { "message" => "%{IPORHOST:clientip} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:method} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{QS:referrer} %{QS:timeconsumer}" }
    }
    date {
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}
output {
  redis {
    data_type => "list"
    key => "logstash"
  }
}
```

With this in place, the request URL, request method, request time, response bytes, and request timestamp in the Nginx log are all parsed into separate fields, so Kibana can later be used to build dashboards for monitoring and analysis. I am also trying to push business logs through Logstash; each business log format needs its own set of parsing rules, and the related expressions will be refined in a follow-up post.
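Grok patterns ultimately compile down to named-group regular expressions, so a pattern like the one above can be sanity-checked outside Logstash before deployment. The sketch below is my own approximation in Python (not part of the Logstash setup): the group names mirror the grok captures, and the sample log line is hypothetical but matches the log_format defined earlier.

```python
import re

# Rough Python equivalent of the grok pattern used in the Logstash filter.
# Group names mirror the grok capture names (clientip, timestamp, method, ...).
NGINX_ACCESS = re.compile(
    r'(?P<clientip>\S+) '                        # %{IPORHOST:clientip}
    r'\[(?P<timestamp>[^\]]+)\] '                # \[%{HTTPDATE:timestamp}\]
    r'"(?:(?P<method>\S+) (?P<request>\S+)'      # %{WORD:method} %{NOTSPACE:request}
    r'(?: HTTP/(?P<httpversion>[\d.]+))?'        # optional HTTP/%{NUMBER:httpversion}
    r'|(?P<rawrequest>[^"]*))" '                 # fallback: %{DATA:rawrequest}
    r'(?P<response>\d+) '                        # %{NUMBER:response}
    r'(?:(?P<bytes>\d+)|-) '                     # %{NUMBER:bytes} or "-"
    r'"(?P<referrer>[^"]*)" '                    # %{QS:referrer}
    r'"(?P<timeconsumer>[^"]*)"'                 # %{QS:timeconsumer}
)

# Hypothetical log line in the log_format defined above.
sample = ('203.0.113.7 [21/Nov/2015:18:00:01 +0800] '
          '"GET /index.html HTTP/1.1" 200 512 '
          '"http://example.com/" "0.004"')

match = NGINX_ACCESS.match(sample)
if match:
    fields = match.groupdict()
    print(fields["method"], fields["request"],
          fields["response"], fields["timeconsumer"])
```

If the regex fails to match a real line from access.log, the usual culprit is whitespace or quoting that differs between log_format and the pattern, which is exactly the class of mistake mentioned above.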
Logstash patterns, log analysis (i)