Logstash Plug-in:
Input plugin:
File: reads a stream of events from the specified file;
it uses the filewatch Ruby gem library to listen for changes to the file.
Sincedb: records, for each monitored file, the inode, major device number, minor device number, and current read position (a sample record is shown below);
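For example, a sincedb record is one line per watched file, roughly of the form below (the values are only illustrative):
263420 0 64768 1024
i.e. the inode, major device number, minor device number, and the byte offset that has already been read.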
Here is a simple example of collecting logs:
input {
    file {
        path => ["/var/log/messages"]
        type => "system"
        start_position => "beginning"
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
["/var/log/messages"] can contain multiple files [Item1, item2,...] Start_position = "Beginning" means to start reading from the first line
UDP: reads messages from a network connection over the UDP protocol. The required parameter port specifies the port to listen on, and host specifies the address the listener binds to.
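A minimal sketch of a UDP input using both parameters (the address, port, and type name here are placeholders):
input {
    udp {
        host => "0.0.0.0"    # address to bind the listener to
        port => 514          # port to listen on (required)
        type => "udplog"
    }
}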
collectd: a performance-monitoring program written in C that runs as a daemon. It can collect performance data about every aspect of the system, store the results, and, through its network plugin, send the data collected on one machine to other hosts.
The collectd package is in the EPEL repository: yum -y install epel-release; yum -y install collectd. The collectd configuration file is /etc/collectd.conf.
vim /etc/collectd.conf; under the global settings for the daemon, give it a hostname: Hostname "node1"
Find the LoadPlugin section and enable (uncomment) LoadPlugin df and LoadPlugin network.
Below that, define a <Plugin network> ... </Plugin> section:
<Plugin network>
    <Server "192.168.204.135" "25826">
    </Server>
</Plugin>
This means the collected data is sent to the host 192.168.204.135, which listens on port 25826.
service collectd start
Logstash is installed on 192.168.204.135; here is an example UDP configuration file:
input {
    udp {
        port => 25826
        codec => collectd {}
        type => "collectd"
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
codec => collectd {} decodes the data sent by collectd.
type => "collectd": the type can be named arbitrarily; it is mainly used to tag events (see the sketch below).
logstash -f /etc/logstash/conf.d/udp.conf --configtest
logstash -f /etc/logstash/conf.d/udp.conf
With this running, we can see the messages sent by collectd.
Redis plugin:
Reads data from Redis; both Redis channels (pub/sub) and lists are supported.
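A minimal sketch of a redis input, assuming a local Redis instance and a list key named "logstash" (both are placeholders):
input {
    redis {
        host => "127.0.0.1"      # placeholder Redis address
        port => 6379             # default Redis port
        data_type => "list"      # or "channel" for pub/sub
        key => "logstash"        # hypothetical list/channel name
    }
}
output {
    stdout {
        codec => rubydebug
    }
}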
Filter plugin:
Used to apply processing to an event before it is emitted through an output.
Grok: used to parse and structure text data; it is currently the best choice in Logstash for turning unstructured log data into structured, queryable data.
Typical targets include syslog, Apache, and Nginx logs.
Pattern definition location: /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.3.0/patterns/grok-patterns
Syntax format:
%{SYNTAX:SEMANTIC}
SYNTAX: the name of a predefined pattern;
SEMANTIC: a custom identifier for the text being matched;
Example: 1.1.1.1 GET /index.html 30 0.23
match => { "message" => "%{IP:clientip} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
vim groksample.conf, an example configuration:
input {
    stdin {}
}
filter {
    grok {
        match => { "message" => "%{IP:clientip} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
logstash -f /etc/logstash/conf.d/groksample.conf --configtest
logstash -f /etc/logstash/conf.d/groksample.conf
Type in 1.1.1.1 GET /index.html 30 0.23 to obtain the result:
1.1.1.1 GET /index.html 30 0.23
{
       "message" => "1.1.1.1 GET /index.html 30 0.23",
      "@version" => "1",
    "@timestamp" => "2016-07-20T11:55:31.944Z",
          "host" => "centos7",
      "clientip" => "1.1.1.1",
        "method" => "GET",
       "request" => "/index.html",
         "bytes" => "30",
      "duration" => "0.23"
}
Custom grok patterns: grok patterns are written with regular expressions, and their metacharacters differ somewhat from those of other regex-based tools such as awk/sed/grep/PCRE.
In practice the need to define custom patterns is usually small; if one is needed, it can be kept in its own pattern file, as sketched below.
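One approach (a sketch with a hypothetical pattern name and directory) is to keep the custom pattern in its own file and point the grok filter at that directory with patterns_dir:
# ./patterns/extra contains one pattern definition per line, e.g.:
# MYDURATION [0-9]+\.[0-9]+
filter {
    grok {
        patterns_dir => ["./patterns"]    # hypothetical directory with custom pattern files
        match => { "message" => "%{IP:clientip} %{MYDURATION:duration}" }
    }
}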
An example of matching the Apache log: vim apachesample.conf
input {
    file {
        path => ["/var/log/httpd/access_log"]
        type => "apachelog"
        start_position => "beginning"
    }
}
filter {
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
How to match the Nginx log:
Add the following to the end of the file /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.3.0/patterns/grok-patterns:
#Nginx log
NGUSERNAME [a-zA-Z\.\@\-\+_%]+
NGUSER %{NGUSERNAME}
NGINXACCESS %{IPORHOST:clientip} - %{NOTSPACE:remote_user} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{QS:referrer} %{QS:agent} %{NOTSPACE:http_x_forwarded_for}
yum -y install epel-release; yum -y install nginx; systemctl start nginx
vim nginxsample.conf
input {
    file {
        path => ["/var/log/nginx/access.log"]
        type => "nginxlog"
        start_position => "beginning"
    }
}
filter {
    grok {
        match => { "message" => "%{NGINXACCESS}" }
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
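As with the earlier examples, the configuration can be syntax-checked first:
logstash -f /etc/logstash/conf.d/nginxsample.conf --configtest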
logstash -f /etc/logstash/conf.d/nginxsample.conf