Plugin management:
Description: Starting with 5.0, plugins are packaged as independent gems, so each plugin can be updated on its own without waiting for a full Logstash release. The available management commands can be checked with ./bin/logstash-plugin --help, and ./bin/logstash-plugin list shows the installed plugins. All plugins actually live in the local ./vendor/bundle/jruby/1.9/gems/ directory.
Extension: If an extension plugin has been released on GitHub (see the note below for the address), it can be installed via ./bin/logstash-plugin install <plugin-name>, and upgrading is just as convenient with ./bin/logstash-plugin update <plugin-name>. To install or update an existing plugin from a local copy, use ./bin/logstash-plugin install/update <plugin-path>.
Note: By default, ./bin/logstash-plugin install/update downloads packages from https://rubygems.org, which is very slow, so it is strongly recommended to download the plugin manually from https://github.com/logstash-plugins/ or https://rubygems.org and then install/update it from the local path.
Codec plugins: https://www.elastic.co/guide/en/logstash/current/codec-plugins.html
Description: Codec plugins can be used in both the input and output sections to change the data representation of an event. Earlier versions only supported plain-text input, but now different kinds of data can be decoded at the input stage, so the data flow becomes input | decode | filter | encode | output. The arrival of codecs makes it much easier for Logstash to coexist with products that use other custom data formats; all codec plugins are listed at the URL above.
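To make the decode/encode split concrete, here is a minimal sketch (the codec choices are illustrative and not taken from the original article): one codec parses data in the input section, another re-serializes events in the output section.
input {
  stdin {
    codec => json          # decode: each incoming line is parsed as a JSON event
  }
}
output {
  stdout {
    codec => json_lines    # encode: events are written back out as JSON lines
  }
}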
Plugin Name: JSON (https://www.elastic.co/guide/en/logstash/current/plugins-codecs-json.html)
input {
  file {
    path => ["/xm-workspace/xm-webs/xmcloud/logs/*.log"]
    type => "dss-pubserver"
    codec => json
    start_position => "beginning"
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
Note: In many cases, to reduce the CPU load of the filter stage, predefined JSON data is fed in directly, which makes the filter/grok configuration unnecessary. This works very well for Apache/Nginx access logs; however, when the server runs as a proxy, some access-log variables such as $upstream_response_time may be "-" instead of a number, which causes validation errors on the input data. In that case the value has to be wrapped in double quotes, or replaced with sed, before the log is fed in from the terminal through the stdin input plugin.
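As a hedged sketch of that workaround (the sed pre-processing step is described in the note above, but the exact command is not shown in the original), the cleaned-up log can then be read through stdin with the json codec:
input {
  stdin {
    # the access log is assumed to have been passed through sed first,
    # so that "-" values (e.g. for $upstream_response_time) are quoted or
    # replaced and numeric fields never receive a bare "-"
    codec => json
  }
}
output {
  stdout {
    codec => rubydebug
  }
}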
Plugin Name: multiline (https://www.elastic.co/guide/en/logstash/current/plugins-codecs-multiline.html)
input {
  stdin {
    codec => multiline {
      patterns_dir => ["/xm-workspace/xm-apps/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns"]
      pattern => "^%{YEAR}/%{MONTHNUM2}/%{MONTHDAY} %{TIME}"
      negate => true
      what => "previous"
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
2016/07/09 10:19:49 [notice] 32715#0: start worker process 32717
2016/07/09 10:20:19 [error] 32716#0: *1 lua entry thread aborted: runtime error: ...stTool/test/openresty/authserver_lua/lua/auth-server.lua:387: attempt to index upvalue 'lua_url' (a boolean value)
stack traceback:
coroutine 0:
    ...stTool/test/openresty/authserver_lua/lua/auth-server.lua: in function 'process_msg'
    ...stTool/test/openresty/authserver_lua/lua/auth-server.lua:423: in function <...stTool/test/openresty/authserver_lua/lua/auth-server.lua:2>, client: 10.2.5.51, server: , request: "POST /webservice/c928/%e4%b8%8a%e8%99%9e%e9%9c%87%e8%bf%9c&1468030819562&e118f0d7aca5a0de1abadb94866173a4& HTTP/1.1", host: "10.2.5.51:9902"
2016/07/09 10:24:24 [notice] 32715#0: signal 15 (SIGTERM) received, exiting
Note: Program debug logs often contain rich content, printing many lines for a single event, so the input can be pre-processed into multiline events in the input stage via codec => multiline. Logstash ships with a large set of built-in regular expressions; patterns_dir takes an array of built-in or custom pattern directories and automatically scans and loads the pattern files under them, while pattern specifies the regular expression to match against. With negate and what combined as above, a line that does not match the pattern is attached to the previous line, and lines keep accumulating until the next line that matches the pattern starts a new event.
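For contrast with the configuration above, which uses negate => true, the following minimal sketch shows the default behaviour where lines that do match the pattern are folded into the previous event (the whitespace pattern is an assumed example, typical for indented stack traces):
input {
  stdin {
    codec => multiline {
      pattern => "^\s"      # lines starting with whitespace...
      what => "previous"    # ...are appended to the previous line (negate defaults to false)
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
}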
Extension: Application logs are often written with log4j. Although this kind of log can also be handled with codec => multiline, Logstash actually provides a dedicated input => log4j (https://www.elastic.co/guide/en/logstash/current/plugins-inputs-log4j.html) that directly handles the data received on a TCP port; a sketch of it follows.
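A minimal sketch of that input, assuming the default log4j SocketAppender port (adjust mode and port to your appender configuration):
input {
  log4j {
    mode => "server"        # listen for events sent by a log4j SocketAppender
    port => 4560            # assumed SocketAppender default port
  }
}
output {
  stdout {
    codec => rubydebug
  }
}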
Plugin Name: netflow (https://www.elastic.co/guide/en/logstash/current/plugins-codecs-netflow.html)
input {
  udp {
    port => 9995
    codec => netflow {
      definitions => "/xm-workspace/xm-apps/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-3.1.2/lib/logstash/codecs/netflow/netflow.yaml"
      versions => [5]
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
Description: NetFlow is a data-interchange format created by Cisco, commonly used to collect network flow data from devices such as routers so that the traffic can be analyzed further. For codec => netflow, the definitions option points to the YAML file containing the standard NetFlow field definitions (the bundled netflow.yaml shown above by default), and versions currently supports versions 5 and 9.
This article comes from the "Li-Yun Development Road" blog; please keep the source when reproducing it: http://xmdevops.blog.51cto.com/11144840/1884064