Logstash is mainly used for log collection and analysis. Together with Elasticsearch and Kibana it is easy to use, and plenty of installation tutorials can be found online.
Recommended Reading
- Elasticsearch: The Definitive Guide
- Mastering Elasticsearch
- Kibana Chinese Guide
- The Logstash Book
Objective
Take ordinary Nginx access logs as input, filter them into the required fields, and store them in Elasticsearch.
Sample log lines:

115.182.31.11 - - [02/Aug/2013:08:34:59 +0800] "GET /v2/get?key=0b0c1c5523aa40c3a5dcde4402947693&appid=153&appname=%e6%96%97%e5%9c%b0%e4%b8%bb%e5%8d%95%e6%9c%ba%e7%89%88&uuid=861698005693444&client=1&operator=1&net=2&devicetype=1&adspacetype=1&category=2&ip=117.136.22.36&os_version=2.2.2&aw=320&ah=50&timestamp=1375403699&density=1.5&pw=800&ph=480&device=zte-u%2bv880&sign=1F6FD0992CA09E8525B0F7165A928A2A HTTP/1.1" 200 67 "-" "-"
117.135.137.180 - - [02/Aug/2013:08:35:08 +0800] "GET /v2/get?format=json&key=47378200063c41fe90eff85f11ca4d2f&appid=324&appname=%25e5%258d%2595%25e6%259c%25ba%25e6%2596%2597%25e5%259c%25b0%25e4%25b8%25bb&uuid=b51d63a91da5a4111e6cc1fb2c2538d5&client=1&operator=1&net=2&devicetype=1&adspacetype=1&category=28&ip=117.136.7.111&os_version=4.0.4&aw=320&ah=50&timestamp=1375403708&sign=9a00b63a04c165deea70dedd6b747697 HTTP/1.0" 200 776 "-" "-"
115.182.31.11 - - [02/Aug/2013:08:35:00 +0800] "GET /v2/get?key=0b0c1c5523aa40c3a5dcde4402947693&appid=153&appname=%e6%96%97%e5%9c%b0%e4%b8%bb%e5%8d%95%e6%9c%ba%e7%89%88&uuid=860173017274352&client=1&operator=2&net=3&devicetype=1&adspacetype=1&category=2&ip=120.7.195.5&os_version=2.3.5&aw=320&ah=50&timestamp=1375403700&density=1.5&long=39&lat=0699733%2c116&pw=854&ph=480&device=mi-one%2bplus&sign=F65A4B2B2681AC489F65ACF49E3D8EBD HTTP/1.1" 200 67 "-" "-"
115.182.31.12 - - [02/Aug/2013:08:34:58 +0800] "GET /v2/get?key=0b0c1c5523aa40c3a5dcde4402947693&appid=153&appname=%e6%96%97%e5%9c%b0%e4%b8%bb%e5%8d%95%e6%9c%ba%e7%89%88&uuid=863802017354171&client=1&operator=1&net=3&devicetype=1&adspacetype=1&category=2&ip=123.121.144.120&os_version=2.3.5&aw=320&ah=50&timestamp=1375403698&density=1.5&long=40&lat=11183975%2c116&pw=854&ph=480&device=mi-one%2bplus&sign=0c74cf53a4b6adfe5e218f4fab920da3 HTTP/1.1" 200 67 "-" "-"
115.182.31.8 - - [02/Aug/2013:08:35:07 +0800] "GET /v2/get?key=0b0c1c5523aa40c3a5dcde4402947693&appid=153&appname=%e6%96%97%e5%9c%b0%e4%b8%bb%e5%8d%95%e6%9c%ba%e7%89%88&uuid=868247013598808&client=1&operator=4&net=2&devicetype=1&adspacetype=1&category=2&ip=117.136.20.88&os_version=2.3.5&aw=320&ah=50&timestamp=1375403707&density=1.5&pw=800&ph=480&device=lenovo%2ba520gray&sign=43d5260eb2b89f5984b513067e074f5e HTTP/1.1" 200 67 "-" "-"
After Logstash has parsed the log, each entry is output in the following form:
{
          "message" => "115.182.31.8 - - [02/Aug/2013:05:24:12 +0800] \"GET /v2/get?key=0b0c1c5523aa40c3a5dcde4402947693&appid=153&appname=斗地主单机版&uuid=355696050506936&client=1&operator=1&net=3&devicetype=1&adspacetype=1&category=2&ip=113.228.122.247&os_version=4.1.1&aw=320&ah=50&timestamp=1375392249&density=2&long=41&lat=917705,123&pw=1280&ph=720&device=gt-n7100&sign=e9853bb1e8bd56874b647bc08e7ba576 HTTP/1.1\" 200 67 \"-\" \"-\"",
         "@version" => "1",
       "@timestamp" => "2015-01-15T08:06:26.340Z",
             "host" => "vovo",
             "path" => "/home/vovo/access.log",
           "client" => "1",
            "ident" => "-",
             "auth" => "-",
        "timestamp" => "1375392249",
             "verb" => "GET",
          "request" => "/v2/get?key=0b0c1c5523aa40c3a5dcde4402947693&appid=153&appname=斗地主单机版&uuid=355696050506936&client=1&operator=1&net=3&devicetype=1&adspacetype=1&category=2&ip=113.228.122.247&os_version=4.1.1&aw=320&ah=50&timestamp=1375392249&density=2&long=41&lat=917705,123&pw=1280&ph=720&device=gt-n7100&sign=e9853bb1e8bd56874b647bc08e7ba576",
     "http_version" => "1.1",
         "response" => "200",
            "bytes" => "67",
              "key" => "0b0c1c5523aa40c3a5dcde4402947693",
            "appid" => "153",
          "appname" => "斗地主单机版",
             "uuid" => "355696050506936",
         "operator" => "1",
              "net" => "3",
       "devicetype" => "1",
      "adspacetype" => "1",
         "category" => "2",
               "ip" => "113.228.122.247",
       "os_version" => "4.1.1",
               "aw" => "320",
               "ah" => "50",
          "density" => "2",
             "long" => "41",
              "lat" => "917705,123",
               "pw" => "1280",
               "ph" => "720",
           "device" => "gt-n7100",
             "sign" => "e9853bb1e8bd56874b647bc08e7ba576"
}
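As a quick sanity check (not part of the Logstash setup), the Unix epoch that ends up in the timestamp field agrees with the date shown inside the raw message. A small Python snippet, converting in the log's +0800 zone:

```python
from datetime import datetime, timezone, timedelta

# "timestamp" => "1375392249" is a Unix epoch sent by the client;
# converted to the log's +0800 zone it should land on the same date
# that appears in the raw log line (02/Aug/2013, around 05:24).
epoch = 1375392249
cst = timezone(timedelta(hours=8))
local = datetime.fromtimestamp(epoch, tz=cst)
print(local.strftime("%d/%b/%Y %H:%M:%S"))  # 02/Aug/2013 05:24:09
```

The result is within a few seconds of the 05:24:12 that Nginx recorded, which is the expected gap between the client building the request and the server logging it.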
For ease of understanding and testing, I set everything up with a single Logstash configuration file.
Sample.conf
The configuration uses the urldecode and kv plugins, so you need to run ./plugin install contrib first to install Logstash's bundled contrib plugins.
input {
    file {
        # specify the log file; a wildcard such as *.log can also be used to read every log file in a directory
        path => "/home/vovo/access.log"
        start_position => "beginning"
    }
}
filter {
    grok {
        match => [ "message", "%{IPORHOST:client} (%{USER:ident}|-) (%{USER:auth}|-) \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:http_version})?|-)\" %{NUMBER:response} %{NUMBER:bytes} \"(%{QS:referrer}|-)\" \"(%{QS:agent}|-)\"" ]
        # Matching pattern: message is each log line that is read in. IPORHOST, USER, HTTPDATE, WORD, NOTSPACE, NUMBER and QS are regular-expression names defined in patterns/grok-patterns, and the pattern is written against the sample log above. A form like (%{USER:ident}|-) is a conditional match, equivalent to a ternary operator in a program. Double quotes " and square brackets [] inside the pattern must be escaped with a leading \.
    }
    kv {
        source => "request"
        field_split => "&?"
        value_split => "="
    }
    # Take the request (URL) field captured above and run key-value extraction on it; this requires the kv plugin. Given the field separator "&?" and the key-value separator "=", the fields and their values are collected automatically.
    urldecode {
        all_fields => true
    }
    # urldecode every field (so that Chinese text displays correctly)
}
output {
    #elasticsearch {
    #    host => 'localhost'
    #    protocol => "http"
    #}
    # send the collected data to elasticsearch
    stdout {
        codec => rubydebug
    }
    # print events to the screen
}
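To see what each filter stage contributes, the three steps can be mimicked outside Logstash. The following Python sketch is an illustration only, assuming a simplified single log line and naive splitting; it is not how Logstash itself is implemented:

```python
import re
from urllib.parse import unquote

# Stage 1, "grok": a named-group regex standing in for the pattern above.
LOG_RE = re.compile(
    r'(?P<client>\S+) (?P<ident>\S+) (?P<auth>\S+) \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+)(?: HTTP/(?P<http_version>\S+))?" '
    r'(?P<response>\d+) (?P<bytes>\S+)'
)

def kv(request):
    """Stage 2, "kv": split the query string into key-value pairs."""
    query = request.split("?", 1)[1] if "?" in request else ""
    pairs = (p.split("=", 1) for p in query.split("&") if "=" in p)
    return dict(pairs)

def urldecode_all(fields):
    """Stage 3, "urldecode" with all_fields => true: percent-decode values."""
    return {k: unquote(v) for k, v in fields.items()}

line = ('115.182.31.8 - - [02/Aug/2013:05:24:12 +0800] '
        '"GET /v2/get?key=0b0c1c5523aa40c3a5dcde4402947693&appid=153'
        '&appname=%e6%96%97%e5%9c%b0%e4%b8%bb&device=gt-n7100 HTTP/1.1" 200 67')

event = LOG_RE.match(line).groupdict()             # grok stage
event.update(urldecode_all(kv(event["request"])))  # kv + urldecode stages
print(event["verb"], event["response"], event["appname"])  # GET 200 斗地主
```

Note how the percent-encoded appname comes out as readable Chinese after the decode stage, which is exactly why all_fields => true is set in the configuration.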