With ELK you can implement the following functions:
- query log details by keyword
- monitor system operation status
- statistical analysis, such as the number of calls to an interface, execution time, success rate, etc.
- automatically trigger message notifications for abnormal data
- log-based data mining
In other words, ELK can implement the basic functions of Splunk.
Splunk is an engine for machine data. With Splunk you can collect, index, and harness the fast-moving machine data generated by all your applications, servers, and devices (physical, virtual, and cloud), and search and analyze all real-time and historical data from one place. Processing machine data with Splunk lets you resolve issues and investigate security incidents in minutes rather than hours or days, monitor your end-to-end infrastructure to avoid service degradation or outages, meet compliance requirements at lower cost, correlate and analyze complex events spanning multiple systems, and gain new levels of operational visibility and IT and business intelligence.
First, the preparation tools:
1. Elasticsearch: an open source, distributed, RESTful search engine built on Lucene. Designed for cloud computing, it provides real-time search and is stable, reliable, fast, and easy to install and use.
Elasticsearch 1.4.2: http://www.elasticsearch.org/download/
2. Logstash: a fully open source tool that collects, parses, and stores your logs for later use (e.g., searching). Logstash also ships with a web interface for searching and displaying all collected logs.
logstash-1.4.2.tar.gz: http://www.elasticsearch.org/overview/logstash/download/
3. Kibana: a web UI that provides log analysis for Elasticsearch; it can be used to efficiently search, visualize, and analyze logs.
Kibana 4 Beta 3: http://www.elasticsearch.org/overview/kibana/installation/
(Note: the versions must be consistent, otherwise problems may occur.)
Second, installation
1. Installing Elasticsearch
① Decompress elasticsearch-1.4.2.tar.gz
tar zxvf elasticsearch-1.4.2.tar.gz
② Enter the elasticsearch-1.4.2 folder
cd elasticsearch-1.4.2
③ Start Elasticsearch
Run bin/elasticsearch on Unix, or bin/elasticsearch.bat on Windows
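Once the node has started, it answers HTTP requests on port 9200, and the cluster health endpoint makes a quick sanity check. A minimal Python sketch (the host and port assume a default local install):

```python
import http.client
import json

def es_cluster_health(host="localhost", port=9200):
    """GET /_cluster/health from an Elasticsearch node and return the JSON body."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    try:
        conn.request("GET", "/_cluster/health")
        resp = conn.getresponse()
        return json.loads(resp.read().decode("utf-8"))
    finally:
        conn.close()
```

A "green" or "yellow" status in the returned document means the node is up and reachable.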
2. Installing Logstash
① Decompress logstash-1.4.2.tar.gz
tar zxvf logstash-1.4.2.tar.gz
② Enter logstash-1.4.2
cd logstash-1.4.2
③ Create a configuration file, logstash-syslog.conf, for capturing the system log
mkdir conf
vim conf/logstash-syslog.conf
Listen for messages on port 5000; the content of logstash-syslog.conf is as follows:
input {
  tcp {
    port => 5000
    type => syslog
  }
  udp {
    port => 5000
    type => syslog
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
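The grok filter above is essentially a regular expression with named capture groups. A rough Python sketch, using simplified stand-ins for the grok pattern library and a made-up sample line:

```python
import re

# Simplified stand-ins for the grok patterns used in the filter above
# (SYSLOGTIMESTAMP, SYSLOGHOST, DATA, POSINT, GREEDYDATA); the real
# patterns are more permissive, but the field names match the config.
SYSLOG_RE = re.compile(
    r"(?P<syslog_timestamp>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) "
    r"(?P<syslog_hostname>\S+) "
    r"(?P<syslog_program>[^\[:]+)"
    r"(?:\[(?P<syslog_pid>\d+)\])?: "
    r"(?P<syslog_message>.*)"
)

# A made-up sample line in the same shape as the logs shown later.
line = "Dec 23 14:30:01 louis CRON[619]: (www-data) CMD (run-parts /etc/cron.hourly)"
fields = SYSLOG_RE.match(line).groupdict()
```

Each named group becomes a field on the event, which is what the later filters (syslog_pri, date) operate on.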
④ Start Logstash to capture the system log
./bin/logstash -f conf/logstash-syslog.conf
⑤ Send system logs to port 5000
telnet localhost 5000
Dec 12:11:43 louis postfix/smtpd[31499]: connect from unknown[95.75.93.154]
Dec 14:42:56 louis named[16000]: client 199.48.164.7#64817: query (cache) 'amsterdamboothuren.com/MX/IN' denied
Dec 14:30:01 louis CRON[619]: (www-data) CMD (php /usr/share/cacti/site/poller.php >/dev/null 2>/var/log/cacti/poller-error.log)
Dec 18:28:06 louis rsyslogd: [origin software="rsyslogd" swVersion="4.2.0" x-pid="2253" x-info="http://www.rsyslog.com"] rsyslogd was HUPed, type 'lightweight'.
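Instead of typing lines into telnet, the same logs can be sent programmatically. A minimal Python sketch (the default host, port, and the sample line assume the tcp input configured above):

```python
import socket

def send_syslog_line(line, host="localhost", port=5000):
    """Send one raw log line to a TCP listener, e.g. the Logstash tcp
    input on port 5000 configured above."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(line.encode("utf-8") + b"\n")
```

Each call delivers one newline-terminated line, which Logstash treats as one event.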
⑥ In the console of the running Logstash instance, the logs collected from port 5000 produce output as shown:
3. Installing Kibana
① Decompress kibana-4.0.0-beta3.tar.gz
tar zxvf kibana-4.0.0-beta3.tar.gz
② Enter kibana-4.0.0-beta3
cd kibana-4.0.0-beta3
③ Start Kibana
./bin/kibana
④ Open the Kibana web interface
http://192.168.14.136:5601 or http://localhost:5601
⑤ Search for mozi*
At this point, the ELK stack has been set up.
Third, collecting Linux system logs through Logstash
1. Create logstash-localsyslog.conf
vim conf/logstash-localsyslog.conf
Add the following to logstash-localsyslog.conf and save:
input {
  file {
    type => "syslog"
    # wildcards work here :)
    path => [ "/var/log/messages", "/var/log/syslog", "/var/log/*.log" ]
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
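The wildcard in the path option is a shell-style glob. Python's fnmatch applies the same matching rule, which sketches which files the third pattern would pick up (the file names here are just examples):

```python
from fnmatch import fnmatch

# The last entry of path above, "/var/log/*.log", is a shell-style glob.
# Note one difference: unlike the Ruby glob Logstash uses, fnmatch's "*"
# also crosses "/" boundaries, so this is only an approximation.
pattern = "/var/log/*.log"

candidates = ["/var/log/kern.log", "/var/log/syslog", "/var/log/dpkg.log"]
matches = [p for p in candidates if fnmatch(p, pattern)]
```

Here /var/log/syslog is not matched by the wildcard entry, which is why it is listed explicitly in the config.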
2. Start Logstash to collect the local Linux system logs
./bin/logstash -f conf/logstash-localsyslog.conf
3. Query through Kibana
http://192.168.14.136:5601
4. In Kibana you can select a time period to query, for example the data of the past 30 minutes.
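Under the hood, such a time-period query is just a range filter on the @timestamp field. A Python sketch of the general shape of that filter (the exact request body Kibana builds may differ):

```python
from datetime import datetime, timedelta

def last_n_minutes_filter(minutes=30, now=None):
    """Build an Elasticsearch range filter on @timestamp covering the
    past `minutes` minutes, roughly what a time picker generates."""
    now = now or datetime.utcnow()
    start = now - timedelta(minutes=minutes)
    fmt = "%Y-%m-%dT%H:%M:%S"
    return {"range": {"@timestamp": {"gte": start.strftime(fmt),
                                     "lte": now.strftime(fmt)}}}
```

The returned dict can be embedded in the filter section of an Elasticsearch query body.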