Elasticsearch + Kibana + Logstash: Building a Log Platform

Source: Internet
Author: User
Tags: geoip, kibana, logstash

Building a Large Log Platform


Java Environment Deployment
There are many tutorials for this on the web, so here we just verify the installation:

java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
Setting up Elasticsearch
curl -O https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.5.1.tar.gz
tar zxvf elasticsearch-1.5.1.tar.gz
cd elasticsearch-1.5.1/
./bin/elasticsearch

ES needs hardly any configuration here; the defaults basically meet our requirements ...
Setting up Logstash: initial run
curl -O http://download.elastic.co/logstash/logstash/logstash-1.5.1.tar.gz

Now you should have a file called logstash-1.5.1.tar.gz. Let's unpack it:

tar zxvf logstash-1.5.1.tar.gz
cd logstash-1.5.1

Now let's run it:
bin/logstash -e 'input { stdin { } } output { stdout { } }'

We can now enter some characters at the command line, and then we will see the output of the Logstash:
Hello World
2015-06-17T01:22:14.405+1000 0.0.0.0 Hello World

OK, that's kind of interesting ... In the example above we defined an input called "stdin" and an output called "stdout"; whatever characters we type, Logstash returns them in a structured format. Note that we used the -e parameter on the command line, which lets Logstash accept settings directly from the command line. This is especially handy for quickly and repeatedly testing whether a configuration is correct, without writing a config file.
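As an aside, the string passed to -e is just an inline version of a normal configuration file; the same pipeline written out as a file (the name sample.conf here is hypothetical) would be:

```
input {
  stdin { }
}

output {
  stdout { }
}
```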
Let's try a more interesting example. First exit the running Logstash with Ctrl-C, then re-run it with the following command:

bin/logstash -e 'input { stdin { } } output { stdout { codec => rubydebug } }'

We'll enter some characters again; this time, typing "Goodnight Moon" produces:

Goodnight Moon
{
       "message" => "Goodnight Moon",
    "@timestamp" => "2013-11-20T23:48:05.335Z",
      "@version" => "1",
          "host" => "my-laptop"
}

In the example above, we changed how Logstash presents its output by reconfiguring the "stdout" output (adding the "codec" parameter). Similarly, by adding or modifying inputs, outputs, and filters in your configuration file, you can reshape log data in almost any way, making it easy to tailor a more sensible storage format for querying.

Feeding data into Elasticsearch

The steps above built Logstash successfully. Next we add a Logstash configuration file and start Logstash from it, so that the data goes into ES for display.

1. Add logs.conf under the /root/config/ directory:
input {
  file {
    type => "all"
    path => "/root/tomcat7/logs/catalina.out"
  }
  file {
    type => "access"
    path => "/root/tomcat7/logs/access.log"
  }
}

filter {
  multiline {
    pattern => "^[^\["
    what => "previous"
  }
  if [type] == "access" {
    grok {
      pattern => "(?<request_info>{.*}$)"
    }
    json {
      source => "request_info"
    }
    geoip {
      source => "client_ip"
      fields => ["country_name", "region_name", "city_name", "real_region_name", "latitude", "longitude"]
      remove_field => ["[geoip][longitude]", "[geoip][latitude]", "location", "region_name"]
    }
    useragent {
      source => "user_agent"
      prefix => "useragent_"
      remove_field => ["useragent_device", "useragent_major", "useragent_minor", "useragent_patch", "useragent_os", "useragent_os_major", "useragent_os_minor"]
    }
  } else if [type] == "all" {
    grok {
      pattern => "\[(?<level>\w*).*\] (?<datetime>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d{3})\s"
    }
  }
  mutate {
    remove_field => ["request_info", "@version", "tags"]
    remove_tag => ["_grokparsefailure"]
    replace => ["host", "GD1_PRD_YOWOO_TOMCAT4"]
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    host => "localhost"
    index => "logstash-%{type}-%{+YYYY.MM.dd}"
    index_type => "%{type}"
  }
}
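To sanity-check two pieces of this configuration offline, here is a short Python sketch: it runs the grok pattern used for the "all" type against a made-up Log4j-style line (Python writes named groups as (?P<name>...) where grok writes (?<name>...)), and expands the daily index name that the elasticsearch output would produce. The sample line and the timestamp are invented for illustration.

```python
import re
from datetime import datetime

# Same named-group regex as the grok pattern for type "all" above,
# rewritten with Python's (?P<name>...) named-group syntax.
GROK_ALL = r"\[(?P<level>\w*).*\] (?P<datetime>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d{3})\s"

# A made-up line in the shape of the Log4j ConversionPattern shown later.
sample = "[DEBUG] 2015-06-17 10:22:14,405 com.example.Service run - request handled"

m = re.search(GROK_ALL, sample)
print(m.group("level"))     # DEBUG
print(m.group("datetime"))  # 2015-06-17 10:22:14,405

# The index name "logstash-%{type}-%{+YYYY.MM.dd}" expands per event, e.g.:
ts = datetime(2015, 6, 17)
index = "logstash-%s-%s" % ("access", ts.strftime("%Y.%m.%d"))
print(index)                # logstash-access-2015.06.17
```

A daily index per type keeps the ES indices small and makes it cheap to drop old days.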

2. Start Logstash from the configuration file:

sh logstash -f /root/config/logs.conf

3. The configuration file above already points at the Tomcat logs: "all" is Tomcat's own log, and "access" is the log our application writes through Log4j, configured in log4j.xml:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
  <!-- all log for console -->
  <appender name="console" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="[%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} %l %M - %m%n"/>
    </layout>
  </appender>
  <!-- access log -->
  <appender name="access" class="org.apache.log4j.DailyRollingFileAppender">
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="[%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} - %m%n"/>
    </layout>
    <param name="Append" value="true"/>
    <param name="File" value="/root/tomcat7/logs/access.log"/>
    <param name="DatePattern" value="'.'yyyy-MM-dd'.'"/>
    <filter class="com.lives.platform.common.log.AccessLogFilter"/>
  </appender>
  <root>
    <priority value="debug"/>
    <appender-ref ref="console"/>
    <appender-ref ref="access"/>
  </root>
</log4j:configuration>
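Every line emitted by the ConversionPattern above starts with "[%-5p]", i.e. with a literal "[". That is what the multiline filter in logs.conf exploits: a line that does not start with "[" (a stack trace line, for example) is glued onto the previous event. A small Python sketch of that logic, with invented sample lines:

```python
import re

# Mirrors multiline { pattern => "^[^\[" what => "previous" }:
# a line NOT starting with "[" continues the previous event.
CONTINUATION = re.compile(r"^[^\[]")

lines = [
    "[ERROR] 2015-06-17 10:22:15,001 com.example.Service run - boom",
    "java.lang.NullPointerException",                   # continuation
    "    at com.example.Service.run(Service.java:42)",  # continuation
    "[INFO ] 2015-06-17 10:22:16,002 com.example.Service run - recovered",
]

events = []
for line in lines:
    if CONTINUATION.match(line) and events:
        events[-1] += "\n" + line   # merge into the previous event
    else:
        events.append(line)         # a leading "[" starts a new event

print(len(events))  # 2 events: the merged stack trace and the INFO line
```

This is why the multi-line stack trace ends up as one document in ES instead of dozens of fragments.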

Log4j.xml configures a daily rolling log file, and Logstash listens on the path of the generated log file. I have another post on rolling logs in the log category of my blog; go take a look ...

Setting up Kibana

Download Kibana:
wget https://download.elastic.co/kibana/kibana/kibana-4.1.0-linux-x64.tar.gz

Just unpack it and it's ready to go ...
To have Kibana read its data from ES, go into the config directory under the Kibana directory and modify the kibana.yml file, specifying the ES access address. (PS: in earlier Kibana versions it was config.js that had to be modified; don't let outdated posts mislead you ...)
# Kibana is served by a back end server. This controls which port to use.
port: 5601

# The host to bind the server to.
host: "0.0.0.0"

# The Elasticsearch instance to use for all your queries.
elasticsearch_url: "http://localhost:9200"

Finally, open the web page to view the result.




