Open Source Real-Time Log Analytics: ELK Platform Deployment


Originally published 2015-07-21 17:13:10 on 51CTO (recommended blog post): http://baidu.blog.51cto.com/71938/1676798


Logs mainly include system logs, application logs, and security logs. Operations staff and developers can use them to learn about a server's hardware and software, check for configuration errors, and trace the causes of failures. Analyzing logs regularly also reveals the server's load, performance, and security posture, so that problems can be corrected promptly.

Typically, logs are scattered across many different devices. If you manage dozens or hundreds of servers, the traditional method of logging in to each machine in turn is cumbersome and inefficient. Centralized log management is essential, for example using open source syslog to aggregate the logs collected on all servers.

Once logs are centralized, counting and searching them becomes the next headache. We generally use Linux commands such as grep, awk, and wc for retrieval and statistics, but for more demanding queries, sorting, and statistics, and across a large number of machines, this approach is far too laborious.
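For a concrete sense of that traditional approach, here is a rough sketch; the sample log file and its contents are made up purely for illustration:

```shell
# Build a small sample log (contents are illustrative)
cat > sample.log <<'EOF'
Jul 15 10:01:02 web1 sshd[100]: Failed password for root
Jul 15 10:01:05 web2 sshd[101]: Accepted password for admin
Jul 15 10:02:11 web1 sshd[102]: Failed password for root
EOF

# Count failed logins with grep and wc
grep 'Failed password' sample.log | wc -l

# Tally events per host (field 4) with awk
awk '{ count[$4]++ } END { for (h in count) print h, count[h] }' sample.log
```

This works for one file on one box; the point of the rest of the post is that it does not scale to a fleet.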

The open source real-time log analysis platform ELK solves all of the problems above. ELK consists of three open source tools: Elasticsearch, Logstash, and Kibana. Official website: https://www.elastic.co/products

- Elasticsearch is an open source distributed search engine. Its features include: distributed operation, zero configuration, automatic discovery, automatic index sharding, index replication, a RESTful interface, multiple data sources, and automatic search load balancing.

- Logstash is a fully open source tool that collects, filters, and stores your logs for later use (for example, searching).

- Kibana is also an open source, free tool. It provides a friendly web interface for the log analytics delivered by Logstash and Elasticsearch, helping you summarize, analyze, and search important log data.

The operating principle is as follows:

Deploy Logstash on every service whose logs need collecting. Acting as a Logstash agent (Logstash shipper), it monitors, filters, and collects the logs, then sends the filtered content to a Logstash indexer. The indexer aggregates the logs into the full-text search service Elasticsearch, and Kibana combines custom Elasticsearch searches into web page presentations.

The ELK platform deployment process is as follows:

(1) Install the Logstash dependency: the JDK

Logstash depends on the Java runtime environment; Logstash 1.5 and later requires Java 7 or newer, and the latest version of Java is recommended. Since we only run Java programs rather than develop them, downloading the JRE is enough. First download the latest JRE from Oracle: http://www.oracle.com/technetwork/java/javase/downloads/jre8-downloads-2133155.html

Various versions are available; when downloading, choose the one that matches your machine's operating environment. I am using RHEL 6.5 x86_64, so I download the Linux x64 version. On Linux, execute the following command to download it:

# wget http://download.oracle.com/otn-pub/java/jdk/8u45-b14/jdk-8u45-linux-x64.tar.gz

The JDK is easy to install, just unzip the downloaded package to the appropriate directory.

# mkdir /usr/local/java
# tar -zxf jdk-8u45-linux-x64.tar.gz -C /usr/local/java/

Set the environment variables for the JDK as follows:

# tail -3 ~/.bash_profile
export JAVA_HOME=/usr/local/java/jdk1.8.0_45
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$CLASSPATH
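A quick way to check that the variables are wired up correctly (same paths as this post; nothing here requires the JDK to actually be unpacked yet):

```shell
# Apply the same exports as the profile snippet above
export JAVA_HOME=/usr/local/java/jdk1.8.0_45
export PATH="$PATH:$JAVA_HOME/bin"
export CLASSPATH=".:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar"

# Verify that JAVA_HOME/bin ended up on PATH
case "$PATH" in
  *"$JAVA_HOME/bin"*) echo "PATH ok" ;;
  *)                  echo "PATH missing $JAVA_HOME/bin" ;;
esac
```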

Execute the java -version command at the shell prompt. Output like the following indicates the installation succeeded:

"1.8.0_45" Java (TM) SE Runtime Environment (1.8.  0_45-B14) Java HotSpot (64-bit Server VM (25.45-b02,mixed mode)    

(2) Install Logstash

Download and install Logstash. Installing Logstash only requires unzipping it into the target directory, for example /usr/local:

# wget https://download.elastic.co/logstash/logstash/logstash-1.5.2.tar.gz
# tar -zxf logstash-1.5.2.tar.gz -C /usr/local/

After the installation is complete, run the following command:

# /usr/local/logstash-1.5.2/bin/logstash -e 'output { stdout {} }'
Logstash startup completed
hello world!
2015-07-15T03::938Z noc.vfast.com hello world!

As you can see, whatever we type, Logstash outputs it back in a certain format. The -e flag lets Logstash accept settings directly from the command line, which is especially handy for quickly and repeatedly testing whether a configuration is correct without writing a config file. Press Ctrl+C to exit the running Logstash.

Specifying the configuration on the command line with -e is very common, but it gets unwieldy once the configuration grows. In that case, we first create a simple configuration file and tell Logstash to use it. For example, create a basic test configuration file named logstash-simple.conf in the Logstash installation directory with the following contents:

# cat logstash-simple.conf
input { stdin { } }
output {
   stdout { codec => rubydebug }
}

Logstash uses input and output to define how logs are collected and emitted. Here, input defines an input named stdin and output defines an output named stdout. Whatever characters we enter, Logstash returns them in a structured format; the stdout output uses the codec parameter to specify Logstash's output format (here, rubydebug).

Use Logstash's -f flag to read the configuration file, then run the following to test:

# echo "' Date ' Hello World ' Thu Jul1604:06:48 CST 2015 Hello world#/usr/local/logstash-1.5.2/bin/logstash Agent-f Logstash-simple.conflogstash startup Completedtue Jul 14 18:07:07 EDT 2015 Hello World  #该行是执行echo "' Date ' Hello World ' after output, paste directly to that location { "message" =  "Tue Jul 18:07:07 EDT helloWorld",  "@version" =  "1",  "@timestamp" and " Span class= "hljs-string" > "2015-07-14t22:07:28.284z",  "host" =  "noc.vfast.com"}             

(3) Install Elasticsearch

After downloading Elasticsearch, unzip it to the target directory to complete the installation:

# tar -zxf elasticsearch-1.6.0.tar.gz -C /usr/local/

Start Elasticsearch

/usr/local/elasticsearch-1.6.0/bin/elasticsearch 

If you are connected to the Linux host remotely and want Elasticsearch to keep running in the background, execute the following command instead:

# nohup /usr/local/elasticsearch-1.6.0/bin/elasticsearch > nohup.out &

Confirm that Elasticsearch's port 9200 is listening, which indicates Elasticsearch is running successfully:

# netstat -anp | grep :9200
tcp        0      0 :::9200          :::*          LISTEN      3362/java

Next, create a test file named logstash-es-simple.conf in the Logstash installation directory to test Logstash with Elasticsearch as its back end. The file defines both stdout and elasticsearch as outputs, so that "multiple outputs" display results on the screen and also write them to Elasticsearch.

# cat logstash-es-simple.conf
input { stdin { } }
output {
   elasticsearch { host => "localhost" }
   stdout { codec => rubydebug }
}

Execute the following command

# /usr/local/logstash-1.5.2/bin/logstash agent -f logstash-es-simple.conf
... Logstash startup completed
hello logstash
{
       "message" => "hello logstash",
      "@version" => "1",
          "host" => "noc.vfast.com"
}

We can use the curl command to send a request and check whether Elasticsearch received the data:

# curl 'http://localhost:9200/_search?pretty'
{
  "took" : 58,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 1,
    "max_score" : 1.0,
    "hits" : [ {
      "_index" : "logstash-2015.07.15",
      "_type" : "logs",
      "_id" : "AU6TWIIXXDXYHYSMYTKP",
      "_score" : 1.0,
      "_source" : {
        "message" : "hello logstash",
        "@version" : "1",
        "@timestamp" : "2015-07-15T20:13:55.199Z",
        "host" : "noc.vfast.com"
      }
    } ]
  }
}
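If you only need one field from a response like this and have no JSON tooling installed, plain shell text processing is enough. The snippet below runs against a trimmed, stand-in copy of the response saved to a file (the JSON is illustrative, not a real server reply):

```shell
# A trimmed stand-in for the _search response
cat > response.json <<'EOF'
{
  "took" : 58,
  "timed_out" : false,
  "hits" : {
    "total" : 1,
    "max_score" : 1.0
  }
}
EOF

# Pull out hits.total by grepping its line and stripping non-digits
total=$(grep '"total"' response.json | tr -dc '0-9')
echo "$total"    # 1
```

This is fragile compared with a real JSON parser, but it is handy for quick one-off checks on a bare server.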

At this point, you have successfully used Elasticsearch and Logstash to collect log data.

(4) Install the Elasticsearch plugin

The elasticsearch-kopf plugin lets you browse and query the data in Elasticsearch. To install elasticsearch-kopf, simply execute the following commands in the Elasticsearch installation directory:

# cd /usr/local/elasticsearch-1.6.0/
# ./bin/plugin -install lmenezes/elasticsearch-kopf

After the installation completes, kopf appears in the plugins directory:

# ls plugins/
kopf

Browse the data stored in Elasticsearch by visiting http://10.1.1.188:9200/_plugin/kopf in a browser.

(5) Install Kibana

After downloading Kibana, unzip it to the target directory to complete the installation:

# tar -zxf kibana-4.1.1-linux-x64.tar.gz -C /usr/local/

Start Kibana

/usr/local/kibana-4.1.1-linux-x64/bin/kibana 

Access Kibana at http://kibanaServerIP:5601. After logging in, first configure an index. By default, Kibana points at the data in Elasticsearch, uses the default logstash-* index name pattern, and is time-based; just click "Create".

When the following interface appears, index creation is complete.

Click "Discover" to search and browse the data in Elasticsearch; by default it shows the last 15 minutes of data. You can also select a custom time range.

At this point, your ELK platform installation and deployment is complete.

(6) Configure Logstash as an indexer

Configure Logstash as an indexer so that it stores log data in Elasticsearch; here it mainly indexes the local system log.

# cat /usr/local/logstash-1.5.2/logstash-indexer.conf
input {
  file {
    type => "syslog"
    path => [ "/var/log/syslog" ]
  }
  syslog {
    type => "syslog"
    port => "5544"
  }
}
output {
  elasticsearch { host => "localhost" }
}
# /usr/local/logstash-1.5.2/bin/logstash -f logstash-indexer.conf

Use the echo command to simulate writing to the log:

"' Date ' Uniqlo video" >>/var/log/messages 

Refresh Kibana, and the latest test data appears in the browser.

With that, the ELK platform deployment and basic tests are complete.

