Building an ELK Log Analysis System on Ubuntu 14.04 (Elasticsearch + Logstash + Kibana)


System administrators and developers can use logs to learn about a server's hardware and software state, and to find configuration errors and the reasons they occur. Analyzing logs regularly gives insight into server load, performance, and security, so that problems can be corrected promptly. The value of logs is self-evident, but when a large volume of logs is distributed across many machines, inspecting them by hand is tedious. A log analysis system is therefore very useful for operations staff.

The open-source real-time log analysis platform ELK provides log monitoring and analysis. ELK consists of three open-source tools: Elasticsearch, Logstash, and Kibana. Official website: https://www.elastic.co/products

Elasticsearch is an open-source distributed search engine. Its features include: distributed operation, zero configuration, automatic discovery, automatic index sharding, index replication, a RESTful interface, multiple data sources, and automatic search load balancing.

Logstash is a fully open-source tool that collects, parses, and stores your logs for later use (for example, searching).

Kibana is also an open-source, free tool. It provides a friendly web interface for Logstash and Elasticsearch that helps you summarize, analyze, and search important log data.

The ELK workflow is as follows:

Deploy Logstash on every server whose logs need to be collected. Running as an agent (Logstash shipper), it monitors and filters the logs and sends the filtered content to a Logstash indexer. The indexer aggregates the logs and forwards them to the full-text search service Elasticsearch. You can then build custom searches in Elasticsearch and use Kibana to present those searches as web pages; a minimal shipper/indexer sketch is shown below.
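As a rough illustration only (this is not the setup used later in the article, and the indexer host name and port here are hypothetical), a shipper and an indexer could be wired together over TCP like this:

# shipper.conf -- runs on each server that produces logs
input {
    file { path => "/var/log/syslog" }          # watch the local system log
}
output {
    tcp {
        host  => "indexer.example.com"          # hypothetical indexer host
        port  => 5000
        codec => json_lines                     # one JSON event per line
    }
}

# indexer.conf -- runs on the central log server
input {
    tcp { port => 5000 codec => json_lines }    # receive events from the shippers
}
output {
    elasticsearch { hosts => "localhost" }      # hand events to Elasticsearch
}

In practice a message queue (for example Redis) is often placed between shipper and indexer; the rest of this article runs both roles on a single machine.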

ELK Platform Setup:

The platform is built on the Ubuntu 14.04 64-bit operating system.

Download the ELK installation packages: https://www.elastic.co/downloads/

Download the JDK installation package: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

Once all packages have been downloaded, proceed with the installation.

1. Install the dependency JDK 8:

# sudo mkdir /usr/lib/jvm
# tar xvzf jdk-8u91-linux-x64.tar.gz -C /usr/lib/jvm/
# vim ~/.bashrc

Append the following to the end of the file:

export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_91
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH

Then execute:

# source ~/.bashrc

Run java -version and java; if the expected output appears, the installation is complete.
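For reference, a successful installation reports the installed JDK version, roughly like the following (the exact build numbers may differ):

# java -version
java version "1.8.0_91"
Java(TM) SE Runtime Environment (build 1.8.0_91-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.91-b14, mixed mode)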

2. Install Logstash:

# tar xvzf logstash-2.3.3.tar.gz

Create the logstash-test.conf configuration file in the logstash-2.3.3 directory as follows:

# cat logstash-test.conf
input {
    stdin { }
}
output {
    stdout { codec => rubydebug }
}

Logstash uses input and output sections to define how logs are collected and emitted. In this example, input defines an input called stdin and output defines an output called stdout. Whatever characters we type, Logstash returns them in a structured format; the codec parameter on stdout specifies the Logstash output format (here, rubydebug).

Start with the following command:

# ./bin/logstash agent -f logstash-test.conf

Once it has started, whatever you type is echoed back in the console. If you enter "hehe", output like the following appears:
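With the rubydebug codec, the echoed event looks roughly like this (the timestamp and host name below are placeholders):

{
       "message" => "hehe",
      "@version" => "1",
    "@timestamp" => "2016-06-15T08:30:00.000Z",
          "host" => "ubuntu"
}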

This indicates that the installation was successful. Use Ctrl+C to exit the process.

3. Install Elasticsearch:

# tar xvzf elasticsearch-2.3.3.tar.gz

Modify the configuration file to allow remote access:

# cd elasticsearch-2.3.3/config
# vim elasticsearch.yml

Modify the network section as follows:
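To make the node reachable from other hosts, the host setting in elasticsearch.yml is typically changed to listen on all interfaces; 0.0.0.0 below is one common choice:

network.host: 0.0.0.0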

Start Elasticsearch:

# ./bin/elasticsearch -d    # -d starts it in the background

Visit http://<elasticsearch-ip>:9200
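A healthy node answers with a JSON status document, abridged here; the node name and build details will differ:

{
  "name" : "Node-1",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.3.3",
    ...
  },
  "tagline" : "You Know, for Search"
}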

If output like the above appears, the installation was successful.

Install the Elasticsearch plugin head:

# cd elasticsearch-2.3.3
# ./bin/plugin install mobz/elasticsearch-head

When the installation completes, visit http://<elasticsearch-ip>:9200/_plugin/head. If the head interface loads, the plugin was installed successfully.

Test whether Elasticsearch and Logstash can be linked successfully:

In the logstash-2.3.3 installation directory, create a test file logstash-es-simple.conf to test Logstash with Elasticsearch as its back end. It defines both stdout and elasticsearch as outputs; this "multiple output" ensures that events are displayed on the screen and also written to Elasticsearch. The file reads as follows:

# cat logstash-es-simple.conf
input {
    stdin { }
}
output {
    elasticsearch { hosts => "localhost" }   # hosts is the Elasticsearch host; here both run on the same machine
    stdout { codec => rubydebug }
}

Start:

# ./bin/logstash agent -f logstash-es-simple.conf

Open http://<elasticsearch-ip>:9200/_search?pretty, which shows the indexed events, as follows:
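A typical (abridged) response looks roughly like this; the index name, document ID, and timestamp below are placeholders:

{
  "took" : 3,
  "timed_out" : false,
  "_shards" : { "total" : 5, "successful" : 5, "failed" : 0 },
  "hits" : {
    "total" : 1,
    "max_score" : 1.0,
    "hits" : [ {
      "_index" : "logstash-2016.06.15",
      "_type" : "logs",
      "_id" : "AVUx...",
      "_score" : 1.0,
      "_source" : {
        "message" : "hehe",
        "@version" : "1",
        "@timestamp" : "2016-06-15T08:30:00.000Z",
        "host" : "ubuntu"
      }
    } ]
  }
}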

This indicates that the link was successful. At this point, you have used Elasticsearch and Logstash to collect log data.

4. Install Kibana:

# tar xvzf kibana-4.5.1-linux-x64.tar.gz

Start:

# cd kibana-4.5.1-linux-x64
# ./bin/kibana

Once Kibana has started, access http://<ip>:5601. If the Kibana interface loads, the installation was successful.

After opening Kibana, first configure an index pattern. By default, Kibana points at Elasticsearch and uses the default logstash-* index name, based on a time field; simply click "Create".

After the index pattern has been created, click the "Discover" tab to search and browse the data in Elasticsearch. By default it shows the last 15 minutes of data; the time range can also be customized.

The ELK platform is now deployed.

5. Configure Logstash as an indexer:

Configure Logstash as an indexer and store the log data it collects in Elasticsearch. In this example, the local system log is indexed.

Create the configuration file logstash-indexer.conf in the Logstash working directory as follows:

input {
    file {
        type => "syslog"
        path => [ "/var/log/messages", "/var/log/syslog" ]
    }
    syslog {
        type => "syslog"
        port => "5544"
    }
}
output {
    stdout { codec => rubydebug }
    elasticsearch { hosts => "localhost" }
}

After Logstash is started, echo is used to simulate a write to the log; the new event then appears, for example:
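A minimal way to run this test (the message text is arbitrary, and /var/log/syslog is used because it exists by default on Ubuntu; run the echo in a second terminal since Logstash stays in the foreground):

# ./bin/logstash agent -f logstash-indexer.conf
# echo "`date` hello-elk-test" | sudo tee -a /var/log/syslog

The new line should show up in the Logstash console in rubydebug format and be indexed into Elasticsearch.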

Checking Elasticsearch in the browser shows that the new log entry has been indexed, confirming that the log data is being synchronized successfully.

At this point, the ELK platform deployment and testing is complete.
