Building an ELK Log Analysis Platform: The Whole Process

Source: Internet
Author: User
Tags: date1, kibana, logstash

First, the background

When the production environment contains many servers, the logs of many business modules need to be viewed at any moment, so a centralized log analysis platform is needed.

Second, the environment

System: CentOS 6.5

JDK: 1.8

Elasticsearch-5.0.0

Logstash-5.0.0

Kibana-5.0.0

Third, installation

1. Installing the JDK

Download the JDK: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

This environment uses the 64-bit tar.gz package. Copy the installation package to the /usr/local directory on the server.

[root@localhost ~]# cd /usr/local/
[root@localhost local]# tar -xzvf jdk-8u111-linux-x64.tar.gz

Configure environment variables

[root@localhost local]# vim /etc/profile

Add the following to the end of the file (if the server needs to run multiple JDK versions, you can instead add these environment variables to the ELK startup scripts, so that ELK does not affect other systems):

JAVA_HOME=/usr/local/jdk1.8.0_111
JRE_HOME=/usr/local/jdk1.8.0_111/jre
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
PATH=$PATH:$JAVA_HOME/bin
export JAVA_HOME
export JRE_HOME

ulimit -u 4096

[root@localhost local]# source /etc/profile
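To confirm that the new environment takes effect, check the Java version; the exact build number in the output depends on the package that was downloaded:

[root@localhost local]# java -version
java version "1.8.0_111"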

Configure the limits-related parameters

[root@localhost local]# vim /etc/security/limits.conf
Add the following content:

* soft nproc 65536
* hard nproc 65536
* soft nofile 65536
* hard nofile 65536

Create a user to run ELK

[root@localhost local]# groupadd elk

[root@localhost local]# useradd -g elk elk

Create the ELK run directory

[root@localhost local]# mkdir /elk
[root@localhost local]# chown -R elk:elk /elk

To turn off the firewall:

[root@localhost ~]# iptables -F

All of the above is done as the root user.

2. Installing ELK

The following steps are performed as the elk user.

Log on to the server as the elk user.

Download the ELK installation packages from https://www.elastic.co/downloads, upload them to the server, and extract them with: tar -xzvf <package name>
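For example, if the three 5.0.0 packages are uploaded to /elk (the exact file names depend on what was downloaded; the Kibana package name includes the platform):

[elk@localhost ~]$ cd /elk
[elk@localhost elk]$ tar -xzvf elasticsearch-5.0.0.tar.gz
[elk@localhost elk]$ tar -xzvf logstash-5.0.0.tar.gz
[elk@localhost elk]$ tar -xzvf kibana-5.0.0-linux-x86_64.tar.gz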

Configure Elasticsearch

Modify the following content:
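The edited file is config/elasticsearch.yml under the Elasticsearch directory. The original screenshot is not reproduced here; a minimal sketch of the usual changes for this environment (the cluster name, node name, and data/log paths below are assumptions) is:

[elk@localhost elk]$ vim /elk/elasticsearch-5.0.0/config/elasticsearch.yml

cluster.name: my-elk                           # assumed cluster name
node.name: node-1                              # assumed node name
path.data: /elk/elasticsearch-5.0.0/data       # assumed data directory
path.logs: /elk/elasticsearch-5.0.0/logs       # assumed log directory
network.host: 192.168.10.169                   # the address used in the browser below
http.port: 9200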

Save and exit.

Start Elasticsearch
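Elasticsearch 5.x will not run as root, which is why the elk user was created. Assuming the paths above, a typical way to start it in the background is:

[elk@localhost elk]$ /elk/elasticsearch-5.0.0/bin/elasticsearch -d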

Check whether it started successfully.

Access it in a browser: http://192.168.10.169:9200
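The same check can be done from the command line; a running node returns a JSON document along these lines (the name, cluster name, and build details will differ):

[elk@localhost elk]$ curl http://192.168.10.169:9200
{
  "name" : "node-1",
  "cluster_name" : "my-elk",
  "version" : { "number" : "5.0.0", ... },
  "tagline" : "You Know, for Search"
}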

The Elasticsearch installation is complete.

Installing Logstash

In ELK, Logstash is responsible for collecting and filtering logs.

Write the Logstash configuration file; its parts are explained below, and a sketch of the full file follows the explanation.

Explanation:

The Logstash configuration file must contain three sections:

input{}: this section is responsible for collecting the logs; it can read them from a file, read them from Redis, or open a port so that the business system that produces the logs writes them directly to Logstash.

filter{}: this section is responsible for filtering the collected logs and extracting the fields defined for the log format, so that they can be displayed after filtering.

output{}: this section is responsible for sending the filtered logs to Elasticsearch, a file, Redis, and so on. A bare skeleton of the three sections is shown below.
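Put together, the skeleton of every Logstash configuration file is simply:

input {
  # where the logs come from: a file, Redis, an open TCP port, ...
}
filter {
  # how each raw line is parsed into named fields
}
output {
  # where the parsed events go: Elasticsearch, a file, Redis, ...
}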

This environment reads the logs from files. The business system produces logs in the following format:

[2016-11-05 00:00:03,731 INFO] [http-nio-8094-exec-10] [filter.LogRequestFilter] - /merchant/get-supply-detail.shtml, ip: 121.35.185.117, [device-dpi = 414*736, version = 3.6, device-os = iOS8.4.1, timestamp = 1478275204, bundle = apyq9watkk98v2ec, device-network = WiFi, token = 393e38694471483cb3686ec77babb496, device-model = iPhone, device-cpu = , sequence = 1478275204980, device-uuid = c52ff568-a447-4afe-8ae8-4c9a54ced10c, sign = 0966A15C090FA6725D8E3A14E9EF98DC, request = {
"supply-id": 192
}]
[2016-11-05 00:00:03,731 DEBUG] [http-nio-8094-exec-10] [filter.ValidateRequestFilter] - unsigned: bundle=apyq9watkk98v2ec&device-cpu=&device-dpi=414*736&device-model=iPhone&device-network=WiFi&device-os=iOS8.4.1&device-uuid=c52ff568-a447-4afe-8ae8-4c9a54ced10c&request={
"supply-id": 192

Output directly to Elasticsearch

This environment needs to handle the logs of two business systems.

type: identifies the log type; it is pushed to Elasticsearch together with the events, which makes classified searching in Kibana easier later. It is usually simply set to the business system's project name.

path: the path of the log file(s) to read.

This handles error entries in the log: lines that belong to an error and do not start a new log entry are appended to the previous message.

start_position => "beginning" means the file is read from the very beginning.

grok in filter{} uses regular expressions to parse the log. %{TIMESTAMP_ISO8601} is a built-in pattern that matches a timestamp such as 2016-11-05 00:00:03,731, and %{TIMESTAMP_ISO8601:date1} assigns the matched value to the field date1, which can then be seen in Kibana.

This environment uses two grok expressions: if a line does not match the first one, the second one is applied.

index defines the name under which the filtered logs are stored when they are pushed to Elasticsearch.

%{type} references the type field set in the input{} section. A sketch of a complete configuration built from these pieces follows.
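The configuration file itself is only described above, not reproduced. A minimal sketch that matches that description (the file paths, type names, multiline pattern, and exact grok expressions below are assumptions for illustration) could look like this:

input {
  file {
    type => "api-app"                      # assumed project name of the first business system
    path => "/var/log/api-app/*.log"       # assumed path of its log files
    start_position => "beginning"          # read from the very beginning of the file
    codec => multiline {                   # error lines that do not start a new log entry
      pattern => "^\["                     # are appended to the previous message
      negate => true
      what => "previous"
    }
  }
  file {
    type => "api-cxb"                      # assumed project name of the second business system
    path => "/var/log/api-cxb/*.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    # two patterns: if a line does not match the first, the second is tried
    match => { "message" => [
      "\[%{TIMESTAMP_ISO8601:date1} %{LOGLEVEL:level}\] \[%{DATA:thread}\] \[%{DATA:class}\] - %{GREEDYDATA:msg}",
      "\[%{TIMESTAMP_ISO8601:date1}%{GREEDYDATA:msg}"
    ] }
  }
}
output {
  elasticsearch {
    hosts => ["192.168.10.169:9200"]
    index => "%{type}-%{+YYYY.MM.dd}"      # index name built from the type set in input{}
  }
}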

Start Logstash
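Assuming the configuration above is saved as /elk/logstash-5.0.0/config/logstash.conf, Logstash can be started with something like:

[elk@localhost elk]$ nohup /elk/logstash-5.0.0/bin/logstash -f /elk/logstash-5.0.0/config/logstash.conf &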

If the output shows the pipeline starting without errors, Logstash started successfully.

Installing Kibana
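Edit config/kibana.yml in the Kibana directory; the original screenshot is not reproduced, but the relevant settings for this environment would be something like the following (the bind address is an assumption matching the Elasticsearch host above):

[elk@localhost elk]$ vim /elk/kibana-5.0.0-linux-x86_64/config/kibana.yml

server.port: 5601
server.host: "192.168.10.169"
elasticsearch.url: "http://192.168.10.169:9200"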

Save and exit.

Start Kibana
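A typical way to start it in the background, then open the UI in a browser:

[elk@localhost elk]$ nohup /elk/kibana-5.0.0-linux-x86_64/bin/kibana &

Access by browser: http://192.168.10.169:5601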

Here api-app-* and api-cxb-* match the index names defined earlier in the Logstash output; the * is a wildcard that stands for all indices with that prefix.

This shows the number of log entries collected in real time.

The red box marks the field defined by the filter rule above.

