Kibana + Logstash + Elasticsearch log query system

Source: Internet
Author: User
Tags: kibana, logstash

The purpose of this platform is to make log queries convenient for operations and development staff. Kibana is a free web interface for browsing logs, Logstash integrates various log collection plug-ins and is also an excellent tool for parsing logs with regular expressions, and Elasticsearch is an open-source search engine framework that supports cluster deployment.

 

1 Installation Requirements

1.1 Theoretical Topology

 

1.2 Installation Environment

1.2.1 Hardware Environment

192.168.50.62 (HP DL385 G7, RAM: 12 GB, CPU: AMD 6128, DISK: 4 x 146 GB SAS)

192.168.50.98 (HP DL385 G7, RAM: 12 GB, CPU: AMD 6128, DISK: 6 x 146 GB SAS)

192.168.10.42 (Xen VM, RAM: 8 GB, CPU: 4 cores, DISK: 100 GB)

1.2.2 Operating System

CentOS 5.6 x64

1.2.3 Basic Web Server Environment

Nginx + PHP (installation process omitted)

1.2.4 Software List

JDK 1.6.0_25

logstash-1.1.0-monolithic.jar

elasticsearch-0.18.7.zip

redis-2.4.12.tar.gz

Kibana

1.3 Download Locations

1.3.1 JDK

http://www.oracle.com/technetwork/java/javase/downloads/jdk-6u25-download-346242.html

1.3.2 Logstash

http://semicomplete.com/files/logstash/logstash-1.1.0-monolithic.jar

1.3.3 Elasticsearch

https://github.com/downloads/elasticsearch/elasticsearch/elasticsearch-0.18.7.zip

1.3.4 Kibana

http://github.com/rashidkpc/Kibana/tarball/master

2 Installation Steps

2.1 Download and install the JDK

Basic installation

wget http://download.oracle.com/otn-pub/java/jdk/6u25-b06/jdk-6u25-linux-x64.bin

sh jdk-6u25-linux-x64.bin

mkdir -p /usr/java

mv ./jdk1.6.0_25 /usr/java

ln -s /usr/java/jdk1.6.0_25 /usr/java/default

Edit the /etc/profile file and add the following lines:

export JAVA_HOME=/usr/java/default

export PATH=$JAVA_HOME/bin:$PATH

export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$CLASSPATH

Refresh the environment variables:

source /etc/profile
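After refreshing, it is worth confirming the variables actually landed in the environment. A minimal sanity check (it only re-applies the export lines above; java -version will work once the JDK really sits under /usr/java/default):

```shell
# Re-apply the profile additions in the current shell
export JAVA_HOME=/usr/java/default
export PATH=$JAVA_HOME/bin:$PATH

# The JDK's bin directory should now lead the search path
case "$PATH" in
    "$JAVA_HOME/bin":*) echo "PATH OK" ;;
    *)                  echo "PATH misconfigured" ;;
esac
# java -version should then report 1.6.0_25
```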

2.2 download and install Redis

wget http://redis.googlecode.com/files/redis-2.4.14.tar.gz

tar zxvf redis-2.4.14.tar.gz

cd redis-2.4.14

make -j24

make install

mkdir -p /data/redis

cd /data/redis/

mkdir {db,log,etc}
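Note that the last step relies on shell brace expansion, which breaks if spaces are inserted between the alternatives (mkdir {db, log, etc} would create oddly named directories instead). A quick sketch of the same step, using a temporary directory in place of /data/redis so it can be tried anywhere:

```shell
# Stand-in for /data/redis (scratch path, for demonstration only)
base=$(mktemp -d)
cd "$base"

# One mkdir call expands to: mkdir db log etc
mkdir {db,log,etc}

ls    # lists the three subdirectories: db, etc, log
```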

2.3 download and install Elasticsearch

cd /data/

mkdir -p elasticsearch && cd elasticsearch

wget --no-check-certificate https://github.com/downloads/elasticsearch/elasticsearch/elasticsearch-0.18.7.zip

unzip elasticsearch-0.18.7.zip

2.4 download and install Logstash

mkdir -p /data/logstash/ && cd /data/logstash

wget http://semicomplete.com/files/logstash/logstash-1.1.0-monolithic.jar

2.5 download and install Kibana

wget --no-check-certificate http://github.com/rashidkpc/Kibana/tarball/master

tar zxvf master

3 Configuration and Startup

3.1 Redis configuration and startup

3.1.1 Configuration file

vim /data/redis/etc/redis.conf

#----------------------------------------------------
# This is the config file for redis

pidfile /var/run/redis.pid
port 6379
timeout 0
loglevel verbose
logfile /data/redis/log/redis.log
databases 16

# save <seconds> <changes>: snapshot to disk when at least
# <changes> keys changed within <seconds>
save 900 1
save 300 10
save 60 10000

rdbcompression yes
dbfilename dump.rdb
dir /data/redis/db/
slave-serve-stale-data yes
appendonly no
appendfsync everysec
no-appendfsync-on-rewrite no
auto-aof-rewrite-percentage 100
auto-aof-rewrite-min-size 64mb
slowlog-log-slower-than 10000
slowlog-max-len 128
vm-enabled no
vm-swap-file /tmp/redis.swap
vm-max-memory 0
vm-page-size 32
vm-pages 134217728
vm-max-threads 4
hash-max-zipmap-entries 512
hash-max-zipmap-value 64
list-max-ziplist-entries 512
list-max-ziplist-value 64
set-max-intset-entries 512
zset-max-ziplist-entries 128
zset-max-ziplist-value 64
activerehashing yes

3.1.2 Start Redis

[logstash@Logstash_2 redis]# redis-server /data/redis/etc/redis.conf &

3.2 Configure and start Elasticsearch

3.2.1 Start Elasticsearch

[logstash@Logstash_2 redis]# /data/elasticsearch/elasticsearch-0.18.7/bin/elasticsearch -p ../esearch.pid &
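The -p flag makes Elasticsearch write its process ID into the named file (here ../esearch.pid, relative to the working directory), which a stop script can later read. The pidfile convention itself is plain shell, sketched below with a temporary file:

```shell
# Sketch of the pidfile convention used by elasticsearch -p
pidfile=$(mktemp)
echo $$ > "$pidfile"    # a daemon would record its own PID here

# ...so a stop script can later run: kill "$(cat "$pidfile")"
cat "$pidfile"
```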

3.2.2 Elasticsearch cluster configuration

curl 127.0.0.1:9200/_cluster/nodes/192.168.50.62

3.3 Logstash configuration and startup

3.3.1 Logstash configuration file

input {
  redis {
    host => "192.168.50.98"
    data_type => "list"
    key => "logstash:redis"
    type => "redis-input"
  }
}

filter {
  grok {
    type => "linux-syslog"
    pattern => "%{SYSLOGLINE}"
  }

  grok {
    type => "nginx-access"
    pattern => "%{NGINXACCESSLOG}"
  }
}

output {
  elasticsearch {
    host => "192.168.50.62"
  }
}

3.3.2 Start Logstash as the indexer

java -jar logstash-1.1.0-monolithic.jar agent -f my.conf &

3.3.3 Start Logstash as the agent

Configuration file (shipper.conf):

input {
  file {
    type => "linux-syslog"
    path => ["/var/log/*.log", "/var/log/messages", "/var/log/syslog"]
  }

  file {
    type => "nginx-access"
    path => "/usr/local/nginx/logs/access.log"
  }

  file {
    type => "nginx-error"
    path => "/usr/local/nginx/logs/error.log"
  }
}

output {
  redis {
    host => "192.168.50.98"
    data_type => "list"
    key => "logstash:redis"
  }
}

Start the agent:

java -jar logstash-1.1.0-monolithic.jar agent -f shipper.conf &

3.3.4 Kibana configuration

Add a site configuration in nginx:

server {
    listen 80;
    server_name logstash.test.com;
    index index.php;
    root /usr/local/nginx/html;

    #charset koi8-r;
    #access_log logs/host.access.log main;

    location ~ .*\.(php|php5)$
    {
        #fastcgi_pass unix:/tmp/php-cgi.sock;
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_index index.php;
        include fastcgi.conf;
    }
}

4 Performance Optimization

4.1 Elasticsearch optimization

4.1.1 JVM optimization

Edit the elasticsearch.in.sh file:

ES_CLASSPATH=$ES_CLASSPATH:$ES_HOME/lib/*:$ES_HOME/lib/sigar/*

if [ "x$ES_MIN_MEM" = "x" ]; then
    ES_MIN_MEM=4g
fi

if [ "x$ES_MAX_MEM" = "x" ]; then
    ES_MAX_MEM=4g
fi
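These two variables end up as the JVM's initial and maximum heap sizes. A minimal sketch of what the startup script does with them (the -Xms/-Xmx mapping is the standard JVM convention; the unset at the top is only there to make the demonstration deterministic):

```shell
# Start from a clean slate for the demonstration
unset ES_MIN_MEM ES_MAX_MEM

# Default to 4g when unset, exactly as in the edited file
if [ "x$ES_MIN_MEM" = "x" ]; then
    ES_MIN_MEM=4g
fi
if [ "x$ES_MAX_MEM" = "x" ]; then
    ES_MAX_MEM=4g
fi

# The startup script folds these into the JVM options
JAVA_OPTS="$JAVA_OPTS -Xms${ES_MIN_MEM} -Xmx${ES_MAX_MEM}"
echo "$JAVA_OPTS"
```

Setting the minimum and maximum to the same value avoids heap resizing pauses at runtime, which is why both are pinned to 4g here.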

4.1.2 Elasticsearch index compression

vim index_elastic.sh

#!/bin/bash
# Compress the _source field of today's elasticsearch indexes

date=$(date +%Y.%m.%d)

# Enable compression on the new indexes
/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/nginx-access/_mapping -d '{"nginx-access":{"_source":{"compress":true}}}'
echo ""
/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/nginx-error/_mapping -d '{"nginx-error":{"_source":{"compress":true}}}'
echo ""
/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/linux-syslog/_mapping -d '{"linux-syslog":{"_source":{"compress":true}}}'
echo ""

Save the script and execute it:

sh index_elastic.sh
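The script only has to touch today's indexes because the Logstash elasticsearch output starts a fresh index named logstash-YYYY.MM.DD each day. The index name the curl calls target can be checked in isolation:

```shell
# Build today's index name the same way index_elastic.sh does
date=$(date +%Y.%m.%d)
index="logstash-$date"
echo "$index"    # e.g. logstash-2012.05.20
```

Running sh index_elastic.sh from cron shortly after midnight keeps each new day's index compressed without manual intervention.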

5 Usage

5.1 The Logstash query page

Access http://logstash.test.com with Firefox or Google Chrome

 

 
