Kibana+logstash+elasticsearch Log Query system


The purpose of building this platform is to make it easy for development and operations staff to query logs. Kibana is a free web front end; Logstash integrates a wide range of log-collection plug-ins and is also a good tool for parsing and splitting logs with regular expressions; Elasticsearch is an open-source search engine framework that supports a clustered architecture.

1 Installation Requirements
1.1 Theoretical Topology
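Based on the configuration that follows, the data flow is roughly: Logstash agents (shippers) on the web servers write events to a Redis list ("logstash:redis" on 192.168.50.98), a Logstash indexer reads from Redis, parses events with grok, and writes them to Elasticsearch on 192.168.50.62, and Kibana (served by nginx + php at logstash.test.com) queries Elasticsearch for display.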

1.2 Installation Environment
1.2.1 Hardware Environment

192.168.50.62 (HP DL385 G7, RAM: 12 GB, CPU: AMD 6128, Disk: 4 x 146 GB SAS)

192.168.50.98 (HP DL385 G7, RAM: 12 GB, CPU: AMD 6128, Disk: 6 x 146 GB SAS)

192.168.10.42 (Xen virtual machine, RAM: 8 GB, CPU: 4 cores, Disk: 100 GB)

1.2.2 Operating System

CentOS 5.6 x64

1.2.3 Web-server Basic Environment

nginx+php (Installation process skipped)

1.2.4 Software List

JDK 1.6.0_25

logstash-1.1.0-monolithic.jar

elasticsearch-0.18.7.zip

redis-2.4.12.tar.gz

Kibana

1.3 Download Sources
1.3.1 JDK download path

http://www.oracle.com/technetwork/java/javase/downloads/jdk-6u25-download-346242.html

1.3.2 Logstash download path

http://semicomplete.com/files/logstash/logstash-1.1.0-monolithic.jar

1.3.3 Elasticsearch download path

https://github.com/downloads/elasticsearch/elasticsearch/elasticsearch-0.18.7.zip

1.3.4 Kibana download path

http://github.com/rashidkpc/Kibana/tarball/master

2 Installation Steps
2.1 JDK Download and Installation

Basic Installation

wget http://download.oracle.com/otn-pub/java/jdk/6u25-b06/jdk-6u25-linux-x64.bin
sh jdk-6u25-linux-x64.bin
mkdir -p /usr/java
mv ./jdk1.6.0_25 /usr/java
ln -s /usr/java/jdk1.6.0_25 /usr/java/default

Edit the /etc/profile file and add the following lines:

export JAVA_HOME=/usr/java/default
export PATH=$JAVA_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$CLASSPATH

Refresh the environment variables:

source /etc/profile
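To confirm the JDK is installed and picked up from the new PATH (an extra check, not part of the original steps):

java -version

The output should report version 1.6.0_25.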

2.2 Redis Download and Installation

wget http://redis.googlecode.com/files/redis-2.4.14.tar.gz
tar zxvf redis-2.4.14.tar.gz
cd redis-2.4.14
make -j24
make install
mkdir -p /data/redis
cd /data/redis/
mkdir {db,log,etc}
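As a quick sanity check that the build and install succeeded (not part of the original steps), the binaries should now be on the PATH:

which redis-server redis-cli
redis-server --version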

2.3 Elasticsearch Download and Installation

cd /data/
mkdir -p elasticsearch && cd elasticsearch
wget --no-check-certificate https://github.com/downloads/elasticsearch/elasticsearch/elasticsearch-0.18.7.zip
unzip elasticsearch-0.18.7.zip

2.4 Logstash Download and Installation

mkdir -p /data/logstash/ && cd /data/logstash
wget http://semicomplete.com/files/logstash/logstash-1.1.0-monolithic.jar

2.5 Kibana Download and Installation

wget --no-check-certificate http://github.com/rashidkpc/Kibana/tarball/master
tar zxvf master

3 Configuration and Startup
3.1 Redis Configuration and Startup
3.1.1 Configuration File

vim /data/redis/etc/redis.conf

#----------------------------------------------------
# this is the config file for redis

pidfile /var/run/redis.pid
port 6379
timeout 0
loglevel verbose
logfile /data/redis/log/redis.log
databases 16
save 900 1
save 300 10
save 60 10000
rdbcompression yes
dbfilename dump.rdb
dir /data/redis/db/
slave-serve-stale-data yes
appendonly no
appendfsync everysec
no-appendfsync-on-rewrite no
auto-aof-rewrite-percentage 100
auto-aof-rewrite-min-size 64mb
slowlog-log-slower-than 10000
slowlog-max-len 128
vm-enabled no
vm-swap-file /tmp/redis.swap
vm-max-memory 0
vm-page-size 32
vm-pages 134217728
vm-max-threads 4
hash-max-zipmap-entries 512
hash-max-zipmap-value 64
list-max-ziplist-entries 512
list-max-ziplist-value 64
set-max-intset-entries 512
zset-max-ziplist-entries 128
zset-max-ziplist-value 64
activerehashing yes

3.1.2 Redis Startup

redis-server /data/redis/etc/redis.conf &
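To verify that Redis is accepting connections (an extra check, not in the original write-up):

redis-cli -p 6379 ping

A healthy instance replies with PONG.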

3.2 Elasticsearch Configuration and Startup
3.2.1 Elasticsearch Startup

/data/elasticsearch/elasticsearch-0.18.7/bin/elasticsearch -p ./esearch.pid &

3.2.2 Elasticsearch Cluster configuration

curl 127.0.0.1:9200/_cluster/nodes/192.168.50.62
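Beyond the node query above, the cluster health API gives a quick overall status (an additional check, not in the original text):

curl http://127.0.0.1:9200/_cluster/health?pretty=true

The response should report a "green" status once all nodes have joined (a single-node setup with replicas enabled will show "yellow").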

3.3 Logstash Configuration and Startup
3.3.1 Logstash Indexer Configuration File

input {
  redis {
    host => "192.168.50.98"
    data_type => "list"
    key => "logstash:redis"
    type => "redis-input"
  }
}

filter {
  grok {
    type => "linux-syslog"
    pattern => "%{SYSLOGLINE}"
  }
  grok {
    type => "nginx-access"
    pattern => "%{NGINXACCESSLOG}"
  }
}

output {
  elasticsearch {
    host => "192.168.50.62"
  }
}

3.3.2 Logstash Startup as Indexer

Save the indexer configuration above as my.conf and start Logstash:

java -jar logstash-1.1.0-monolithic.jar agent -f my.conf &

3.3.3 Logstash Startup as Agent (Shipper)

Configuration file (saved as shipper.conf):

input {
  file {
    type => "linux-syslog"
    path => ["/var/log/*.log", "/var/log/messages", "/var/log/syslog"]
  }
  file {
    type => "nginx-access"
    path => "/usr/local/nginx/logs/access.log"
  }
  file {
    type => "nginx-error"
    path => "/usr/local/nginx/logs/error.log"
  }
}

output {
  redis {
    host => "192.168.50.98"
    data_type => "list"
    key => "logstash:redis"
  }
}

Agent startup:

java -jar logstash-1.1.0-monolithic.jar agent -f shipper.conf &
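Once an agent and the indexer are both running, two quick checks (extra verification, not part of the original procedure) confirm events are flowing end to end:

# number of events waiting in the broker; it should stay low if the indexer keeps up
redis-cli -h 192.168.50.98 llen logstash:redis

# total number of documents indexed in Elasticsearch; the count should grow over time
curl 'http://192.168.50.62:9200/_search?size=0&pretty=true'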

3.3.4 Kibana Configuration

First, add a site configuration in nginx:

server {
    listen 80;
    server_name logstash.test.com;
    index index.php;
    root /usr/local/nginx/html;

    #charset koi8-r;
    #access_log logs/host.access.log main;

    location ~ .*\.(php|php5)$ {
        #fastcgi_pass unix:/tmp/php-cgi.sock;
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_index index.php;
        include fastcgi.conf;
    }
}
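The original steps do not show how the Kibana files reach the document root; a minimal sketch, run from the directory where the Kibana tarball from section 2.5 was unpacked (GitHub tarballs extract to a directory named rashidkpc-Kibana-<commit>, hence the wildcard), assuming nginx was built with the default /usr/local/nginx prefix:

# copy the unpacked Kibana files into the site root
cp -r rashidkpc-Kibana-*/* /usr/local/nginx/html/

# reload nginx so the new server block takes effect
/usr/local/nginx/sbin/nginx -s reload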

4 Performance Tuning
4.1 Elasticsearch Tuning
4.1.1 JVM Tuning

Edit the elasticsearch.in.sh file:

ES_CLASSPATH=$ES_CLASSPATH:$ES_HOME/lib/*:$ES_HOME/lib/sigar/*

if [ "x$ES_MIN_MEM" = "x" ]; then
    ES_MIN_MEM=4g
fi
if [ "x$ES_MAX_MEM" = "x" ]; then
    ES_MAX_MEM=4g
fi
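ES_MIN_MEM and ES_MAX_MEM are turned into the JVM -Xms/-Xmx options by the Elasticsearch start script, so after restarting Elasticsearch the new heap size can be confirmed from the process arguments (an extra check, not in the original text):

ps aux | grep elasticsearch | grep Xmx

The Java command line should now contain -Xms4g -Xmx4g.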

4.1.2 Elasticsearch Index Compression

vim index_elastic.sh

#!/bin/bash
# enable _source compression for the Elasticsearch indices
date=`date +%Y.%m.%d`

# compress today's indices
/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/nginx-access/_mapping -d '{"nginx-access": {"_source": {"compress": true}}}'
echo ""
/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/nginx-error/_mapping -d '{"nginx-error": {"_source": {"compress": true}}}'
echo ""
/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/linux-syslog/_mapping -d '{"linux-syslog": {"_source": {"compress": true}}}'
echo ""

Save the script and execute it:

sh index_elastic.sh
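Whether the compression flag was applied can be checked by reading the mapping back (an extra verification step, not in the original write-up); the date has to match an index the script just updated:

curl -XGET "http://localhost:9200/logstash-`date +%Y.%m.%d`/nginx-access/_mapping?pretty=true"

The returned mapping should echo the _source compression setting for the nginx-access type.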

5 Usage
5.1 Logstash Query Page

Access http://logstash.test.com with Firefox or Google Chrome.

