The purpose of this platform is to make log queries easy during O&M and R&D. Kibana is a free web front end; logstash integrates a variety of log-collection plug-ins and is also an excellent tool for parsing logs with regular expressions; elasticsearch is an open-source search engine framework (with cluster support).
1 Installation Requirements
1.1 Theoretical Topology
1.2 Installation Environment
1.2.1 Hardware Environment
192.168.50.62 (HP DL385 G7, RAM: 12 GB, CPU: AMD 6128, Disk: SAS 146 GB ×4)
192.168.50.98 (HP DL385 G7, RAM: 12 GB, CPU: AMD 6128, Disk: SAS 146 GB ×6)
192.168.10.42 (Xen VM, RAM: 8 GB, CPU: ×4, Disk: 100 GB)
1.2.2 Operating System
CentOS 5.6 x64
1.2.3 Basic Web Server Environment
Nginx + PHP (installation process skipped)
1.2.4 Software List
JDK 1.6.0_25
logstash-1.1.0-monolithic.jar
elasticsearch-0.18.7.zip
redis-2.4.12.tar.gz
Kibana
1.3 Download Locations
1.3.1 JDK download path
http://www.oracle.com/technetwork/java/javase/downloads/jdk-6u25-download-346242.html
1.3.2 Logstash download path
http://semicomplete.com/files/logstash/logstash-1.1.0-monolithic.jar
1.3.3 Elasticsearch download path
https://github.com/downloads/elasticsearch/elasticsearch/elasticsearch-0.18.7.zip
1.3.4 Kibana download path
http://github.com/rashidkpc/Kibana/tarball/master
2 Installation Steps
2.1 Download and install JDK
Basic installation:
wget http://download.oracle.com/otn-pub/java/jdk/6u25-b06/jdk-6u25-linux-x64.bin
sh jdk-6u25-linux-x64.bin
mkdir -p /usr/java
mv ./jdk1.6.0_25 /usr/java
ln -s /usr/java/jdk1.6.0_25 /usr/java/default
Edit /etc/profile and add the following lines:
export JAVA_HOME=/usr/java/default
export PATH=$JAVA_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$CLASSPATH
Refresh the environment variables:
source /etc/profile
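To confirm the profile additions take effect, the sketch below sources an equivalent fragment against a throwaway directory standing in for the real JDK home (the temp paths are illustrative, not part of the original setup):

```shell
# Scratch directory stands in for /usr/java/default in this check
JDK_HOME=$(mktemp -d)
SNIPPET=$(mktemp)
cat > "$SNIPPET" <<EOF
export JAVA_HOME=$JDK_HOME
export PATH=\$JAVA_HOME/bin:\$PATH
export CLASSPATH=.:\$JAVA_HOME/lib/tools.jar:\$JAVA_HOME/lib/dt.jar:\$CLASSPATH
EOF
. "$SNIPPET"
# JAVA_HOME should now be set and its bin directory first on PATH
echo "JAVA_HOME=$JAVA_HOME"
case ":$PATH:" in
  *":$JAVA_HOME/bin:"*) echo "PATH ok" ;;
  *) echo "PATH missing JAVA_HOME/bin" ;;
esac
```

On the real host, `java -version` after `source /etc/profile` gives the same confirmation.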
2.2 Download and install redis
wget http://redis.googlecode.com/files/redis-2.4.14.tar.gz
tar zxvf redis-2.4.14.tar.gz
cd redis-2.4.14
make -j24
make install
mkdir -p /data/redis
cd /data/redis/
mkdir {db,log,etc}
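The final mkdir uses bash brace expansion to create the three working directories (database dumps, logs, configuration) in one call; sketched here under a temporary root instead of /data/redis:

```shell
# Brace expansion: `mkdir {db,log,etc}` is equivalent to `mkdir db log etc`
# (requires bash; plain POSIX sh does not expand braces)
ROOT=$(mktemp -d)
mkdir -p "$ROOT/redis"
cd "$ROOT/redis"
mkdir {db,log,etc}
ls "$ROOT/redis"
```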
2.3 Download and install elasticsearch
cd /data/
mkdir -p elasticsearch && cd elasticsearch
wget --no-check-certificate https://github.com/downloads/elasticsearch/elasticsearch/elasticsearch-0.18.7.zip
unzip elasticsearch-0.18.7.zip
2.4 Download and install logstash
mkdir -p /data/logstash/ && cd /data/logstash
wget http://semicomplete.com/files/logstash/logstash-1.1.0-monolithic.jar
2.5 Download and install kibana
wget http://github.com/rashidkpc/Kibana/tarball/master --no-check-certificate
tar zxvf master
3 Configuration and Startup
3.1 Redis configuration and startup
3.1.1 Configuration file
vim /data/redis/etc/redis.conf
#----------------------------------------------------
# This is the config file for redis
pidfile /var/run/redis.pid
port 6379
timeout 0
loglevel verbose
logfile /data/redis/log/redis.log
databases 16
save 900 1
save 300 10
save 60 10000
rdbcompression yes
dbfilename dump.rdb
dir /data/redis/db/
slave-serve-stale-data yes
appendonly no
appendfsync everysec
no-appendfsync-on-rewrite no
auto-aof-rewrite-percentage 100
auto-aof-rewrite-min-size 64mb
slowlog-log-slower-than 10000
slowlog-max-len 128
vm-enabled no
vm-swap-file /tmp/redis.swap
vm-max-memory 0
vm-page-size 32
vm-pages 134217728
vm-max-threads 4
hash-max-zipmap-entries 512
hash-max-zipmap-value 64
list-max-ziplist-entries 512
list-max-ziplist-value 64
set-max-intset-entries 512
zset-max-ziplist-entries 128
zset-max-ziplist-value 64
activerehashing yes
3.1.2 Redis startup
[logstash@logstash_2 redis]# redis-server /data/redis/etc/redis.conf &
3.2 Elasticsearch configuration and startup
3.2.1 Start elasticsearch
[logstash@logstash_2 redis]# /data/elasticsearch/elasticsearch-0.18.7/bin/elasticsearch -p ../esearch.pid &
3.2.2 Elasticsearch cluster configuration
curl 127.0.0.1:9200/_cluster/nodes/192.168.50.62
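The curl call above only inspects node state. For the two elasticsearch nodes to form one cluster, both typically share the same cluster.name in config/elasticsearch.yml; a sketch (cluster and node names are illustrative, not from the original post):

```yaml
# config/elasticsearch.yml -- both nodes must use the same cluster.name
cluster.name: logstash-es
node.name: "es-node-62"
network.host: 192.168.50.62
# With multicast disabled, list the peers explicitly
discovery.zen.ping.multicast.enabled: false
discovery.zen.ping.unicast.hosts: ["192.168.50.62", "192.168.50.98"]
```

With multicast enabled (the default in 0.18), nodes sharing a cluster.name on the same LAN discover each other without the unicast list.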
3.3 Logstash configuration and startup
3.3.1 Logstash configuration file
input {
  redis {
    host => "192.168.50.98"
    data_type => "list"
    key => "logstash:redis"
    type => "redis-input"
  }
}
filter {
  grok {
    type => "linux-syslog"
    pattern => "%{SYSLOGLINE}"
  }
  grok {
    type => "nginx-access"
    pattern => "%{NGINXACCESSLOG}"
  }
}
output {
  elasticsearch {
    host => "192.168.50.62"
  }
}
3.3.2 Start logstash as indexer
java -jar logstash-1.1.0-monolithic.jar agent -f my.conf &
3.3.3 Start logstash as agent
Configuration file:
input {
  file {
    type => "linux-syslog"
    path => ["/var/log/*.log", "/var/log/messages", "/var/log/syslog"]
  }
  file {
    type => "nginx-access"
    path => "/usr/local/nginx/logs/access.log"
  }
  file {
    type => "nginx-error"
    path => "/usr/local/nginx/logs/error.log"
  }
}
output {
  redis {
    host => "192.168.50.98"
    data_type => "list"
    key => "logstash:redis"
  }
}
Start the agent:
java -jar logstash-1.1.0-monolithic.jar agent -f shipper.conf &
3.3.4 Kibana configuration
Add a site configuration in nginx:
server {
    listen 80;
    server_name logstash.test.com;
    index index.php;
    root /usr/local/nginx/html;
    #charset koi8-r;
    #access_log logs/host.access.log main;
    location ~ .*\.(php|php5)$
    {
        #fastcgi_pass unix:/tmp/php-cgi.sock;
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_index index.php;
        include fastcgi.conf;
    }
}
4 Performance Optimization
4.1 Elasticsearch optimization
4.1.1 JVM optimization
Edit the elasticsearch.in.sh file:
ES_CLASSPATH=$ES_CLASSPATH:$ES_HOME/lib/*:$ES_HOME/lib/sigar/*
if [ "x$ES_MIN_MEM" = "x" ]; then
    ES_MIN_MEM=4g
fi
if [ "x$ES_MAX_MEM" = "x" ]; then
    ES_MAX_MEM=4g
fi
4.1.2 Elasticsearch index compression
vim index_elastic.sh
#!/bin/bash
# Compress the elasticsearch data
date=`date +%Y.%m.%d`
# Compress the new indexes;
/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/nginx-access/_mapping -d '{"nginx-access": {"_source": {"compress": true}}}'
echo ""
/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/nginx-error/_mapping -d '{"nginx-error": {"_source": {"compress": true}}}'
echo ""
/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/linux-syslog/_mapping -d '{"linux-syslog": {"_source": {"compress": true}}}'
echo ""
Save the script and execute it:
sh index_elastic.sh
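Because the index name embeds the current date, the script above needs to run once per day against each new logstash-YYYY.MM.DD index. A crontab fragment could schedule it (the script path here is an assumption; adjust to wherever the file was saved):

```
# Run shortly after midnight so the new day's mappings get compression enabled
5 0 * * * /bin/sh /data/elasticsearch/index_elastic.sh >/dev/null 2>&1
```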
5 Usage
5.1 Logstash query page
Access http://logstash.test.com with Firefox or Google Chrome.
From: http://enable.blog.51cto.com/747951/1049411