Single-Machine Deployment of an ELK Log Collection and Analysis System

Tags: kibana, logstash, install, redis

I have recently been doing log analysis and found that Logstash fits my needs well.

Logstash: a tool for collecting and forwarding system logs. It integrates a wide range of log plug-ins, which greatly improves the efficiency of log querying and analysis. Typically a shipper instance is used for log collection and an indexer instance for forwarding (a sketch of a shipper configuration follows these component descriptions).

The Logstash shipper collects logs and forwards them to Redis for storage.

The Logstash indexer reads data from Redis and forwards it to Elasticsearch.

Redis: a database that sits between the two roles. The Logstash shipper forwards collected logs to Redis for storage, and the Logstash indexer reads them from Redis and forwards them to Elasticsearch.

Elasticsearch: a Lucene-based open source search engine used to index the collected logs.

Kibana: an open source web front end with a very polished interface and a powerful client for displaying Elasticsearch data. Logstash ships with Kibana built in, but Kibana can also be deployed on its own; the latest Kibana 3 is a pure HTML+JS client.
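For reference, here is a minimal sketch of what a shipper (agent) configuration might look like on a machine that pushes logs into Redis. It is not part of the original article; the Redis host, list key, and file path are assumptions and must match the indexer configuration shown later:

input {
  file {
    # example log file watched on the client machine (assumed path)
    type => "t44message"
    path => ["/var/log/messages"]
  }
}
output {
  # push events onto the Redis list that the indexer reads from
  redis {
    host => "192.168.81.44"
    data_type => "list"
    key => "logstash"
  }
}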


Software download page

http://www.elasticsearch.org/downloads/

My environment is as follows

ip: 192.168.81.44 (hostname t44)
os: CentOS 6.5 x86_64
openjdk version "1.8.0_31"
nginx-1.0.15
redis-2.4.10
elasticsearch-1.5.0
logstash-1.4.2
kibana-3.1.0




I. Configure the EPEL yum repository

yum -y localinstall http://download.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
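To confirm that the EPEL repository is now enabled (an optional check, not part of the original steps):

yum repolist enabled | grep -i epel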


II. Install the JDK environment

yum -y install java-1.8.0-openjdk
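Verify the Java installation (optional check):

java -version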


III. Install and start Redis

yum -y install redis; /etc/init.d/redis restart
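Optionally, have Redis start on boot and confirm it is answering; these two commands are my addition and assume the stock CentOS 6 init scripts:

chkconfig redis on
redis-cli ping    # should print PONG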


IV. Install, configure, and start Elasticsearch

wget -c https://download.elastic.co/elasticsearch/elasticsearch/elasticsearch-1.5.0.tar.gz -O /root/elasticsearch-1.5.0.tar.gz
tar -xvf /root/elasticsearch-1.5.0.tar.gz -C /usr/local/


Add two lines of configuration:

tail -n2 /usr/local/elasticsearch-1.5.0/config/elasticsearch.yml
http.cors.allow-origin: "/.*/"
http.cors.enabled: true


Start Elasticsearch

/usr/local/elasticsearch-1.5.0/bin/elasticsearch -d


View Elasticsearch Logs

tail -f /usr/local/elasticsearch-1.5.0/logs/elasticsearch.log
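Once Elasticsearch is up, a quick curl against its REST API (assuming the default HTTP port 9200) should return cluster and version information:

curl 'http://127.0.0.1:9200/?pretty'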


V. Install, configure, and start Logstash

wget -N https://download.elastic.co/logstash/logstash/logstash-1.4.2.tar.gz -O /root/logstash-1.4.2.tar.gz
tar -xvf /root/logstash-1.4.2.tar.gz -C /usr/local/


Configure index.conf

cat /usr/local/logstash-1.4.2/bin/index.conf
input {
  redis {
    host => "127.0.0.1"    # these settings should match the output of the agent
    data_type => "list"
    key => "logstash"
    # we use the 'json' codec here because we expect to read
    # json events from redis.
    codec => json
  }
  file {
    type => "t44message"
    path => ["/var/log/messages"]
  }
  syslog {
    type => "rsyslog"
    port => 514
  }
  file {
    type => "t44secure"
    path => ["/var/log/secure"]
  }
  file {
    type => "t44nginx"
    path => ["/var/log/nginx/*.log"]
  }
}
output {
  # stdout { debug => true debug_format => "json" }
  stdout { codec => rubydebug }
  elasticsearch { host => "127.0.0.1" }
}
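Before starting Logstash, you can optionally check the configuration syntax; the --configtest flag is available in Logstash 1.4:

/usr/local/logstash-1.4.2/bin/logstash -f /usr/local/logstash-1.4.2/bin/index.conf --configtest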

Start Logstash

/usr/local/logstash-1.4.2/bin/logstash -f /usr/local/logstash-1.4.2/bin/index.conf


VI. Install Nginx and Kibana

yum -y install nginx
wget -c https://download.elasticsearch.org/kibana/kibana/kibana-3.1.0.tar.gz -O /root/kibana-3.1.0.tar.gz
tar -xvf /root/kibana-3.1.0.tar.gz -C /usr/share/nginx/html/
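The archive extracts to /usr/share/nginx/html/kibana-3.1.0, while the paths below refer to /usr/share/nginx/html/kibana, so I assume the extracted directory is renamed (or symlinked), for example:

mv /usr/share/nginx/html/kibana-3.1.0 /usr/share/nginx/html/kibana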


Configure Nginx:

egrep -v '^#|    #' /etc/nginx/conf.d/default.conf | grep -v '^$'
server {
    listen       80 default_server;
    server_name  _;
    include /etc/nginx/default.d/*.conf;
    location / {
        root   /usr/share/nginx/html;
        index  index.html index.htm index.php;
    }
    error_page  404              /404.html;
    location = /404.html {
        root   /usr/share/nginx/html;
    }
    error_page  500 502 503 504  /50x.html;
    location = /50x.html {
        root   /usr/share/nginx/html;
    }
    location ~ \.php$ {
        root           /usr/share/nginx/html;
        fastcgi_pass   127.0.0.1:9000;
        fastcgi_index  index.php;
        fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
        fastcgi_buffer_size  32k;
        fastcgi_buffers      8 32k;
        include        fastcgi_params;
    }
}
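Start Nginx and enable it on boot (not shown in the original listing, but required before Kibana can be served; assumes the stock CentOS 6 init script):

/etc/init.d/nginx start
chkconfig nginx on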


Configure Kibana:

grep 'elasticsearch:' /usr/share/nginx/html/kibana/config.js | grep -v '\*'
elasticsearch: "http://192.168.81.44:9200",
\cp /usr/share/nginx/html/kibana/app/dashboards/default.json{,.bak}
\cp /usr/share/nginx/html/kibana/app/dashboards/logstash.json /usr/share/nginx/html/kibana/app/dashboards/default.json


Visit Kibana:

http://192.168.81.44/kibana/
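If the whole pipeline is working, Logstash should be creating daily logstash-YYYY.MM.DD indices in Elasticsearch; a quick way to check (assuming the default port 9200) is:

curl 'http://192.168.81.44:9200/_cat/indices?v'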


This article is from the "Past with the Wind" blog; reprinting is not permitted.
