Recently, while working on log analysis, I used Logstash + Elasticsearch + Kibana to implement log import, filtering, and visual management. The official documentation is not detailed enough, and most articles online either target Linux or copy someone else's configuration that simply does not run. It took quite a bit of effort to get these three pieces working together, so here is a write-up of my experience. Without further ado, let's get to it.
First of all, install the Java JDK on your computer. To use Logstash + Elasticsearch + Kibana you need to download the three programs plus one necessary helper tool; the list is as follows:
1, Java JDK (the latest Logstash requires JDK 1.8) http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
2, Logstash https://www.elastic.co/downloads
3, Elasticsearch https://www.elastic.co/downloads
4, Kibana https://www.elastic.co/downloads
5, curl http://curl.haxx.se/download.html
First, Elasticsearch configuration
Unzip the downloaded Elasticsearch archive, go to the config directory, and add the following lines to the elasticsearch.yml file:
discovery.zen.ping.multicast.enabled: false  # disable multicast discovery; if another machine on the LAN has port 9300 open, the service will not start properly
network.host: 192.168.1.91  # specify the host address; strictly optional, but better to set it, otherwise an HTTP connection error appears later when integrating Kibana (the listener shows :::9200 instead of 0.0.0.0:9200)
http.cors.allow-origin: "/.*/"
http.cors.enabled: true
This makes Elasticsearch listen on port 9200, ready for Logstash later.
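As a quick sanity check (a minimal sketch, assuming the host and port configured above), once Elasticsearch has been started in the run steps below you can hit its HTTP endpoint from a browser or with curl:

curl "http://192.168.1.91:9200"

Elasticsearch should answer with a small JSON document containing the cluster name and version number; if the connection is refused, recheck elasticsearch.yml before moving on.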
Second, Logstash configuration
The core of Logstash is the input ---> filter ---> output pipeline. Configuration can either be typed directly into a DOS window or saved in a .conf file; here we use the configuration-file approach.
Unzip the downloaded Logstash archive, then create a new stdin.conf file in the bin directory with the following configuration:
input {
    stdin {}
}
output {
    elasticsearch {
        host => "192.168.1.91"
    }
}
The purpose of this configuration is to send whatever you type into the console on to Elasticsearch.
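The stdin input is only for verifying that the pipeline works. For real log analysis you will usually read from files and add a filter stage; the following is a minimal sketch, not from the original article: the path C:/logs/*.log and the grok pattern are assumptions to adapt to your own log format, and newer Logstash versions expect hosts => [...] instead of host =>.

input {
    file {
        path => "C:/logs/*.log"            # hypothetical log location, adjust to your files
        start_position => "beginning"      # also read content that existed before startup
    }
}
filter {
    grok {
        # assumed layout "LEVEL rest-of-line"; replace with a pattern matching your logs
        match => { "message" => "%{LOGLEVEL:level} %{GREEDYDATA:msg}" }
    }
}
output {
    elasticsearch {
        host => "192.168.1.91"             # newer versions: hosts => ["192.168.1.91:9200"]
    }
}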
Third, Kibana configuration
Unzip the downloaded Kibana archive, then likewise go to the config directory and modify the kibana.yml file:
port: 7873
host: "192.168.1.91"
elasticsearch_url: "http://192.168.1.91:9200"
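A side note, and an assumption on my part rather than part of the original setup: these key names match older Kibana releases. In newer versions the equivalent settings are roughly the following, so try these if kibana.yml does not accept the keys above:

server.port: 7873
server.host: "192.168.1.91"
elasticsearch.url: "http://192.168.1.91:9200"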
With that, the required configuration is basically complete; now start everything up:
1, Open a command prompt (cmd) in the Elasticsearch bin directory and run: elasticsearch
2, Open a command prompt in the Logstash bin directory and run: logstash -f stdin.conf. When you see "Logstash startup completed", Logstash is running; type a test line below, for example "where there is a will there is a way", and press Enter.
3, Open a command prompt in the Kibana bin directory (you can also Shift + right-click in the bin directory and choose "Open command window here") and run: kibana
4, In a browser, open http://localhost:7873/kibana and you can see the information you entered.
PS:
1, In the last step I had already configured the relevant mapping on the web page, so the results opened directly; you can follow the instructions on the page to do that configuration. Good luck!
2, About curl: it lets you display Elasticsearch results directly in the DOS window. To use it, put the downloaded curl.exe into the Logstash bin directory. After running the logstash -f stdin.conf command from step 2 above and entering some text, open a command prompt in the Logstash bin directory and run curl "http://192.168.1.91:9200/_search?pretty"; the results are visible there as well.
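For reference, a rough sketch of what that query returns (the values shown here are illustrative only; the exact fields depend on your versions): a JSON document whose hits array contains the lines you typed, roughly like

{
  "took": 3,
  "hits": {
    "total": 1,
    "hits": [
      {
        "_index": "logstash-2016.01.01",
        "_source": {
          "message": "where there is a will there is a way",
          "@timestamp": "2016-01-01T00:00:00.000Z",
          "host": "my-pc"
        }
      }
    ]
  }
}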
Copyright notice: This is the blogger's original article; please do not reproduce it without permission.