NLog, Elasticsearch, Kibana and Logstash
Objective
Recently, our document management system needed to record every operation performed by administrators and users. Originally we wrote the operation records to the database directly through EF and read them back from the database when querying, but that approach was clumsy. While searching for alternatives online, I found Logstash, an excellent tool, and I would like to share my learning process with you.
Environment preparation
All three tools can be downloaded from the official website (https://www.elastic.co/). The versions I downloaded are elasticsearch-2.3.4, kibana-4.5.2-windows and logstash-2.3.4, all Windows builds. The three tools are written in Java, so install Java first; there are plenty of tutorials for that online. After downloading, extract the three archives into separate folders.
The folder names differ between versions, but that does not matter; whether mixing significantly different versions of the three tools causes problems is unclear. The NLog DLLs can be downloaded directly from NuGet.
Get Up!
Once the environment is ready, we can take off. First, in the elasticsearch-2.3.4 folder, the config/elasticsearch.yml file lets you modify the address and port of the service; the defaults are usually fine.
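For reference, the settings in config/elasticsearch.yml that control the address and port look roughly like this. This is a minimal sketch; the values shown are illustrative (the article itself keeps the defaults), and the IP address is only an example:

```yaml
# config/elasticsearch.yml -- minimal sketch, illustrative values only
cluster.name: my-application   # optional cluster name
network.host: 192.168.4.12     # bind address; defaults to the local machine
http.port: 9200                # HTTP port; defaults to 9200
```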
Double-click bin\elasticsearch.bat to run it.
When a screen full of log output appears, the configuration is fine. Type the address you just configured into the browser, e.g. localhost:9200; when the JSON response appears, the service is running successfully.
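If the node is up, Elasticsearch answers a plain GET on that address with a small JSON document. The exact field values depend on your installation; for a 2.3.4 node the response looks roughly like this (the node name below is illustrative):

```json
{
  "name" : "Node-1",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.3.4"
  },
  "tagline" : "You Know, for Search"
}
```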
Next, configure Kibana, mainly in config/kibana.yml, where you set Kibana's own service address as well as the Elasticsearch address and port number.
Attention! The server.host here must match the Logstash address that will be configured below, and you cannot use localhost, only the IP address. For example, if your IP address is 192.168.4.12: although 127.0.0.1 also refers to the local machine, the two are not interchangeable here, so the addresses must be kept consistent. elasticsearch.url corresponds to the Elasticsearch address configured just now; for it, the IP address is not as strict. After configuring, double-click bin\kibana.bat to run Kibana; the prerequisite is that Elasticsearch is already started.
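The relevant lines in config/kibana.yml look roughly like this for Kibana 4.x. This is a sketch; the IP address is only an example, so substitute your own machine's address:

```yaml
# config/kibana.yml -- minimal sketch for Kibana 4.x, illustrative values
server.port: 5601                          # Kibana's own port (default 5601)
server.host: "192.168.4.12"                # must be the machine's IP, not localhost
elasticsearch.url: "http://localhost:9200" # the Elasticsearch instance configured above
```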
When the startup log appears, the configuration succeeded. The log shows an address such as http://192.168.4.12:5601; open that address in the browser.
When the Kibana interface is displayed, it has started successfully. The last step is configuring Logstash. Here you need to create your own configuration file, e.g. ***.conf; it can be stored anywhere convenient, usually in a conf folder under the installation directory. There are three sections to configure:
input, filter and output. input specifies where the data source comes from (stdin {} reads data from the console); filter modifies the data format according to your needs; output specifies where to send the data (stdout {} prints the data back to the console). Because I do not need a filter here, I did not write one; if you need one, refer to the official documentation at https://www.elastic.co/guide/en/logstash/current/filter-plugins.html. Once the file is written, open cmd, change to the Logstash folder and run logstash -f ***.conf, where -f is followed by the location of the configuration file. I put mine in the same folder as logstash.bat, so I can run it directly.
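A minimal pipeline matching the description above (console in, console out, no filter) might look like this; the file name is up to you:

```
# test.conf -- minimal sketch: read lines from the console and echo them back
input {
  stdin { }    # read events from standard input
}
# no filter section is needed for this test
output {
  stdout { }   # print events to standard output
}
```

You would then run it from the Logstash folder with bin\logstash -f test.conf (test.conf is a hypothetical name here; substitute your own file).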
When "Pipeline main started" appears in the console, it is running successfully. Just type a line below and press Enter:
the data you just entered is printed back below it.
At this point, all three tools are configured. The next article will explain how to use NLog to send log output to Logstash and view it in Kibana.