Objective
Process Nginx logs into JSON format, ship them with Logstash directly to Elasticsearch, and then display and analyze them through the Kibana GUI.
Important: have Nginx write its logs in JSON format. The default Nginx log is space-delimited, which forces Logstash to do regular-expression matching and consumes too much CPU. The Elasticsearch machine should configure its firewall to allow connections only from the specified Logstash machines.
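As an illustration, a minimal sketch of such a JSON log format in nginx.conf; the format name "json" and the chosen fields are assumptions, so adjust them to your own access log (on nginx 1.11.8+ you can also add escape=json to handle quotes inside values):

# nginx.conf, inside the http{} block: log in JSON so Logstash can use a json codec instead of grok
log_format json '{"@timestamp":"$time_iso8601",'
                '"remote_addr":"$remote_addr",'
                '"request":"$request",'
                '"status":"$status",'
                '"body_bytes_sent":"$body_bytes_sent",'
                '"http_referer":"$http_referer",'
                '"http_user_agent":"$http_user_agent"}';
access_log /var/log/nginx/access.log json;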
Logstash is built from three components: input, filter, and output. The workflow of the three components can be understood as follows: input collects the data, filter processes the data, and output ships the data out. Together they answer the questions of where to collect from, how to process the data, and where to send it. A skeleton is sketched below.
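For orientation, a minimal hedged skeleton of a Logstash pipeline showing the three sections (the stdin/stdout plugins here are only placeholders):

input {
  stdin { }                                            # where events come from
}
filter {
  mutate { add_field => { "pipeline" => "demo" } }     # how events are processed
}
output {
  stdout { codec => rubydebug }                        # where events are sent
}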
Logstash is a lightweight log collection and processing framework. It lets you easily collect scattered, diverse logs, process them with custom rules, and then forward them to a specific destination, such as a server or a file.
This article is a translation of the official documentation combined with hands-on practice; I hope it helps more users understand and use this tool. Download, install, and use:
The tool works out of the box; the download link is available on the official website.
After three years doing Android development I had not paid much attention to the server side, and looking at it now was a pleasant surprise: many features I had previously hoped for are now open source and powerful, so I gave them a try. Simple trial: download elasticsearch-1.4.2 and start it, download logstash-1.4.2, then run the following command:
bin/logstash -e 'input { stdin {} } output { elasticsearch { host => localhost } }'
The data entered at the console will be sent to Elasticsearch.
agent.channels.m1.checkpointDir = /opt/modules/apache-flume-1.5.2-bin/tracklog-kafka/checkpoint
agent.channels.m1.dataDirs = /opt/modules/apache-flume-1.5.2-bin/tracklog-kafka/datadir
agent.channels.m1.transactionCapacity = 1000000
agent.channels.m1.capacity = 1000000
agent.channels.m1.checkpointInterval = 30000
Second, getting the data into Kafka. The topic used above needs to be created in Kafka in advance; the other steps for writing into Kafka have already been configured in the collect stage. A reference command for creating a topic:
%{kafka_home}/bin/kafka-topics.sh -...
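Since the command above is cut off, here is a hedged example of its full form; the ZooKeeper address, topic name, partition count, and replication factor are assumptions, not values from the original (%{kafka_home} stands for the Kafka installation directory):

# Assumed values: ZooKeeper address, topic name, partitions, replication factor
%{kafka_home}/bin/kafka-topics.sh --create \
  --zookeeper localhost:2181 \
  --replication-factor 1 \
  --partitions 3 \
  --topic tracklog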
Data acquisition with Kafka and Logstash
Getting Logstash working end-to-end with Kafka still requires attention to many details; the most important thing is to understand how Kafka works.
Logstash working principle: since Kafka uses a decoupled design, it is not the traditional publish-subscribe model; the producer is responsible for producing messages and pushing them to the broker, while consumers pull messages from the broker at their own pace.
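A minimal hedged sketch of a Logstash Kafka input. This assumes a recent logstash-input-kafka plugin (5.x or later, which uses bootstrap_servers/topics); the plugin bundled with Logstash 1.x/2.x uses zk_connect and topic_id instead. The broker address, topic, and group id are assumptions:

input {
  kafka {
    bootstrap_servers => "localhost:9092"    # Kafka broker to consume from (assumed)
    topics            => ["tracklog"]        # topic name (assumed)
    group_id          => "logstash"          # consumer group id
  }
}
output {
  stdout { codec => rubydebug }              # print consumed events for verification
}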
The Nginx access log we collect through Logstash already contains the client IP (remote_addr), but the IP alone is not enough: for Kibana to display the geographic source of requests, a GeoIP database is required. GeoIP is the most common free IP geolocation library, and a paid version is also available. The GeoIP library returns the corresponding geographical information for an IP address.
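A hedged sketch of the corresponding geoip filter; the source field name remote_addr assumes the JSON log format described earlier, and the database path is optional because Logstash ships with a bundled GeoLite database:

filter {
  geoip {
    source => "remote_addr"                      # field that holds the client IP (assumed name)
    # database => "/path/to/GeoLiteCity.dat"     # optional: point at a custom GeoIP database
  }
}

The filter adds a geoip field containing country, city, and location coordinates that Kibana's map visualizations can use.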
Logstash cannot read redis data
A problem occurred today when building a logstash + redis + elasticsearch pipeline. After nearly an hour of troubleshooting, the problem was finally solved; I am recording it here.
The environment is as follows: a client sends data to Redis on the server, and Logstash on the server reads the Redis data and stores it in Elasticsearch.
The initial problem was that, on the server side, the log sent from ...
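For context, a hedged sketch of the two halves of such a setup; the Redis host, list key, and Elasticsearch address are assumptions and may differ from the original environment:

# Client side: push events into a Redis list
output {
  redis {
    host      => "server-ip"      # Redis host on the server (assumed)
    data_type => "list"
    key       => "logstash"       # list key (assumed); must match the input side
  }
}

# Server side: pop the same list and index into Elasticsearch
input {
  redis {
    host      => "127.0.0.1"
    data_type => "list"
    key       => "logstash"
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
  }
}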
Official website: https://www.elastic.co
Software versions: Logstash 2.2.0 (all plugins), Elasticsearch 2.2.0, Kibana 4.4.0
Note: the environment is CentOS 6.5 64-bit; this is a single-machine test, and the configuration is kept simple.
1. Logstash installation and configuration: unzip to /usr/local/logstash-2.2.0/. Logstash configuration ...
This section describes how to export log4j logs from Java projects to Logstash. First, log4j basics.
First, the unavoidable official introduction:
Log4j is a reliable, fast, and flexible logging framework (API) written in Java and licensed under the Apache Software License. It has been ported to C, C++, C#, Perl, Python, Ruby, and Eiffel.
Log4j is highly configurable and is configured at run time using an external configuration file. It records log messages at different priority levels and can send them to a variety of destinations, such as the console, files, databases, and syslog.
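As a hedged sketch of the export described above (not necessarily the exact approach of the original article), log4j 1.x can ship events through a SocketAppender to Logstash's log4j input; the host, port, and logger settings here are assumptions:

# log4j.properties on the Java side (in .properties files, comments must sit on their own lines)
log4j.rootLogger=INFO, logstash
# SocketAppender sends serialized LoggingEvents to the assumed Logstash host/port below
log4j.appender.logstash=org.apache.log4j.net.SocketAppender
log4j.appender.logstash.RemoteHost=127.0.0.1
log4j.appender.logstash.Port=4560
log4j.appender.logstash.ReconnectionDelay=10000

# Logstash side (logstash-input-log4j plugin), listening on the same port
input {
  log4j {
    host => "0.0.0.0"
    port => 4560
  }
}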
Logstash has a simple plugin for generating events in bulk: generator. For details, see the official website: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-generator.html
How to use it: modify the config file to
input {
  generator {
    lines => [ "line1", "line2", "line3" ]
    count => 3
  }
}
# The output section below can be replaced with other output plugins, such as Elasticsearch, Redis, or MongoDB.
output {
  stdout { codec => dots }
}
Environmental conditions:
System version: CentOS 6.8
Logstash version: 6.3.2
Redis version: 2.4
Logstash input configuration:
input {
  redis {
    host => "172.16.73.33"     # Redis IP
    port => 52611              # Redis port
    password => "123456"       # Redis password
    db => 9                    # Redis database number
    data_type => "list"        # data type
    key => "filebeat"          # key name
  }
}
Problem:
1. If the password parameter is not added to the input configuration above, the following warning is reported, so do not forget to configure the password:
[2018-..-29T17:...
# cat syslog02.conf
# filename: syslog02.conf
# Note: comments in this file must be made with #
input {
  file {
    path => ["/var/log/*.log"]
  }
}
output {
  elasticsearch {
    hosts => ["12x.xx.15.1xx:9200"]
  }
}
Check whether there is a problem with the configuration file:
# ../bin/logstash -f syslog02.conf -t
Sending Logstash's logs to /usr/local/logstash/logs which is now configured via log4j2.properties
[....-..-01T09:...][FATAL][logstash.runner] ...
Rsyslog is a log collection tool. Currently, many Linux systems use rsyslog to replace syslog. I will not talk about how to install rsyslog. I will talk about the principle and the configuration of logstash.
Rsyslog itself has a configuration file, /etc/rsyslog.conf, which defines log sources and their corresponding storage destinations. Take the following statement as an example:
local7.* /var/log/boot.log
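This rule writes everything logged to the local7 facility into /var/log/boot.log. To hand logs to Logstash instead, a common hedged approach is to add a forwarding rule in rsyslog and listen with Logstash's syslog input; the port 5514 used below is an assumption:

# /etc/rsyslog.conf: forward all logs to Logstash over TCP (@@ means TCP, a single @ means UDP)
*.* @@127.0.0.1:5514

# Logstash side: the syslog input parses standard syslog-formatted lines
input {
  syslog {
    port => 5514
  }
}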
With this Logstash extension, https://github.com/PeterPaulH/logstash-influxdb/blob/master/src/influxdb.rb, you can output to InfluxDB: put the file into logstash-1.4.2/lib/logstash/outputs. I read the Logstash documentation for an afternoon and finally solved my own needs.
Tested on CentOS. Install MySQL and Elasticsearch first; if anything about them is unclear, please refer to another article.
Installing Logstash
Official guide: https://www.elastic.co/guide/en/logstash/current/installing-logstash.html
1. Download the public key:
rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
2. Add the Yum source:
vim /etc/yum.repos.d/logstash.repo
Write the following into the file:
[logst...
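The repo file contents are cut off above; for reference, a hedged sketch based on the official Elastic yum repository instructions (the 6.x package line is an assumption and should match the Logstash version you want):

[logstash-6.x]
name=Elastic repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

After saving the file, sudo yum install logstash installs the package.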