Elasticsearch, Kibana, Logstash, and NLog: Implementing an ASP.NET Core Distributed Logging System

Elasticsearch Overview

As the core of the Elastic Stack (aka ELK), Elasticsearch is a document store with powerful indexing capabilities whose data can be searched through a REST API. It is written in Java on top of Apache Lucene, although these details are hidden behind the API. Through indexed fields, you can use many different query and aggregation methods to find any stored (indexed) document. Elasticsearch provides more than powerful search over indexed documents, however: it is fast, distributed, and horizontally scalable, supports real-time document storage and analysis, and scales to hundreds of servers and petabytes of indexed data. It also powers the other applications in the stack, such as Logstash and Kibana.

Kibana is a powerful visual query web application provided by Elastic. With Kibana, you can easily create queries, charts, and dashboards for the data indexed in Elasticsearch.
Elasticsearch exposes a REST API, and you will find that many examples in the documentation are plain HTTP calls, which you can try with tools such as curl or Postman. API clients have also been written in many different languages, including .NET, Java, Python, Ruby, and JavaScript.
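For example, a quick reachability check against the cluster (using the host IP assumed throughout this tutorial) might look like this:

// Ask Elasticsearch for its cluster health over the REST API
curl -X GET "http://192.168.30.128:9200/_cluster/health?pretty"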

Logstash is an open-source data collection engine with real-time processing capabilities. It can dynamically collect data from different sources, process it (filtering and transforming), and output it to a specific destination, preparing the data for more diverse analysis in the future.

If you want to read more, the official Elastic website is probably the best place to start.

Note: this article may take a while to work through. If you are already familiar with installing Elasticsearch, Kibana, and Logstash, you can skip those parts and jump to the ASP.NET Core and NLog section. All Linux commands in this article are run as the root user.

Install Java

// One-click installation of the JDK
yum install java*
// View the JDK version information
java -version
Install Elasticsearch

There are many installation methods on the official website. I use the rpm method here; you can install it however you like.

// Enter the local directory
cd /usr/local
// Create the elasticsearch folder
mkdir elasticsearch
// Enter the elasticsearch folder
cd elasticsearch
// Download the rpm package from the official site
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-5.5.0.rpm
// Install the package
rpm -ivh elasticsearch-5.5.0.rpm
Configuration
// Find where elasticsearch was installed
whereis elasticsearch
// Enter the configuration directory
cd /etc/elasticsearch
// Edit the configuration file
vi elasticsearch.yml

Mainly configure network.host (the local IP address) and http.port (9200 by default). This tutorial runs in single-node mode; for the other parameters, see the official documentation.
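A minimal sketch of the relevant elasticsearch.yml settings, using the host IP from this tutorial:

# Bind to the server's LAN address
network.host: 192.168.30.128
# Default REST API port
http.port: 9200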

 

Start the service
// Open port 9200 in the firewall
firewall-cmd --add-port=9200/tcp --permanent
// Reload the firewall configuration
firewall-cmd --reload
// Enable the service at boot
systemctl enable elasticsearch
// Start the service
systemctl start elasticsearch

Open http://192.168.30.128:9200 in the browser; a JSON response with the node information indicates that Elasticsearch is running.

Install Kibana

Official installation tutorial

// Enter the elasticsearch directory
cd /usr/local/elasticsearch
// Download the 64-bit Kibana rpm package
wget https://artifacts.elastic.co/downloads/kibana/kibana-5.5.0-x86_64.rpm
// Install the package
rpm -ivh kibana-5.5.0-x86_64.rpm
Configuration
// Enter the configuration directory
cd /etc/kibana
// Edit the configuration file
vi kibana.yml

Set the Kibana port (5601), the host address, and the address of the Elasticsearch service that Kibana connects to.
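In Kibana 5.x these map to the following kibana.yml entries (a minimal sketch using this tutorial's addresses):

# Port Kibana serves on
server.port: 5601
# Address Kibana binds to
server.host: "192.168.30.128"
# Elasticsearch instance to query
elasticsearch.url: "http://192.168.30.128:9200"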

Start the service
// Open port 5601 in the firewall
firewall-cmd --add-port=5601/tcp --permanent
// Reload the firewall configuration
firewall-cmd --reload
// Enable the service at boot
systemctl enable kibana
// Start the service
systemctl start kibana

Open http://192.168.30.128:5601 in the browser and go to the Kibana management page.

Install Logstash

Official installation tutorial

// Enter the elasticsearch directory
cd /usr/local/elasticsearch
// Download the Logstash rpm package
wget https://artifacts.elastic.co/downloads/logstash/logstash-5.5.0.rpm
// Install the package
rpm -ivh logstash-5.5.0.rpm
Configuration
// Enter the configuration directory
cd /etc/logstash
// Enter the conf.d directory
cd conf.d
// Add the configuration information
vi nlog.conf

Input: uses TCP to listen for messages on port 8001 of the local machine.

Filter: uses the grok plug-in to parse the custom message format. We recommend the online Grok Debugger for testing your patterns.

Output: writes the processed events to Elasticsearch for storage. A sketch of the full configuration follows.
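A minimal nlog.conf sketch matching the description above; the grok pattern is the one documented in the NLog configuration comments later in this article, and the index name is an illustrative assumption:

input {
    # Listen for NLog's network target on TCP port 8001
    tcp {
        port => 8001
    }
}
filter {
    # Split the '#'-delimited layout produced by NLog into named fields
    grok {
        match => {
            "message" => "#%{DATA:request_time}#%{DATA:node_name}#%{DATA:class_name}#%{DATA:log_level}#%{DATA:call_site}#%{DATA:line_number}#%{DATA:request_url}#%{DATA:request_method}#%{DATA:container_name}#%{DATA:action_name}#%{DATA:log_info}#%{DATA:exception_msg}#"
        }
    }
}
output {
    # Store the parsed events in Elasticsearch (the index name is an assumption)
    elasticsearch {
        hosts => ["192.168.30.128:9200"]
        index => "nlog-%{+YYYY.MM.dd}"
    }
}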

 

Note: There are many official plug-ins for message processing. For details, refer to the official documentation.

Start the service
// Open port 8001 in the firewall
firewall-cmd --add-port=8001/tcp --permanent
// Reload the firewall configuration
firewall-cmd --reload
// Enable the service at boot
systemctl enable logstash
// Start Logstash
systemctl start logstash

 

Logging from ASP.NET Core with NLog

The following is the key part of this article: NLog records the log messages and sends them to Logstash over TCP, Logstash transforms the messages and stores them in Elasticsearch, and Kibana is used to query them.

Create an ASP.NET Core Project

This article uses a .NET Core 1.1 project named Elasticsearch.QuickStart, created in VS2017.

Install the NLog dependency packages through NuGet

NLog.Web.AspNetCore

 

NLog.Extensions.Logging (prerelease version)
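For example, from the Package Manager Console (-Pre pulls the prerelease package):

Install-Package NLog.Web.AspNetCore
Install-Package NLog.Extensions.Logging -Pre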

 

Add the NLog services in Startup.cs
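A minimal sketch of the registration, based on the NLog.Web.AspNetCore API of that generation (the config file name nlog.config is an assumption; later versions use a different setup):

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Logging;
using NLog.Extensions.Logging;
using NLog.Web;

public class Startup
{
    public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
    {
        // Register NLog as an ASP.NET Core logging provider
        loggerFactory.AddNLog();
        // Enable the ASP.NET Core layout renderers (aspnet-request-url, aspnet-mvc-controller, ...)
        app.AddNLogWeb();
        // Load the NLog configuration file from the web root
        env.ConfigureNLog("nlog.config");

        app.UseMvcWithDefaultRoute();
    }
}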

Add the NLog configuration file (in the web root directory):
<?xml version="1.0" encoding="utf-8"?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogLevel="Warn"
      internalLogFile="internal-nlog.txt">

  <extensions>
    <!-- Enable NLog.Web for ASP.NET Core -->
    <add assembly="NLog.Web.AspNetCore"/>
  </extensions>

  <!-- Define the log file directory -->
  <variable name="logDirectory" value="${basedir}/logs/${shortdate}"/>
  <variable name="nodeName" value="node1"/>

  <targets async="true">
    <!-- All-logs target -->
    <target xsi:type="File" name="allfile"
            fileName="${logDirectory}/nlog-all/${shortdate}.log"
            layout="#node1#${longdate}#${logger}#${uppercase:${level}}#${callsite}#${callsite-linenumber}#${aspnet-request-url}#${aspnet-request-method}#${aspnet-mvc-controller}#${aspnet-mvc-action}#${message}#${exception:format=ToString}#"
            keepFileOpen="false"/>

    <!-- Local file log target -->
    <target xsi:type="File" name="ownLog-file"
            fileName="${logDirectory}/nlog-${level}/${shortdate}.log"
            layout="#${longdate}#${nodeName}#${logger}#${uppercase:${level}}#${callsite}#${callsite-linenumber}#${aspnet-request-url}#${aspnet-request-method}#${aspnet-mvc-controller}#${aspnet-mvc-action}#${message}#${exception:format=ToString}#"
            keepFileOpen="false"/>

    <!-- TCP log target (points at the Logstash listener) -->
    <target xsi:type="Network" name="ownLog-tcp"
            keepConnection="false"
            address="tcp://192.168.30.128:8001"
            layout="#${longdate}#${nodeName}#${logger}#${uppercase:${level}}#${callsite}#${callsite-linenumber}#${aspnet-request-url}#${aspnet-request-method}#${aspnet-mvc-controller}#${aspnet-mvc-action}#${message}#${exception:format=ToString}#"/>

    <!-- Grok rule for the layout above:
         #%{DATA:request_time}#%{DATA:node_name}#%{DATA:class_name}#%{DATA:log_level}#%{DATA:call_site}#%{DATA:line_number}#%{DATA:request_url}#%{DATA:request_method}#%{DATA:container_name}#%{DATA:action_name}#%{DATA:log_info}#%{DATA:exception_msg}# -->

    <!-- Blackhole target for discarded logs -->
    <target xsi:type="Null" name="blackhole"/>
  </targets>

  <!-- Log levels: Trace -> Debug -> Info -> Warn -> Error -> Fatal -->
  <rules>
    <!-- All logs, including Microsoft logs -->
    <logger name="*" minlevel="Trace" writeTo="allfile"/>
    <!-- Exclude Microsoft logs from the custom targets -->
    <logger name="Microsoft.*" minlevel="Trace" writeTo="blackhole" final="true"/>
    <logger name="*" minlevel="Debug" writeTo="ownLog-file"/>
    <logger name="*" minlevel="Info" writeTo="ownLog-tcp"/>
  </rules>
</nlog>

Note: the address in the TCP target points to the address that Logstash listens on, and the grok template for parsing the layout is provided in a comment for reference.

 

Test NLog Logging
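A quick way to test, assuming a hypothetical HomeController in the sample project; any message at Info or above is routed to the ownLog-tcp target and on to Logstash:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

public class HomeController : Controller
{
    private readonly ILogger<HomeController> _logger;

    // ILogger<T> is injected by the ASP.NET Core logging infrastructure
    public HomeController(ILogger<HomeController> logger)
    {
        _logger = logger;
    }

    public IActionResult Index()
    {
        // Flows through NLog's ownLog-tcp target to Logstash, then Elasticsearch
        _logger.LogInformation("Hello from Elasticsearch.QuickStart!");
        return Ok("logged");
    }
}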

View the final result in Kibana

 

Summary

This article is just an introductory tutorial and is meant to serve as an example; for detailed functionality, please refer to the official documentation. Elasticsearch, Kibana, and Logstash are very powerful, and I have only just started working with them. If anything here is wrong, I hope you will forgive me and correct me. If this article helps you, please give it a thumbs up. Thank you.

Reference

1: LogStash + ElasticSearch (CentOS)

2: Use ElasticSearch, Kibana, ASP.NET Core, and Docker to visualize data

3: Elastic Stack and product documentation

4: Installation and configuration of Elasticsearch on CentOS 7

5: NLog official documentation

6: Build an ELKB log collection system from scratch
