Implementing an ASP.NET Core Distributed Logging System with Elasticsearch, Kibana, Logstash, and NLog


Elasticsearch official website

Elasticsearch Documentation

NLog.Targets.ElasticSearch Package

Elasticsearch Introduction

Elasticsearch, the core component, is a document repository with powerful indexing capabilities whose data can be searched through a REST API.

It is written in Java and based on Apache Lucene, although these details are hidden behind the API.

Because its fields are indexed, any stored (indexed) document can be found and rolled up through many different aggregations.

However, Elasticsearch offers more than powerful search over these indexed documents.

It is fast, distributed, horizontally scalable, supports real-time document storage and analysis, and scales to hundreds of servers and petabytes of indexed data.

At the same time, as the core of the Elastic Stack (also known as ELK), it is complemented by powerful applications such as Logstash, Kibana, and more.

Kibana is a web application dedicated to providing powerful visual queries against data in Elasticsearch.

With Kibana, you can easily create queries, charts, and dashboards for the data indexed in Elasticsearch.


Elasticsearch exposes a REST API, and you'll find that many documentation examples are HTTP calls you can try with tools such as curl or Postman.

Of course, clients for this API have been written in many languages, including .NET, Java, Python, Ruby, and JavaScript.

Logstash is an open source data collection engine with real-time processing capabilities. It can dynamically collect data from different sources, process it (filter, transform), and output it in a unified way to a specific destination, preparing the data for more diverse analysis later.

Installing the Java Environment

Lazy one-click installation:

```shell
# install the JDK via yum
yum install java*
# view the JDK version information
java -version
```

Elasticsearch Installation

The official site documents several installation methods; I use the RPM install here, but you can install in whatever way suits you.

```shell
# go to the local directory
cd /usr/local/
# create the elasticsearch folder
mkdir elasticsearch
# go into the elasticsearch folder
cd elasticsearch
# start the download
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-5.5.0.rpm
# start the installation
rpm -ivh elasticsearch-5.5.0.rpm
```
Configuration
```shell
# find the installation directory
whereis elasticsearch
# enter the configuration directory
cd /etc/elasticsearch
# edit the profile
vi elasticsearch.yml
```

The main settings are network.host (the machine's IP) and http.port (default 9200). This is single-node mode; refer to the official documentation for the other parameters.
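For a single-node setup, the relevant lines of elasticsearch.yml look roughly like this (the IP is the example host used throughout this article; substitute your own):

```yaml
# bind to the machine's IP so Kibana and Logstash can reach it
network.host: 192.168.30.128
# REST API port (default)
http.port: 9200
```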

Start the service
```shell
# open port 9200
firewall-cmd --add-port=9200/tcp --permanent
# reload the firewall configuration
firewall-cmd --reload
# enable the service at boot
systemctl enable elasticsearch
# start the service
systemctl start elasticsearch
```

Open http://192.168.30.128:9200 in the browser; a response like the one shown indicates that startup was successful.

Kibana

Kibana Documentation

Installation

Official installation tutorial

```shell
# go to the elasticsearch directory
cd /usr/local/elasticsearch
# download the Kibana 64-bit RPM package
wget https://artifacts.elastic.co/downloads/kibana/kibana-5.5.0-x86_64.rpm
# install Kibana
rpm -ivh kibana-5.5.0-x86_64.rpm
```
Configuration
```shell
# go to the installation directory
cd /etc/kibana
# edit the configuration file
vi kibana.yml
```

Set the port number to 5601, the host address to "192.168.30.128", and the Elasticsearch service address to "http://192.168.30.128:9200".
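Those three settings map onto kibana.yml like this (values taken from this article's example host):

```yaml
# port Kibana listens on (default)
server.port: 5601
# bind address of the Kibana server
server.host: "192.168.30.128"
# the Elasticsearch instance Kibana queries
elasticsearch.url: "http://192.168.30.128:9200"
```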

Start the service
```shell
# open port 5601
firewall-cmd --add-port=5601/tcp --permanent
# reload the firewall configuration
firewall-cmd --reload
# enable the service at boot
systemctl enable kibana
# start the service
systemctl start kibana
```

Open http://192.168.30.128:5601 in a browser to reach the Kibana management interface.

LogStash

Logstash Documentation

Installation

Official installation tutorial

```shell
# go to the elasticsearch directory
cd /usr/local/elasticsearch
# download the Logstash RPM package
wget https://artifacts.elastic.co/downloads/logstash/logstash-5.5.0.rpm
# install the RPM package
rpm -ivh logstash-5.5.0.rpm
```
Configuration
```shell
# enter the installation directory
cd /etc/logstash
# enter the conf.d directory
cd conf.d
# create the configuration file
vi nlog.conf
```

Input: use TCP to listen for messages on the machine's port 8001.

Filter: use the grok plugin with a custom message format; the Grok Debugger site is recommended for testing patterns online.

Output: use Elasticsearch as the data store.

Note: there are very rich plugins for message processing, which can be found in the official documentation.
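Putting those three parts together, a minimal nlog.conf might look like the following sketch. The grok pattern is the one given in the comments of the NLog configuration later in this article, matching its `#`-separated layout; the index name `nlog-%{+YYYY.MM.dd}` is an assumption of this sketch, not taken from the original:

```conf
input {
  # listen for NLog messages on TCP port 8001
  tcp {
    port => 8001
  }
}

filter {
  # split the '#'-delimited NLog layout into named fields
  grok {
    match => {
      "message" => "#%{DATA:request_time}#%{DATA:node_name}#%{DATA:class_name}#%{DATA:log_level}#%{DATA:call_site}#%{DATA:line_number}#%{DATA:request_url}#%{DATA:request_method}#%{DATA:container_name}#%{DATA:action_name}#%{DATA:log_info}#%{DATA:exception_msg}#"
    }
  }
}

output {
  # store the parsed events in Elasticsearch
  elasticsearch {
    hosts => ["192.168.30.128:9200"]
    index => "nlog-%{+YYYY.MM.dd}"
  }
}
```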

Start the service
```shell
# open port 8001
firewall-cmd --add-port=8001/tcp --permanent
# reload the firewall configuration
firewall-cmd --reload
# enable the service at boot
systemctl enable logstash
# start Logstash
systemctl start logstash
```


ASP.NET Core Combined with NLog for Logging

The following is the focus of this article: logging through NLog and sending the messages to Logstash, which transforms them and stores them in Elasticsearch, ready to be queried in Kibana.

Creating an ASP.NET Core Project

This article uses a .NET Core 1.1 project named Elasticsearch.quickstart, created with VS2017.

Install the NLog dependency packages through NuGet:

NLog.Web.AspNetCore

NLog.Extensions.Logging (pre-release version)

Add the NLog service in Startup.cs
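For an ASP.NET Core 1.1 project, wiring NLog in typically looks like the sketch below. The method names come from the NLog.Web.AspNetCore / NLog.Extensions.Logging packages of that era; treat the exact calls as an assumption and check the package documentation for your version:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Logging;
using NLog.Extensions.Logging;
using NLog.Web;

public class Startup
{
    public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
    {
        // route ASP.NET Core logging through NLog
        loggerFactory.AddNLog();
        // register NLog.Web's aspnet-* layout renderers (request url, MVC controller/action, ...)
        app.AddNLogWeb();
        // load the nlog.config file from the web root
        env.ConfigureNLog("nlog.config");

        app.UseMvc();
    }
}
```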

Create the NLog configuration file (in the web root directory):
```xml
<?xml version="1.0" encoding="utf-8"?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogLevel="Warn"
      internalLogFile="internal-nlog.txt">

  <extensions>
    <!-- enable NLog.Web for ASP.NET Core -->
    <add assembly="NLog.Web.AspNetCore"/>
  </extensions>

  <!-- define various log targets -->
  <!-- define the log file directory -->
  <variable name="logDirectory" value="${basedir}/logs/${shortdate}"/>
  <variable name="nodeName" value="Node1"/>

  <targets async="true">
    <!-- all-logs target -->
    <target xsi:type="File" name="allfile"
            fileName="${logDirectory}/nlog-all/${shortdate}.log"
            layout="#node1#${longdate}#${logger}#${uppercase:${level}}#${callsite}#${callsite-linenumber}#${aspnet-request-url}#${aspnet-request-method}#${aspnet-mvc-controller}#${aspnet-mvc-action}#${message}#${exception:format=tostring}#"
            keepFileOpen="false"/>

    <!-- local file log target -->
    <target xsi:type="File" name="ownLog-file"
            fileName="${logDirectory}/nlog-${level}/${shortdate}.log"
            layout="#${longdate}#${nodeName}#${logger}#${uppercase:${level}}#${callsite}#${callsite-linenumber}#${aspnet-request-url}#${aspnet-request-method}#${aspnet-mvc-controller}#${aspnet-mvc-action}#${message}#${exception:format=tostring}#"
            keepFileOpen="false"/>

    <!-- TCP log target -->
    <target xsi:type="Network" name="ownLog-tcp"
            keepConnection="false"
            address="tcp://192.168.30.128:8001"
            layout="#${longdate}#${nodeName}#${logger}#${uppercase:${level}}#${callsite}#${callsite-linenumber}#${aspnet-request-url}#${aspnet-request-method}#${aspnet-mvc-controller}#${aspnet-mvc-action}#${message}#${exception:format=tostring}#"/>

    <!-- grok rule matching the layouts above:
         #%{DATA:request_time}#%{DATA:node_name}#%{DATA:class_name}#%{DATA:log_level}#%{DATA:call_site}#%{DATA:line_number}#%{DATA:request_url}#%{DATA:request_method}#%{DATA:container_name}#%{DATA:action_name}#%{DATA:log_info}#%{DATA:exception_msg}# -->

    <!-- black hole -->
    <target xsi:type="Null" name="blackhole"/>
  </targets>

  <!-- log levels: Trace -> Debug -> Info -> Warn -> Error -> Fatal -->
  <!-- log rules -->
  <rules>
    <!-- all logs, including Microsoft logs -->
    <logger name="*" minlevel="Trace" writeTo="allfile"/>
    <!-- custom logs, excluding Microsoft logs -->
    <logger name="Microsoft.*" minlevel="Trace" writeTo="blackhole" final="true"/>
    <logger name="*" minlevel="Debug" writeTo="ownLog-file"/>
    <logger name="*" minlevel="Info" writeTo="ownLog-tcp"/>
  </rules>
</nlog>
```

Note: the address in the TCP target points to the address Logstash is listening on, and the grok template information is given in the comments.

Test NLog logging
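A quick way to exercise the whole pipeline is to log from a controller action, including an exception path. This sketch uses standard ILogger injection; the controller name and messages are illustrative, not from the original project:

```csharp
using System;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

public class HomeController : Controller
{
    private readonly ILogger<HomeController> _logger;

    public HomeController(ILogger<HomeController> logger)
    {
        _logger = logger;
    }

    public IActionResult Index()
    {
        // Info-level messages reach the ownLog-tcp target, and therefore Logstash
        _logger.LogInformation("Index page visited");
        try
        {
            throw new InvalidOperationException("test exception");
        }
        catch (Exception ex)
        {
            // the exception text ends up in the exception_msg grok field
            _logger.LogError(0, ex, "Something went wrong");
        }
        return View();
    }
}
```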

See the final effect in Kibana

Reference

1. Logstash + Elasticsearch simple usage (CentOS)

2. Visualizing data with Elasticsearch, Kibana, ASP.NET Core, and Docker

3. Elastic Stack and product documentation

4. Elasticsearch installation and configuration on CentOS 7

5. NLog official documentation

6. Building an ELKB log collection system from scratch
