Elasticsearch + NLog + ELMAH: Implementing ASP.NET Distributed Log Management
Elasticsearch Introduction
ElasticSearch is a search server based on Lucene.
It provides a distributed, multi-tenant-capable full-text search engine with a RESTful web interface.
Elasticsearch is developed in Java, released as open source under the terms of the Apache License, and is the second most popular enterprise search engine.
We want our search solution to be fast; we want a zero-configuration, completely free search schema; we want to index data simply by sending JSON over HTTP; we want our search server to be always available; we want to start with one machine and scale out to hundreds; we want real-time search; we want simple multi-tenancy; and we want to build a cloud-ready solution. Elasticsearch is intended to solve all of these problems and more.
Comparing Elasticsearch's schema with other databases:
Third-party access methods for ElasticSearch:
The environment is CentOS 6.4. There are several installation methods; here we download the package (version 1.7.1) directly from the official website, extract it, enter the directory, and execute:
bin/elasticsearch
Check that the service is working correctly:
curl -X GET http://localhost:9200/
Elasticsearch listens on port 9200 by default; a JSON response containing version information indicates that it is running normally.
Elasticsearch is highly scalable, as the sample data shards show:
Install the elasticsearch-head front end:
elasticsearch/bin/plugin -install mobz/elasticsearch-head
Open http://localhost:9200/_plugin/head/ and you can see the following UI. The IP configured here is 192.168.0.103; the plugin ships in multiple languages and has automatically switched to the Chinese UI.
Here we also install Bigdesk, a front end for managing nodes; its installation is similar, and the plugin mechanism is likewise recommended:
$ ./bin/plugin -install lukas-vlcek/bigdesk/<bigdesk_version>
After opening http://192.168.0.103:9200/_plugin/bigdesk/, the UI looks like this:
There are other front-end projects for managing Elasticsearch clusters more conveniently, which we will not cover here.
Elasticsearch integration with ASP.NET
OK, with ELMAH already installed in the ASP.NET project, we now install Elmah.Io.ElasticSearch, version 1.1.0.27 here:
PM> Install-Package Elmah.Io.ElasticSearch
In the <elmah> configuration section of Web.config, we configure the index name elmahcurrent:
<elmah>
  <!-- See http://code.google.com/p/elmah/wiki/SecuringErrorLogPages for
       more information on remote access and securing ELMAH. -->
  <security allowRemoteAccess="true" />
  <errorLog type="Elmah.Io.ElasticSearch.ElasticSearchErrorLog, Elmah.Io.ElasticSearch"
            connectionStringName="ElmahIoElasticSearch"
            defaultIndex="elmahcurrent" />
</elmah>
And add the connection string:
<connectionStrings>
  <add name="ElmahIoElasticSearch" connectionString="http://192.168.0.103:9200/" />
</connectionStrings>
Let's deliberately throw an exception by visiting the non-existent page http://localhost:1960/KK; we can then see it in the head front end:
The JSON data is fully recorded and can, of course, also be queried.
Next, let's configure NLog to output its logs to ElasticSearch. First install the NLog.Targets.ElasticSearch package, version 1.0.14:
PM> Install-Package NLog.Targets.ElasticSearch
The corresponding NLog.config file looks like this; note the ElasticSearch target and the rule that writes to it:
<?xml version="1.0" encoding="utf-8"?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <extensions>
    <add assembly="NLog.Targets.ElasticSearch" />
  </extensions>
  <targets async="true">
    <target name="elastic" xsi:type="ElasticSearch"
            uri="http://192.168.0.103:9200/"
            index="devlogging" documentType="LogEvent" />
    <target name="asyncFile" xsi:type="AsyncWrapper">
      <target xsi:type="File" name="f"
              fileName="${basedir}/logs/${shortdate}.log"
              layout="${longdate} ${logger} ${uppercase:${level}} ${message} ${exception:format=tostring,stacktrace,method:maxInnerExceptionLevel=5:innerFormat=tostring}" />
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="f" />
    <logger name="*" minlevel="Trace" writeTo="elastic" />
  </rules>
</nlog>
This allows us to freely output non-exception logs to Elasticsearch as well, such as the logs we record for Web API requests:
devlogging is the index name we configured in the configuration file. We also use NLog to write a file log.
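To make the idea concrete, each log line NLog ships to ES is simply one JSON document in the devlogging index. The field names below are illustrative assumptions for this sketch, not the exact mapping NLog.Targets.ElasticSearch emits:

```python
import json

# Illustrative shape of one log-event document in the "devlogging" index.
# Field names here are assumptions; the real mapping written by
# NLog.Targets.ElasticSearch may differ.
log_event = {
    "@timestamp": "2015-08-10T12:00:00",
    "level": "Info",
    "logger": "WebApiRequestLogger",  # hypothetical logger name
    "message": "GET /api/values responded 200",
}

# Serialize to the JSON body that would be indexed into Elasticsearch.
body = json.dumps(log_event, sort_keys=True)
print(body)
```

Because every event is a flat JSON document, the head front end (and any REST query) can filter on any of these fields directly.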
Search:
A REST-based request queries by ID:
http://localhost:9200/<index>/<type>/<id>
For example:
http://192.168.0.103:9200/devlogging/logevent/AU9a4zu6oaP7IVhrhcmO
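The URL pattern is simple enough to assemble programmatically; here is a minimal Python sketch, using the host, index, type, and ID values from the example above:

```python
# Build the "get document by ID" URL: http://<host>:9200/<index>/<type>/<id>
def doc_url(host, index, doc_type, doc_id):
    return "http://{0}:9200/{1}/{2}/{3}".format(host, index, doc_type, doc_id)

url = doc_url("192.168.0.103", "devlogging", "logevent", "AU9a4zu6oaP7IVhrhcmO")
print(url)  # http://192.168.0.103:9200/devlogging/logevent/AU9a4zu6oaP7IVhrhcmO
```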
There are also some search examples:
Index
$ curl -XPUT http://localhost:9200/twitter/tweet/2 -d '{
    "user": "kimchy",
    "post_date": "2009-11-15T14:12:12",
    "message": "You know, for Search"
}'
Query using the Lucene query-string syntax:
$ curl -XGET http://localhost:9200/twitter/tweet/_search?q=user:kimchy
Query using the Query DSL:
$ curl -XGET http://localhost:9200/twitter/tweet/_search -d '{
    "query": {
        "term": { "user": "kimchy" }
    }
}'
Another Query DSL query, this time a range query:
$ curl -XGET 'http://localhost:9200/twitter/_search?pretty=true' -d '{
    "query": {
        "range": {
            "post_date": {
                "from": "2009-11-15T13:00:00",
                "to": "2009-11-15T14:30:00"
            }
        }
    }
}'
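Since Query DSL bodies are plain JSON, they can also be built in code rather than hand-written; a small Python sketch that assembles the same range query:

```python
import json

# Assemble the range query from the curl example as a Python dict,
# then serialize it into the JSON body that curl -d would send.
query = {
    "query": {
        "range": {
            "post_date": {
                "from": "2009-11-15T13:00:00",
                "to": "2009-11-15T14:30:00",
            }
        }
    }
}

body = json.dumps(query)
print(body)
```

Building the body as a dict avoids quoting mistakes and makes it easy to parameterize the date range from application code.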
We can configure multiple applications to output their logs to ES, making it easy to query and analyze them centrally.