Logs are an important tool for analyzing problems in production. Usually we write logs to the console or to local files and troubleshoot by searching the local logs for keywords. But more and more projects use a distributed architecture, so logs end up spread across multiple servers or files; analyzing one problem may require viewing several log files, and if the related projects are not maintained by a single team, the communication cost grows linearly. Aggregating the logs of every system and linking all the log entries of one transaction request through a keyword is an effective way to analyze problems in a distributed system.
ELK (Elasticsearch + Logstash + Kibana) is the most commonly used log analysis stack: Logstash collects the logs, Elasticsearch stores and searches them, and Kibana queries and displays them. We use ELK as the storage and analysis system for our logs and assign a RequestID to each request to link its related log entries. The overall structure of ELK is shown below:
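Roughly, the pieces described above fit together like this (a textual sketch; the arrows show the direction of log data and of queries):

  web project 1 (logs tagged with a RequestID) --\
                                                  +--> Logstash --> Elasticsearch <-- Kibana (query/display)
  web project 2 (logs tagged with a RequestID) --/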
1. Install Logstash
Logstash depends on the JDK, so install the Java environment before installing Logstash.
Download JDK:
Download it from Oracle's website: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
Download the JDK package that matches your operating system; this walkthrough uses jdk-8u45-linux-x64.tar.gz.
Upload the file to the server and execute:
# mkdir /usr/local/java
# tar -zxf jdk-8u45-linux-x64.tar.gz -C /usr/local/java/
Configure the Java environment by adding the following to your shell profile (for example /etc/profile) and sourcing it:
export JAVA_HOME=/usr/local/java/jdk1.8.0_45
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$CLASSPATH
Run the java -version command; if it prints the Java version information, the JDK is configured successfully.
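For reference, a successful check prints something like the following (the exact build numbers depend on the JDK release you installed):

# java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)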
Download Logstash:
wget https://download.elastic.co/logstash/logstash/logstash-2.4.0.tar.gz
tar -xzvf logstash-2.4.0.tar.gz
Enter the installation directory: cd #{dir}/logstash-2.4.0
Create a Logstash test configuration file:
vim test.conf
Edit the contents as follows:
input { stdin {} }
output { stdout { codec => rubydebug {} } }
Run the Logstash test:
bin/logstash -f test.conf
Once the startup messages appear, Logstash is up and running.
Enter "hello world". Because we configured stdout as the output, the console echoes the log event back; seeing that output means the test succeeded. A sample is shown below.
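For "hello world" the console prints the event in rubydebug form, roughly like this (the timestamp and host fields will reflect your machine):

{
       "message" => "hello world",
      "@version" => "1",
    "@timestamp" => "2016-08-31T00:00:00.000Z",
          "host" => "localhost"
}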
2. Install Elasticsearch
Download the installation package:
wget https://download.elastic.co/elasticsearch/release/org/elasticsearch/distribution/tar/elasticsearch/2.4.0/elasticsearch-2.4.0.tar.gz
Unzip and configure:
tar -xzvf elasticsearch-2.4.0.tar.gz
cd #{dir}/elasticsearch-2.4.0
vim config/elasticsearch.yml
Modify:
path.data: /data/es          # data path
path.logs: /data/logs/es     # log path
network.host: <server IP>    # the machine's own address
http.port: 9200              # port
Configure the user and directories that Elasticsearch runs under (Elasticsearch refuses to start as root, so we create a dedicated user):
groupadd elsearch
useradd elsearch -g elsearch -p elasticsearch
chown -R elsearch:elsearch elasticsearch-2.4.0
mkdir /data/es
mkdir /data/logs/es
chown -R elsearch:elsearch /data/es
chown -R elsearch:elsearch /data/logs/es
Start Elasticsearch:
su elsearch
bin/elasticsearch
Open http://<server IP>:9200 in a browser; if Elasticsearch answers with its status information, the installation succeeded.
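You can also check from the command line; Elasticsearch responds with a JSON banner along these lines (fields abbreviated here):

curl http://<server IP>:9200

{
  "name" : "...",
  "cluster_name" : "elasticsearch",
  "version" : { "number" : "2.4.0", ... },
  "tagline" : "You Know, for Search"
}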
To integrate Logstash and Elasticsearch, modify the Logstash configuration to:
input { stdin {} }
output {
  elasticsearch { hosts => "elasticsearchIP:9200" index => "logstash-test" }
  stdout { codec => rubydebug {} }
}
Start Logstash again and enter any text, for example "hello elasticsearch". If you can find the text you just entered by searching Elasticsearch, the integration works.
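A quick way to run that search is Elasticsearch's _search endpoint, using the host and index configured above:

curl 'http://elasticsearchIP:9200/logstash-test/_search?q=hello&pretty'

The response should contain a hit whose message field holds the text you typed.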
However, querying and viewing results through Elasticsearch's native interface is neither convenient nor intuitive, so next we configure Kibana, a far more convenient query and analysis tool.
3. Install Kibana
Download the installation package:
wget https://download.elastic.co/kibana/kibana/kibana-4.6.1-linux-x86_64.tar.gz
Unpack Kibana and enter the extracted directory.
Open config/kibana.yml and modify the following:
# startup port; the default was changed because of port restrictions
server.port: 8601
# IP the service runs on
server.host: "<local IP>"
# Elasticsearch address
elasticsearch.url: "http://elasticsearchIP:9200"
Start the program:
bin/kibana
Open the configured IP:port in a browser; if prompted, create an index pattern covering the logstash-* indices, then search in Discover for the text you entered earlier. The matching entries are displayed clearly.
At this point our ELK environment is configured; next we use a Java web project to test logging into ELK.
4. Create a Web project
This is an ordinary Maven Java web project. To test log continuity across a distributed system, we have the project call itself n times: we deploy two copies of the project and make them call each other. The key code is as follows:
@RequestMapping("http_client")
@Controller
public class HttpClientTestController {

    @Autowired
    private HttpClientTestBo httpClientTestBo;

    @RequestMapping(method = RequestMethod.POST)
    @ResponseBody
    public BaseResult doPost(@RequestBody HttpClientTestResult result) {
        HttpClientTestResult testPost = httpClientTestBo.testPost(result);
        return testPost;
    }
}
@Service
public class HttpClientTestBo {

    private static Logger logger = LoggerFactory.getLogger(HttpClientTestBo.class);

    @Value("${test_http_client_url}")
    private String testHttpClientUrl;

    public HttpClientTestResult testPost(HttpClientTestResult result) {
        logger.info(JSONObject.toJSONString(result));
        result.setCount(result.getCount() + 1);
        if (result.getCount() <= 3) {
            // Pass the current RequestID along in the outgoing request header
            Map<String, String> headerMap = new HashMap<String, String>();
            String requestId = RequestIdUtil.requestIdThreadLocal.get();
            headerMap.put(RequestIdUtil.REQUEST_ID_KEY, requestId);

            Map<String, String> paramMap = new HashMap<String, String>();
            paramMap.put("status", result.getStatus() + "");
            paramMap.put("errorCode", result.getErrorCode());
            paramMap.put("message", result.getMessage());
            paramMap.put("count", result.getCount() + "");

            String resultString = JsonHttpClientUtil.post(testHttpClientUrl, headerMap, paramMap, "UTF-8");
            logger.info(resultString);
        }
        logger.info(JSONObject.toJSONString(result));
        return result;
    }
}
To link the calls together, we configure a RequestId filter in web.xml that creates the RequestId:
<filter>
    <filter-name>requestIdFilter</filter-name>
    <filter-class>com.virxue.baseweb.utils.RequestIdFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>requestIdFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
public class RequestIdFilter implements Filter {

    private static final Logger logger = LoggerFactory.getLogger(RequestIdFilter.class);

    /* (non-Javadoc)
     * @see javax.servlet.Filter#init(javax.servlet.FilterConfig)
     */
    public void init(FilterConfig filterConfig) throws ServletException {
        logger.info("RequestIdFilter init");
    }

    /* (non-Javadoc)
     * @see javax.servlet.Filter#doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain)
     */
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        String requestId = RequestIdUtil.getRequestId((HttpServletRequest) request);
        MDC.put("requestId", requestId);
        chain.doFilter(request, response);
        RequestIdUtil.requestIdThreadLocal.remove();
        MDC.remove("requestId");
    }

    /* (non-Javadoc)
     * @see javax.servlet.Filter#destroy()
     */
    public void destroy() {
    }
}
public class RequestIdUtil {

    public static final String REQUEST_ID_KEY = "requestId";
    public static ThreadLocal<String> requestIdThreadLocal = new ThreadLocal<String>();
    private static final Logger logger = LoggerFactory.getLogger(RequestIdUtil.class);

    /**
     * Get the requestId
     * @Title getRequestId
     * @Description TODO
     * @return
     * @author sunhaojie [email protected]
     * @date August 31, 2016 7:58:28 AM
     */
    public static String getRequestId(HttpServletRequest request) {
        String requestId = null;
        String parameterRequestId = request.getParameter(REQUEST_ID_KEY);
        String headerRequestId = request.getHeader(REQUEST_ID_KEY);
        // If neither the parameters nor the headers carry a requestId, generate one
        if (parameterRequestId == null && headerRequestId == null) {
            logger.info("Request parameter and header have no requestId entry");
            requestId = UUID.randomUUID().toString();
        } else {
            requestId = parameterRequestId != null ? parameterRequestId : headerRequestId;
        }
        requestIdThreadLocal.set(requestId);
        return requestId;
    }
}
We use logback as the logging framework and its MDC class, which lets us output the requestId anywhere without intrusive code changes. The configuration is as follows:
<configuration>
    <appender name="logfile" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <Encoding>UTF-8</Encoding>
        <file>${log_base}/java-base-web.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <FileNamePattern>${log_base}/java-base-web-%d{yyyy-MM-dd}-%i.log</FileNamePattern>
            <maxHistory>10</maxHistory>
            <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <MaxFileSize>200MB</MaxFileSize>
            </timeBasedFileNamingAndTriggeringPolicy>
        </rollingPolicy>
        <layout class="ch.qos.logback.classic.PatternLayout">
            <pattern>%d^|^%X{requestId}^|^%-5level^|^%logger{36} %M^|^%msg%n</pattern>
        </layout>
    </appender>
    <root level="info">
        <appender-ref ref="logfile"/>
    </root>
</configuration>
The log format here uses "^|^" as the field delimiter to make splitting in Logstash easy; an example line is shown below. Deploy the two web projects on the test server, modify each one's log output location, and change the URL call links so that the projects call each other.
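With the pattern above, a single log line looks roughly like this (the values are illustrative):

2016-08-31 08:00:00,000^|^550e8400-e29b-41d4-a716-446655440000^|^INFO ^|^c.v.baseweb.bo.HttpClientTestBo testPost^|^{"count":1}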
5. Modify Logstash to read the projects' output logs
Add stdin.conf, which reads as follows:
input {
  file {
    path => ["/data/logs/java-base-web1/java-base-web.log", "/data/logs/java-base-web2/java-base-web.log"]
    type => "logs"
    start_position => "beginning"
    codec => multiline {
      pattern => "^\[\d{4}-\d{1,2}-\d{1,2}\s\d{1,2}:\d{1,2}:\d{1,2}"
      negate => true
      what => "next"
    }
  }
}
filter {
  mutate {
    split => ["message", "^|^"]
    add_field => {
      "messageJson" => "{datetime:%{[message][0]}, requestId:%{[message][1]}, level:%{[message][2]}, class:%{[message][3]}, content:%{[message][4]}}"
    }
    remove_field => ["message"]
  }
}
output {
  elasticsearch {
    hosts => "10.160.110.48:9200"
    index => "logstash-%{type}"
  }
  stdout { codec => rubydebug {} }
}
Here path is the location of the log files; codec => multiline handles exception logs so that a stack trace and its log header end up in the same entry; the filter section splits the log content and rebuilds it as JSON-style fields, which makes querying and analysis convenient. A sample of the result follows.
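Given the sample log line shown earlier, the assembled messageJson field would come out roughly as:

{datetime:2016-08-31 08:00:00,000, requestId:550e8400-e29b-41d4-a716-446655440000, level:INFO , class:c.v.baseweb.bo.HttpClientTestBo testPost, content:{"count":1}}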
Test it:
Use Postman to simulate a call that triggers a server-side exception:
Searching the interface for "call interface exception" returns two entries in total.
Searching again with the RequestID from one of those entries shows the execution flow between the calling and called systems, which makes it easy to trace the error.
This completes our experiment with configuring ELK for log analysis. Many details could still be handled better; discussion and feedback are welcome.