1. Collecting Tomcat logs is indeed more complex than the previous requirements. I set up a Tomcat environment, and it immediately produced the following error, which I'll post first:
Jan 05, 2017 10:53:35 AM org.apache.catalina.core.AprLifecycleListener lifecycleEvent
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
Jan 05, 2017 10:53:35 AM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["http-bio-8088"]
Jan 05, 2017 10:53:35 AM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["ajp-bio-8009"]
Jan 05, 2017 10:53:35 AM org.apache.coyote.AbstractProtocol init
SEVERE: Failed to initialize end point associated with ProtocolHandler ["ajp-bio-8009"]
java.net.BindException: Address already in use (Bind failed) <null>:8009
    at org.apache.tomcat.util.net.JIoEndpoint.bind(JIoEndpoint.java:413)
    at org.apache.tomcat.util.net.AbstractEndpoint.init(AbstractEndpoint.java:665)
    at org.apache.coyote.AbstractProtocol.init(AbstractProtocol.java:452)
    at org.apache.catalina.startup.Catalina.load(Catalina.java:667)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at org.apache.tomcat.util.net.JIoEndpoint.bind(JIoEndpoint.java:400)
    ... more
2. Analyze the structure we need:
From the log above, the data we need are: the timestamp, the class name, and the log message.
All we need to do is merge the lines that share the same timestamp (including the multi-line stack trace) into a single event, and then parse it.
### Tip: because the Tomcat log format is tricky, we can refer to the default Java log patterns shipped with Logstash:
[root@monitor patterns]# pwd
/test/logstash-5.0.0/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns
[root@monitor patterns]# cat java
JAVACLASS (?:[a-zA-Z$_][a-zA-Z$_0-9]*\.)*[a-zA-Z$_][a-zA-Z$_0-9]*
#Space is an allowed character to match special cases like 'Native Method' or 'Unknown Source'
JAVAFILE (?:[A-Za-z0-9_. -]+)
#Allow special <init>, <clinit> methods
JAVAMETHOD (?:(<(?:cl)?init>)|[a-zA-Z$_][a-zA-Z$_0-9]*)
#Line number is optional in special cases 'Native method' or 'Unknown source'
JAVASTACKTRACEPART %{SPACE}at %{JAVACLASS:class}\.%{JAVAMETHOD:method}\(%{JAVAFILE:file}(?::%{NUMBER:line})?\)
# Java Logs
JAVATHREAD (?:[A-Z]{2}-Processor[\d]+)
JAVACLASS (?:[a-zA-Z0-9-]+\.)+[A-Za-z0-9$]+
JAVAFILE (?:[A-Za-z0-9_.-]+)
JAVASTACKTRACEPART at %{JAVACLASS:class}\.%{WORD:method}\(%{JAVAFILE:file}:%{NUMBER:line}\)
JAVALOGMESSAGE (.*)
# MMM dd, yyyy HH:mm:ss eg: Jan 9, 2014 7:13:13 AM
CATALINA_DATESTAMP %{MONTH} %{MONTHDAY}, 20%{YEAR} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) (?:AM|PM)
# yyyy-MM-dd HH:mm:ss,SSS ZZZ eg: 2014-01-09 17:32:25,527 -0800
TOMCAT_DATESTAMP 20%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) %{ISO8601_TIMEZONE}
CATALINALOG %{CATALINA_DATESTAMP:timestamp} %{JAVACLASS:class} %{JAVALOGMESSAGE:logmessage}
# 2014-01-09 20:03:28,269 -0800 | ERROR | com.example.service.ExampleService - something completely unexpected happened...
TOMCATLOG %{TOMCAT_DATESTAMP:timestamp} \| %{LOGLEVEL:level} \| %{JAVACLASS:class} - %{JAVALOGMESSAGE:logmessage}
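For example, taking the first line of the error log at the top ("Jan 05, 2017 10:53:35 AM org.apache.catalina.core.AprLifecycleListener lifecycleEvent"), the CATALINALOG pattern would break it down roughly as follows. This is just an illustrative breakdown of the captures, not actual Logstash output:
timestamp  => Jan 05, 2017 10:53:35 AM                          (CATALINA_DATESTAMP)
class      => org.apache.catalina.core.AprLifecycleListener     (JAVACLASS)
logmessage => lifecycleEvent                                    (JAVALOGMESSAGE)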
With these patterns as a reference, we can simply merge the lines that belong to the same timestamp:
[root@controller etc]# cat tomcat.conf
input { stdin {} }
filter {
    multiline {
        pattern => "(^%{CATALINA_DATESTAMP})"
        negate => true
        what => "previous"
    }
    if "_grokparsefailure" in [tags] {
        drop {}
    }
    grok {
        match => ["message", "%{CATALINALOG}"]
    }
    date {
        match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS Z", "MMM dd, yyyy HH:mm:ss a"]
    }
}
output { stdout { codec => rubydebug } }
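One caveat: the standalone multiline filter plugin is deprecated in newer Logstash releases and may not be installed by default; the recommended replacement is the multiline codec on the input. A minimal sketch of the same merge done with the codec, keeping the rest of the pipeline unchanged, might look like this:
input {
    stdin {
        codec => multiline {
            # Any line that does NOT start with a Catalina timestamp is appended to the previous event
            pattern => "^%{CATALINA_DATESTAMP}"
            negate => true
            what => "previous"
        }
    }
}
filter {
    grok {
        match => ["message", "%{CATALINALOG}"]
    }
    date {
        match => ["timestamp", "MMM dd, yyyy HH:mm:ss a"]
    }
}
output { stdout { codec => rubydebug } }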
## First, let's look at some test data, trimmed down a bit:
Jan 05, 2017 10:53:35 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 728 ms
Jan 05, 2017 10:53:35 AM org.apache.catalina.core.StandardService startInternal
INFO: Starting service Catalina
Jan 05, 2017 10:53:35 AM org.apache.catalina.core.StandardEngine startInternal
INFO: Starting Servlet Engine: Apache Tomcat/7.0.73
Test results:
"@timestamp" =>2017-01-05t03:45:46.749z,
"@version" = "1",
"Host" = "Controller",
"Message" = "Jan 05,2017 10:53:35 AM org.apache.catalina.startup.Catalina load\ninfo:initialization processed in 728 MS ",
"Tags" = [
[0] "multiline"
]
}
{
"@timestamp" =>2017-01-05t03:45:46.760z,
"@version" = "1",
"Host" = "Controller",
"Message" = "Jan 05,2017 10:53:35 AM org.apache.catalina.core.StandardService startinternal\ninfo:starting Service Catalina ",
"Tags" = [
[0] "multiline"
]
}
{
"@timestamp" = 2017-01-05t03:45:46.780z,
"@version" = "1",
"Host" = "Controller",
"Message" = "Jan 05,2017 10:53:35 AM org.apache.catalina.core.StandardEngine startinternal\ninfo:starting Servlet Engine:apache tomcat/7.0.73 ",
"Tags" = [
[0] "multiline"
]
}
3. Previously the logs were managed with the system's default catalina log files; a simpler approach is to switch Tomcat over to log4j.
1. Installing log4j:
1. Download the tomcat-juli.jar and tomcat-juli-adapters.jar that match your Tomcat version, together with log4j-1.2.17.jar, and place them in the tomcat/lib directory. Download URL: http://archive.apache.org/dist/tomcat/tomcat-7/v7.0.73/bin/extras/ (watch your Tomcat version when downloading).
Then copy tomcat-juli.jar into the tomcat/bin directory, replacing the original one.
2. Modify Tomcat's conf/context.xml, changing <Context> to <Context swallowOutput="true">. This step is important; many people forget it.
3. Create a log4j.properties and place it in tomcat/lib:
[root@controller lib]# cat log4j.properties
log4j.rootLogger=INFO, CONSOLE, R
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
#log4j.appender.CONSOLE.layout.ConversionPattern=%d [%t] %-5p %c - %m%n
log4j.appender.CONSOLE.layout.ConversionPattern=%d{yy-MM-dd HH:mm:ss} %5p %c{1}:%L - %m%n
log4j.appender.R=org.apache.log4j.DailyRollingFileAppender
log4j.appender.R.File=${catalina.home}/logs/tomcat.log
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%d{yyyy.MM.dd HH:mm:ss} %5p %c{1} (%L): %m%n
log4j.logger.org.apache=INFO, R
log4j.logger.org.apache.catalina.core.ContainerBase.[Catalina].[localhost]=DEBUG, R
log4j.logger.org.apache.catalina.core=INFO, R
log4j.logger.org.apache.catalina.session=INFO, R
4. Restart Tomcat; if a tomcat.log file shows up in the logs directory, the setup was successful.
5. Log4j can of course also control the format of the generated log file:
log4j.appender.R.layout.ConversionPattern={"debug_level":"%p","debug_timestamp":"%d{ISO8601}","debug_thread":"%t","debug_file":"%F","debug_line":"%L","debug_message":"%m"}%n
## Once the logs are generated this way, they can be parsed directly as JSON.
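For example, a minimal sketch of reading such a file with a json codec could look like the following. It assumes the R appender above still writes to ${catalina.home}/logs/tomcat.log and that Tomcat lives under /usr/local/src/apache-tomcat-7.0.73 (the install path used later in this article); adjust the path to your environment:
input {
    file {
        # the json codec turns debug_level, debug_timestamp, etc. into top-level event fields
        codec => json
        # path is an assumption based on the install location used later in this article
        path => "/usr/local/src/apache-tomcat-7.0.73/logs/tomcat.log"
        type => "tomcat-json"
        start_position => "beginning"
    }
}
output { stdout { codec => rubydebug } }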
6. There is, of course, a better plugin, and it is the approach we recommend: log4j-jsonevent-layout.
This is equivalent to what we did with Nginx: the log4j log format is defined directly as JSON, which helps improve performance.
7. Installation:
First, upload a few jar packages; I have already bundled them from the official jars, since collecting them yourself easily fails with errors:
commons-lang-2.6.jar
jsonevent-layout-1.8-SNAPSHOT.jar
json-smart-1.1.1.jar
8. Modify log4j.properties so the logs can be fed to Logstash directly:
[root@controller lib]# cat log4j.properties
# We use INFO here to make it easy to produce logs; in production you can use WARN
log4j.rootCategory=INFO, RollingLog
log4j.appender.RollingLog=org.apache.log4j.DailyRollingFileAppender
log4j.appender.RollingLog.Threshold=TRACE
log4j.appender.RollingLog.File=${catalina.home}/logs/api.log
log4j.appender.RollingLog.DatePattern=.yyyy-MM-dd
log4j.appender.RollingLog.layout=net.logstash.log4j.JSONEventLayoutV1
### Note: after the restart, the relevant logs are generated in api.log; since it is JSON, we can parse it directly below.
Take a look at the Logstash config file we need to write:
[root@controller etc]# cat tomcat_log4j_layout.conf
input {
    file {
        codec => json
        path => "/usr/local/src/apache-tomcat-7.0.73/logs/api.log"
        type => "log4j"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}
output {
    if [type] == "log4j" {
        redis {
            host => "192.168.0.46"
            port => 6379
            data_type => "list"
            key => "logstash:log4j"
        }
    }
}
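The redis output only ships the events to a broker; on the indexer side, another Logstash instance still has to pull them off the list and write them into Elasticsearch. A minimal sketch of that consumer, assuming an Elasticsearch node at 192.168.0.46:9200 and an index name chosen here for illustration (both are assumptions, adjust to your environment):
input {
    redis {
        # same broker and list key as the shipper config above
        host => "192.168.0.46"
        port => 6379
        data_type => "list"
        key => "logstash:log4j"
    }
}
output {
    elasticsearch {
        # Elasticsearch address and index name are assumptions for this sketch
        hosts => ["192.168.0.46:9200"]
        index => "tomcat-log4j-%{+YYYY.MM.dd}"
    }
}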
This article comes from my personal study notes: ELK Stack in Action, log analysis.