We want to use Zabbix to monitor an error log file every five minutes and send an email alert whenever new errors appear. A standard access log, such as the Nginx access log, keeps one entry per line and is easy to parse. But some logs spread a single entry across multiple lines, such as Tomcat or GlassFish logs, as follows:
[2015-07-17T14:24:04.552+0800] [GlassFish 4.0] [SEVERE] [AS-WEB-CORE-00037] [javax.enterprise.web.core] [tid: _ThreadID=26 _ThreadName=http-listener-1(3)] [timeMillis: 1437114244552] [levelValue: 1000] [[
  An exception or error occurred in the container during the request processing
java.lang.IllegalArgumentException
  at org.glassfish.grizzly.http.util.CookieParserUtils.parseClientCookies(CookieParserUtils.java:353)
  at org.glassfish.grizzly.http.util.CookieParserUtils.parseClientCookies(CookieParserUtils.java:336)
  at org.glassfish.grizzly.http.Cookies.processClientCookies(Cookies.java:220)
  at org.glassfish.grizzly.http.Cookies.get(Cookies.java:131)
  at org.glassfish.grizzly.http.server.Request.parseCookies(Request.java:1911)
  at org.glassfish.grizzly.http.server.Request.getCookies(Request.java:1505)
  at org.apache.catalina.connector.Request.parseSessionCookiesId(Request.java:4077)
  at org.apache.catalina.connector.CoyoteAdapter.postParseRequest(CoyoteAdapter.java:649)
  at org.apache.catalina.connector.CoyoteAdapter.doService(CoyoteAdapter.java:297)
]]
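For comparison, a one-entry-per-line log needs no state tracking at all. Here is a minimal sketch for the Nginx case; the log path and the default access-log timestamp format, e.g. 17/Jul/2015:14:24:04 +0800, are assumptions for illustration:

#!/bin/bash
# Sketch: print Nginx access-log entries from the last five minutes by
# matching each of the five minute-precision timestamps, e.g. 17/Jul/2015:14:24
log_path="/var/log/nginx/access.log"
pattern=""
for i in 0 1 2 3 4; do
    ts=$(date -d "-$i minute" '+%d/%b/%Y:%H:%M')
    # Join the timestamps into one alternation pattern
    pattern="$pattern${pattern:+|}$ts"
done
grep -E "$pattern" "$log_path"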
Parsing the multi-line format is considerably more involved. The following script extracts the entries from the last five minutes of such a log:
#!/bin/bash
# Time five minutes ago, as HHMMSS (e.g. 181320)
last_minute=$(date -d '-5 minute' +%H%M%S)
# Number of log entries matched so far
log_num=0
# Maximum number of log entries to fetch
max_log=3
# Final matched log content
log_content=""
# Whether the most recent timestamp line fell within the time window
log_date_match=false
# Log path
log_path="/data/log/glassfish/domain1/server.log"
while read -r line; do
    # Match lines that start with a timestamp
    if echo "$line" | grep -q '^\[20'; then
        # Extract the time from the timestamp line as HHMMSS, e.g. 181320
        date_time=$(echo "$line" | grep -E -o "[0-9]{2}:[0-9]{2}:[0-9]{2}" | tr -d ':')
        # Strip leading zeros so bash does not treat the values as octal
        date_time=$(echo "$date_time" | sed 's/^0*//')
        last_minute=$(echo "$last_minute" | sed 's/^0*//')
        # Is the current entry newer than five minutes ago?
        if [[ "$date_time" -gt "$last_minute" ]]; then
            log_content="$log_content\n$log_entry"
            ((log_num++))
            log_date_match=true
            log_entry="$line"
        else
            log_date_match=false
            continue
        fi
    else
        # Append continuation lines only if the preceding timestamp matched
        if $log_date_match; then
            log_entry="$log_entry\n$line"
        fi
    fi
    # Limit the maximum number of entries fetched
    if [[ "$log_num" -gt "$max_log" ]]; then
        break
    fi
done < "$log_path"
# The newest entry is still buffered in log_entry; flush it unless the
# loop already stopped at the max_log limit
if [[ "$log_num" -le "$max_log" ]] && $log_date_match; then
    log_content="$log_content\n$log_entry"
fi
# Output all matched logs
echo -n -e "$log_content"
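A note on the sed 's/^0*//' lines: inside [[ ... -gt ... ]], bash evaluates both operands as arithmetic expressions, and a number with a leading 0 is read as octal, so a time such as 083000 (08:30:00) fails to parse because 8 is not an octal digit. A quick demonstration:

# Fails: bash: 083000: value too great for base (error token is "083000")
[[ 083000 -gt 050000 ]]
# After stripping the leading zeros, the comparison works as intended
[[ 83000 -gt 50000 ]] && echo "newer"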
The preceding script reads the file sequentially from the beginning, so when the log file is large it is very inefficient. The version below reads the log in reverse order instead, which is much more efficient.
#!/bin/bash
# Time five minutes ago, as HHMMSS (e.g. 181320)
last_minute=$(date -d '-5 minute' +%H%M%S)
# Number of log entries matched so far
log_num=0
# Maximum number of log entries to fetch
max_log=3
# Final matched log content
log_content=""
# Log path
log_path="/data/log/glassfish/domain1/server.log"
while read -r line; do
    # Match lines that start with a timestamp
    if echo "$line" | grep -q '^\[20'; then
        # Extract the time from the timestamp line as HHMMSS, e.g. 181320
        date_time=$(echo "$line" | grep -E -o "[0-9]{2}:[0-9]{2}:[0-9]{2}" | tr -d ':')
        # Is the current entry newer than five minutes ago? Lexicographic
        # comparison is safe here: both values are zero-padded HHMMSS strings
        if [[ "$date_time" > "$last_minute" ]]; then
            ((log_num++))
            log_entry="$line\n$log_entry"
            log_content="$log_content\n$log_entry"
        else
            # Reading in reverse: once an entry is too old, all remaining
            # entries are older still, so stop
            break
        fi
        log_entry=""
    else
        # Continuation lines appear before their timestamp line when read
        # in reverse, so prepend them to the current entry
        log_entry="$line\n$log_entry"
    fi
    # Limit the maximum number of entries fetched
    if [[ "$log_num" -gt "$max_log" ]]; then
        break
    fi
done < <(tac "$log_path")
# Output all matched logs
echo -n -e "$log_content"
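Note the done < <(tac "$log_path") at the end. A plain pipe (tac "$log_path" | while read ...) would run the loop in a subshell, so log_content would be empty once the loop finished; process substitution keeps the loop in the current shell. A quick demonstration:

count=0
printf 'a\nb\n' | while read -r line; do ((count++)); done
echo "$count"   # prints 0: the loop ran in a subshell

count=0
while read -r line; do ((count++)); done < <(printf 'a\nb\n')
echo "$count"   # prints 2: the loop ran in the current shell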
You can then add a monitoring item in Zabbix that collects the log content, and use the trigger expression {ItemName.strlen(0)}#0 to check whether the fetched log content is non-empty; ItemName is the name of the monitoring item.
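One way to wire the script into Zabbix is a custom agent key; this is a sketch, and the key name and script path are assumptions rather than values from this setup:

# In zabbix_agentd.conf, or a file included from it; key name and
# script path below are hypothetical
UserParameter=glassfish.errorlog,/usr/local/bin/get_error_log.sh

After restarting the agent, create an item of type "Zabbix agent" with the key glassfish.errorlog, set its update interval to 300 seconds for the five-minute cycle, and attach the trigger expression above so that a non-empty result fires the email alert.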