Using shell to generate a data report for Zabbix monitoring

Source: Internet
Author: User
Tags: webhosting

We all know that Zabbix is a very powerful monitoring tool; our company also uses Zabbix to monitor the status of all of its websites.

A recent requirement was to generate a report containing the previous day's time points and their corresponding response times. This report is used to measure the availability of each website.


Zabbix has its own reporting function, but it only produces images; there is no data-format output. Although no data-format report is provided, Zabbix offers a set of APIs that return the corresponding data as JSON, so I wrote a script that outputs the monitoring data of each monitored URL to a CSV file. (An Excel macro can then aggregate all the collected CSV files into one xls file; this Excel file can be downloaded. :)


PS: the server has no compilation tools or runtime environments for higher-level programming languages, so all the data is processed with shell scripts. Processing JSON with the shell is more than a little painful... or perhaps my shell skills are just too poor. If you have a better idea, please share it. Thank you.
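For what it's worth, if installing a single static binary is ever an option on that server, jq makes the JSON handling far less painful than the sed/awk pipelines below. A minimal sketch, assuming jq is available and using an abbreviated history.get response with invented numbers:

```shell
# Hypothetical alternative: parse a history.get response with jq instead of sed/awk.
# The sample response shape follows the Zabbix API; the values are invented.
json='{"jsonrpc":"2.0","result":[{"itemid":"23296","clock":"1351090996","value":"0.0850"}],"id":1}'

# Emit one "clock,value" CSV line per history record.
echo "$json" | jq -r '.result[] | "\(.clock),\(.value)"'
# → 1351090996,0.0850
```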


The script is as follows:

#!/bin/bash

# Obtain the API authorization token, which will be used later to fetch the required data.

# Authenticate first, following the official method; the response contains the token, which awk extracts.
authjson=`curl -L -X POST -H 'Content-Type: application/json' -d '{"jsonrpc":"2.0","method":"user.authenticate","params":{"user":"admin","password":"zabbix"},"id":1}' http://127.0.0.1/zabbix/api_jsonrpc.php`
authstr=`echo $authjson | awk -F"[,:\"]" '{print $11}'`
echo $authstr
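To see why field 11 is the token: awk splits the response on every comma, colon, and double quote, which produces empty fields between adjacent delimiters, and the token lands in field 11. A worked example on a typical authentication response (the token value here is invented for illustration):

```shell
# Sample authentication response; the token value is made up for illustration.
sample='{"jsonrpc":"2.0","result":"0424bd59b807674191e7d77572075f33","id":1}'

# Fields 1..10 are "{", "jsonrpc", several empty fields, "2.0", "result", and
# more empties; field 11 is the token itself.
echo "$sample" | awk -F'[,:"]' '{print $11}'
# → 0424bd59b807674191e7d77572075f33
```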

# Initial and end time of the report (from to the previous day)
From = 'date? "+ % Y-% m-% d 00:00:00"-d "-1day "'
Echo $ from
Now = 'date? "+ % Y-% m-% d 00:00:00 "'

# Convert to Unix timestamps; the Zabbix API only accepts this format.
from=`date -d "$from" '+%s'`
now=`date -d "$now" '+%s'`
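As a quick sanity check of the conversion, a timestamp produced this way converts cleanly back to the original wall-clock time (GNU date assumed; the epoch value itself depends on the server's timezone):

```shell
# Round-trip a wall-clock time through a Unix timestamp (GNU date).
ts=`date -d "2012-10-24 00:00:00" '+%s'`
echo $ts                             # epoch seconds; exact value depends on the timezone

date -d @$ts '+%Y-%m-%d %H:%M:%S'    # → 2012-10-24 00:00:00
```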

# Each monitored URL has a corresponding record in the database; fetch the item IDs of all URLs from it.

# The content of getid.sql is as follows:

# select items.itemid from items join hosts on (items.hostid=hosts.hostid) where items.description like '%response time%' and hosts.host like '%website%' and items.status=0;
# (Adapt the SQL file to your own setup, as long as it returns the IDs of all monitored URLs.)

mysql --user=root --password=zabbix < getid.sql > /etc/scripts/outputmyid.txt


# The IDs returned by the mysql command include a header line; remove it.
sed '1d' /etc/scripts/outputmyid.txt > /etc/scripts/outputmyid_daily.txt
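For illustration, the mysql output looks roughly like this (item IDs invented), and sed '1d' simply drops the first line, i.e. the column header:

```shell
# Hypothetical mysql output: a header line followed by item IDs, one per line.
printf 'itemid\n23296\n23299\n' | sed '1d'
# → 23296
# → 23299
```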

# Clear previous temporary files
Rm-RF/etc/scripts/dailyreports/tmpjson/*. txt

# Fetch the history records of the website behind each ID in JSON format, bounded by the time period.
for i in `cat /etc/scripts/outputmyid_daily.txt`
do
    jsonstr="{\"jsonrpc\":\"2.0\",\"method\":\"history.get\",\"params\":{\"history\":0,\"itemids\":[\"$i\"],\"time_from\":\"$from\",\"time_till\":\"$now\",\"output\":\"extend\"},\"auth\":\"$authstr\",\"id\":1}"
    gethistory="curl -L -X POST -H 'Content-Type: application/json' -d '"$jsonstr"' http://127.0.0.1/zabbix/api_jsonrpc.php"
    echo $gethistory > /etc/scripts/tmp.sh
    chmod a+x /etc/scripts/tmp.sh
    /etc/scripts/tmp.sh > /etc/scripts/dailyreports/tmpjson/$i.txt
done
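A note on the tmp.sh detour: it only exists to get the quoting right, and the helper script can be avoided by passing the request body to curl directly. A sketch with the same request shape, using invented example values for the item ID, time window, and token:

```shell
# Hypothetical replacement for the echo/chmod/tmp.sh round trip.
# Example values stand in for a real item ID, time window, and auth token.
i=23296
from=1351008000
now=1351094400
authstr=0424bd59b807674191e7d77572075f33

# Build the same history.get request body as the loop above.
jsonstr="{\"jsonrpc\":\"2.0\",\"method\":\"history.get\",\"params\":{\"history\":0,\"itemids\":[\"$i\"],\"time_from\":\"$from\",\"time_till\":\"$now\",\"output\":\"extend\"},\"auth\":\"$authstr\",\"id\":1}"
echo "$jsonstr"

# With a reachable Zabbix frontend, this single call replaces the loop body:
# curl -L -X POST -H 'Content-Type: application/json' -d "$jsonstr" http://127.0.0.1/zabbix/api_jsonrpc.php > /etc/scripts/dailyreports/tmpjson/$i.txt
```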

# Create a folder for today's report
now=`date "+%Y%m%d"`
mkdir /etc/scripts/dailyreports$now

# Process the obtained JSON data
for i in `ls /etc/scripts/dailyreports/tmpjson/`
do
    rm /etc/scripts/dailyreports/tmp/*

# Extract all response time records
    cat /etc/scripts/dailyreports/tmpjson/$i | sed -e 's/[{}]//g' | awk '{n=split($0,a,","); for (i=1; i<=n; i++) print a[i]}' | grep value | grep -o '[0-9]\.[0-9]*' > /etc/scripts/dailyreports/tmp/values

# Extract all timestamp records
    cat /etc/scripts/dailyreports/tmpjson/$i | sed -e 's/[{}]//g' | awk '{n=split($0,a,","); for (i=1; i<=n; i++) print a[i]}' | grep clock | grep -o '[0-9]*' > /etc/scripts/dailyreports/tmp/clocks_org

# Convert the timestamps into a human-readable format
    for c in `cat /etc/scripts/dailyreports/tmp/clocks_org`
    do
        date -d @$c '+%Y-%m-%d %H:%M:%S' >> /etc/scripts/dailyreports/tmp/clocks_new
    done

# Work out the file name of the report
    itemidstr=`echo $i | awk -F[.] '{print $1}'`
    itemname=`mysql --user=root --password=zabbix -e "select key_ from items where itemid=$itemidstr;" | awk -F[\[,] '{print $2}'`

# Create the report file and write the header
    echo "monitored URL:,$itemname," > /etc/scripts/dailyreports$now/$itemidstr.csv
    echo "clock,response time (s)," >> /etc/scripts/dailyreports$now/$itemidstr.csv

# Combine the timestamps and response times column by column and append them to the report file
    paste -d "," /etc/scripts/dailyreports/tmp/clocks_new /etc/scripts/dailyreports/tmp/values /dev/null >> /etc/scripts/dailyreports$now/$itemidstr.csv

done
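To make the extraction pipeline concrete, here is what it does to a single, abbreviated history.get response (the numbers are invented): the braces are stripped, the remainder is split on commas so each key/value pair lands on its own line, and grep keeps just the field of interest:

```shell
# Abbreviated sample response; the values are invented for illustration.
json='{"jsonrpc":"2.0","result":[{"itemid":"23296","clock":"1351090996","value":"0.0850","ns":"5631"}],"id":1}'

# Same pipeline as in the loop above: extract the response time from the sample.
echo "$json" | sed -e 's/[{}]//g' | awk '{n=split($0,a,","); for (i=1; i<=n; i++) print a[i]}' | grep value | grep -o '[0-9]\.[0-9]*'
# → 0.0850
```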

# Clear temporary files
rm -rf /etc/scripts/dailyreports/tmpjson/*.txt

# Gather all the generated reports together with the prepared Excel macro file and package them for the users who need them.

# The Excel file contains a set of instructions telling the reader where to click to produce a properly formatted Excel report.
mkdir /etc/scripts/dailyreports$now/csv
mv /etc/scripts/dailyreports$now/*.csv /etc/scripts/dailyreports$now/csv/
cp /etc/scripts/dailyreports/readme.xls /etc/scripts/dailyreports$now/
cd /etc/scripts/dailyreports/
zip -r dailyreports$now.zip dailyreports$now/

cd /root/sendEmail-v1.51

./sendEmail -f [email protected] -t [email protected] -u "kldc dmz1 daily SLA report for webhosting" -m "kldc dmz1 daily SLA report for webhosting" -s 192.168.169.23:25 -a /etc/scripts/dailyreports/dailyreports$now.zip


rm -rf /etc/scripts/dailyreports*
