Recently I needed to test the stability of a new site, so I wrote a shell script to run locally (I recently switched to a Mac) that monitors the status of the web pages you care about in real time and mails the results to a designated mailbox.
As a bonus, OS X supports crontab scheduled tasks, so you can test the script directly on the local machine ^_^
The code is as follows:
# vi check_web_alive.sh
# ---------------------------------------------------------------------
#!/bin/bash
PATH=/bin:/sbin:/usr/bin:/usr/sbin:/usr/local/bin:/usr/local/sbin:~/bin
export PATH

# Define the URLs to monitor
WEB_URL=("http://www.111cn.net" "http://m.111n.net" "http://www.111cn.net")

# Check network connectivity first: count received packets from 5 pings
NET_ALIVE=$(ping -c 5 8.8.8.8 | grep 'received' | awk 'BEGIN {FS=","} {print $2}' | awk '{print $1}')
if [ "$NET_ALIVE" = 0 ]; then
    echo "Network is not active, please check your network configuration!"
    exit 0
fi

# Check each URL; curl -w %{http_code} prints the HTTP status code,
# and "000" means the connection failed entirely
for ((i=0; i!=${#WEB_URL[@]}; ++i))
do
    ALIVE=$(curl -o /dev/null -s -m 10 --connect-timeout 10 -w %{http_code} "${WEB_URL[i]}" | grep "000")
    if [ "$ALIVE" = "000" ]; then
        echo "'${WEB_URL[i]}' can not be opened, please check!" | mail -s "Website notification: ${WEB_URL[i]}" yourname@example.com
        echo "failed"
    else
        echo "'${WEB_URL[i]}' is ok!"
    fi
done
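The awk pipeline behind `NET_ALIVE` is easy to test in isolation. A minimal sketch, fed a canned ping summary line (the counts are made up for illustration) so it runs without network access:

```shell
# Canned ping summary line, standing in for real `ping -c 5` output.
ping_out="5 packets transmitted, 5 received, 0% packet loss, time 4005ms"
# Same pipeline as NET_ALIVE above: split on commas, take the second field
# (" 5 received"), then its first word -- the received-packet count.
received=$(echo "$ping_out" | grep 'received' | awk 'BEGIN {FS=","} {print $2}' | awk '{print $1}')
echo "$received"
```

With all 5 packets received the pipeline prints `5`; against a dead network it would print `0`, which is what the script's `if` test checks for.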
Method two: a curl-based shell script for web monitoring
The script's job is to poll the given URL at a fixed frequency and, when the site becomes unreachable, automatically send an alert mail to the configured mailbox. Let's look at the script.
The code is as follows:
#!/bin/sh
# Example crontab entries:
# */2 * * * * sh /var/monitor/web_monitor.sh http://www.111cn.net
# */2 * * * * sh /var/monitor/web_monitor.sh http://www.111cn.net 5
# */2 * * * * sh /var/monitor/web_monitor.sh http://www.111cn.net 10

export LANG=C
URL="$1"
EMAIL="rocdk890@gmail.com"   # change to your mail address
LOG_FILE="/var/log/monitor/web_status_`date +%Y%m`.log"
TMP_EMAIL="/var/monitor/.tmp.mail.`date +%s`"

# Optional second argument: delay this many seconds before the first check
if [ $2 ]; then
    sleep $2
fi

# Define function ECHO: append a timestamp at the head of every line
ECHO() {
    printf "%s " "`date '+%Y-%m-%d %H:%M:%S'`"
    echo "$1"
}

# Define function HTTP_CODE: obtain the status code of the web service
HTTP_CODE() {
    http_code=`curl -m 10 -o /dev/null -s -w %{http_code} $URL`
}

# Define function MAIL: compose and send the alert mail
MAIL() {
    echo "$URL is not available now, pls pay attention." > $TMP_EMAIL
    echo "And the server's time is:" >> $TMP_EMAIL
    date >> $TMP_EMAIL
    echo >> $TMP_EMAIL
    echo "------" >> $TMP_EMAIL
    echo "BR" >> $TMP_EMAIL
    echo "Shell Robot." >> $TMP_EMAIL
    mail -s "Server Alert: $URL" $EMAIL < $TMP_EMAIL
    rm $TMP_EMAIL
}

n=0
HTTP_CODE
if [ $http_code -eq 200 ]; then
    ECHO "|http_code:200|+$n|webpage visit success.|$URL" >> $LOG_FILE
else
    while [ $http_code -ne 200 ]
    do
        n=`expr $n + 1`
        ECHO "|http_code:$http_code|+$n|webpage visit failed.|$URL" >> $LOG_FILE
        if [ $n -eq 5 ]; then
            MAIL
            exit 0
        fi
        sleep 10
        HTTP_CODE
    done
fi
# end
The script uses the Linux curl tool to access the URL and read back the http_code. When the http_code is not 200, the visit is considered a failure; but to allow for transient instability, the first time a non-200 code comes back the script sleeps 10 seconds and then tries again. Only after 5 consecutive failed visits is the alert mail triggered.
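The retry-then-alert loop can be sketched in isolation. Below, `check_once` is a hypothetical stand-in for the curl probe: it fails twice and then succeeds, simulating a site that recovers after transient errors, so the loop's bookkeeping can be observed without any network:

```shell
#!/bin/sh
attempts=0
# Hypothetical probe: fails on the first two calls, succeeds on the third.
# In the real script this is the curl call returning http_code.
check_once() {
    attempts=$((attempts + 1))
    [ "$attempts" -ge 3 ]
}

n=0
max_fail=5
until check_once; do
    n=$((n + 1))
    if [ "$n" -ge "$max_fail" ]; then
        echo "alert mail would be sent here"
        break
    fi
    sleep 0   # the real script sleeps 10 seconds between probes
done
echo "recovered after $n retries"
```

With the stub above the loop runs twice and then exits cleanly; had the probe kept failing, the alert branch would fire on the fifth attempt, just like the `MAIL` call in the real script.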
How to use:
The script takes two parameters: the first is the URL of the site to monitor, and the second is a delay time (optional, in seconds; recommended when monitoring multiple sites). Add the script to the operating system's crontab and set the run frequency as needed; every 2 minutes is recommended. If there are multiple sites to monitor, add one line per URL to the crontab. The following monitors three sites at the same schedule:
The code is as follows:
*/2 * * * * sh /var/monitor/web_monitor.sh http://www.111cn.net
*/2 * * * * sh /var/monitor/web_monitor.sh http://www.111cn.net 5
*/2 * * * * sh /var/monitor/web_monitor.sh http://www.111cn.net 10
Finally, the simplest method (not tested).
Getting an HTTP status code with curl in a Linux shell
Sometimes, when writing website status monitoring in a Linux shell, you want to judge the site's state from the status code in the returned header. The following command does it:
The code is as follows:
curl -I -m 10 -o /dev/null -s -w %{http_code} www.111cn.net
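To act on that status code, capture it into a variable and branch on it. A minimal sketch; the code is hardcoded to 301 here purely for illustration, standing in for whatever the curl command would actually return:

```shell
# In practice the code would come from something like:
#   http_code=$(curl -I -m 10 -o /dev/null -s -w '%{http_code}' www.111cn.net)
# Hardcoded here so the branching runs offline.
http_code=301
case "$http_code" in
    200) state="ok" ;;
    000) state="connection failed" ;;       # curl prints 000 when it cannot connect
    *)   state="unexpected status $http_code" ;;
esac
echo "$state"
```

The `000` case matters: when the connection fails outright, `-w %{http_code}` prints `000` rather than nothing, which is exactly what method one's script greps for.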