As the number of sites you run grows, managing them becomes increasingly complex. This article shares how to batch-check site availability with Python, which is of real practical value for site management; readers who need it can use it as a reference.
">
Objective
As the number of sites increases, so does the complexity of managing them. These sites vary in importance: the important, core sites naturally get more management attention, while the ones that go years without an incident slowly get forgotten, until one day something breaks and you are scrambling to handle an emergency. Managing these sites in a standardized way is therefore necessary, and today we take the first step: regardless of which site it is, put unified monitoring in place first. Leave the business logic aside for now; at the very least, when a site becomes unreachable we should be the first to know, rather than waiting for the business side to report it to us, which makes us look unprofessional. Let's look at how to implement multi-site availability monitoring with Python. The script is as follows:
#!/usr/bin/env python
import pickle, os, sys, logging
from httplib import HTTPConnection, socket
from smtplib import SMTP

def email_alert(message, status):
    # Send a mail alert when a site's status changes.
    fromaddr = 'xxx@163.com'
    toaddrs = 'xxxx@qq.com'
    server = SMTP('smtp.163.com:25')
    server.starttls()
    server.login('xxxxx', 'xxxx')
    server.sendmail(fromaddr, toaddrs, 'Subject: %s\r\n%s' % (status, message))
    server.quit()

def get_site_status(url):
    response = get_response(url)
    try:
        if getattr(response, 'status') == 200:
            return 'up'
    except AttributeError:
        pass
    return 'down'

def get_response(url):
    # Issue a HEAD request and return the response, or None on socket errors.
    try:
        conn = HTTPConnection(url)
        conn.request('HEAD', '/')
        return conn.getresponse()
    except socket.error:
        return None
    except:
        logging.error('Bad URL: %s', url)
        exit(1)

def get_headers(url):
    response = get_response(url)
    try:
        return getattr(response, 'getheaders')()
    except AttributeError:
        return 'headers unavailable'

def compare_site_status(prev_results):
    # Return a closure that checks one URL and alerts if its status changed.
    def is_status_changed(url):
        status = get_site_status(url)
        friendly_status = '%s is %s' % (url, status)
        print friendly_status
        if url in prev_results and prev_results[url] != status:
            logging.warning(status)
            email_alert(str(get_headers(url)), friendly_status)
        prev_results[url] = status
    return is_status_changed

def is_internet_reachable():
    # If both reference sites appear down, assume our own connection is broken.
    if get_site_status('www.baidu.com') == 'down' and get_site_status('www.sohu.com') == 'down':
        return False
    return True

def load_old_results(file_path):
    # Load the results of the previous run from the pickle file.
    pickledata = {}
    if os.path.isfile(file_path):
        picklefile = open(file_path, 'rb')
        pickledata = pickle.load(picklefile)
        picklefile.close()
    return pickledata

def store_results(file_path, data):
    # Save this run's results for the next comparison.
    output = open(file_path, 'wb')
    pickle.dump(data, output)
    output.close()

def main(urls):
    logging.basicConfig(level=logging.WARNING, filename='checksites.log',
                        format='%(asctime)s %(levelname)s: %(message)s',
                        datefmt='%Y-%m-%d %H:%M:%S')
    pickle_file = 'data.pkl'
    pickledata = load_old_results(pickle_file)
    print pickledata
    if is_internet_reachable():
        status_checker = compare_site_status(pickledata)
        map(status_checker, urls)
    else:
        logging.error('Either the world ended or we are not connected to the net.')
    store_results(pickle_file, pickledata)

if __name__ == '__main__':
    main(sys.argv[1:])
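Note that the script targets Python 2 (it uses httplib and print statements). To try it, save it to a file and pass one or more hostnames on the command line; the filename checksites.py below is only an assumed example, not something the script requires:

python checksites.py www.baidu.com www.sohu.com

On each run the script loads the previous results from data.pkl, checks every host and prints its current status, mails an alert only when a host's status has changed since the last run, and then writes the new results back to data.pkl, so it is well suited to being run periodically (for example from cron).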
Explanation of the script's core points:
1. getattr() is a Python built-in function that takes an object and an attribute name, and returns the value of that attribute on the object.
2. The compare_site_status() function returns a function defined inside it (a closure), and that inner function keeps access to the prev_results dictionary passed to the outer one.
3. map() takes two arguments, a function and a sequence, and applies the function to each element of the sequence. A minimal sketch illustrating these three points follows below.
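Here is a minimal, self-contained sketch of these three points. It is independent of the monitoring script above, and the names used here (DemoResponse, make_checker, checker) are illustrative only:

# Python 2, matching the style of the script above.

class DemoResponse(object):
    status = 200

# 1. getattr() reads an attribute of an object by its name.
resp = DemoResponse()
print getattr(resp, 'status')                # -> 200

# 2. A function can return a function defined inside it (a closure);
#    the inner function keeps access to the outer function's variables,
#    just as is_status_changed keeps access to prev_results.
def make_checker(prev_results):
    def is_changed(url, status):
        changed = url in prev_results and prev_results[url] != status
        prev_results[url] = status
        return changed
    return is_changed

checker = make_checker({'www.example.com': 'up'})
print checker('www.example.com', 'down')     # -> True

# 3. map() applies a function to every element of a sequence.
print map(len, ['www.baidu.com', 'www.sohu.com'])   # -> [13, 12]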
Summary
The above is the whole content of this article; readers who need it are welcome to use it as a reference.