Python multi-process site status monitoring


The previous article described using PowerShell multithreading to monitor site status, but PowerShell only runs on Windows, so the PowerShell code has been ported to Python. The implementation is outlined briefly below; not all of the code is posted.


The Python libraries that are needed are the following:

import urllib2
import socket
from multiprocessing import Pool
from threading import Thread

urllib2 is used to request the web pages.

socket is used to limit the network request timeout.

Pool is used to set up a process pool. With a large number of URLs this is much better than spawning one process per URL, which consumes a lot of server resources and can even risk bringing the machine down; the one-process-per-URL approach is only worth considering for small amounts of data, say 10 URLs or fewer.
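For comparison, here is a minimal sketch of that one-process-per-URL alternative using multiprocessing.Process; the check_url helper and the sample URL list are hypothetical placeholders, not part of the original code:

from multiprocessing import Process
import urllib2

def check_url(url):
    # Hypothetical stand-in for the request function defined later in this article
    try:
        response = urllib2.urlopen(url, timeout=5)
        print '%s|%s' % (url, response.code)
    except Exception, e:
        print '%s|%s' % (url, e)

if __name__ == '__main__':
    urls = ['http://example.com', 'http://example.org']  # hypothetical sample list
    workers = [Process(target=check_url, args=(u,)) for u in urls]
    for w in workers:
        w.start()
    for w in workers:
        w.join()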

"Request Specify URL address ' def request_url (URL): ' Set Request timeout time ' ' socket.setdefaulttimeout (5) Try: ' Construct HTTP request ' Request=urllib2. Request (URL) response=urllib2.urlopen (request) except Exception,e:print '%s|%s|%s '% (url,e,request.get_me Thod ()) else:print '%s|%s|%s '% (Url,response.code,request.get_method ())

How the URLs are obtained is up to you; sometimes the data is kept in a file, but more often it is stored in a database. The following code provides that flexibility, and it can be refined further.

"' Get the Web address that needs to be monitored from a file or database" Def get_url_list (value):    if value ==  ":         print  ' This function needs to specify a parameter! '         return    if value ==  ' File ':         file_path= ' C:\urllist.txt '   #文本中的数据一行一条          try:             f=open (file_path,  ' R ')         except exception,e:             print e         else:            return  f.readlines ()     elif value ==  ' MySQL ':         pass    else:        print  ' Incoming value Error! '         print  ' This function only receives  file, MySQL parameters '

Run the code to see the effect:

if __name__ == '__main__':
    url_list = get_url_list('file')
    if url_list:
        # Define the maximum number of processes in the process pool
        pl = Pool(processes=10)
        for url in url_list:
            # Remove the trailing line feed
            url = url.strip('\n')
            result = pl.apply_async(request_url, (url,))
        pl.close()
        pl.join()
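A note on the design: apply_async returns an AsyncResult, but its value is discarded here because request_url prints its status line directly inside the worker process. If you would rather collect the results in the parent process, a sketch of the same loop using Pool.map, assuming request_url is modified to return its status string instead of printing it:

if __name__ == '__main__':
    url_list = get_url_list('file')
    if url_list:
        pl = Pool(processes=10)
        # map blocks until every URL has been checked and returns the results in order
        results = pl.map(request_url, [u.strip('\n') for u in url_list])
        pl.close()
        pl.join()
        for line in results:
            print line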

