Python Weather Collector Implementation Code (web crawler)

The crawler involves just two steps: fetching the web page text, then filtering out the data we want.
   1. Get the HTML text.
Python is handy for fetching HTML; a few lines of code do everything we need. The code is as follows:

import urllib  # Python 2 urllib; in Python 3, urllib.request provides urlopen

def getHtml(url):
    # Fetch the page and return its HTML text
    page = urllib.urlopen(url)
    html = page.read()
    page.close()
    return html

With just these few lines of code, you can probably tell what it does even without comments.
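As a quick illustration, here is a minimal usage sketch. It assumes the getHtml function above; the URL is only a placeholder, not a real weather page:

# Hypothetical URL -- replace it with the weather page you actually want to crawl
html = getHtml("http://example.com/weather")
print(html[:200])  # peek at the first part of the downloaded HTML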

   2. Extract the content with a regular expression.

Using regular expressions requires a careful look at how the information is structured in the page's HTML, so that you can write a correct pattern.
Python's regular expressions are also concise to use. My previous article on Python usage introduced the basics of regular expressions; only one new usage is needed here. The code is as follows:

import re

def getWeather(html):
    # The three capture groups are: city name, lowest temperature, highest temperature
    reg = r'<a title=.*?>(.*?)</a>.*?<span>(.*?)</span>.*?<b>(.*?)</b>'
    weatherList = re.compile(reg).findall(html)
    return weatherList

Here reg is the regular expression and html is the text fetched in the first step. findall finds every substring of html that matches the pattern and puts the results into weatherList. You can then enumerate weatherList and output the data.
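Putting the two steps together, a minimal sketch might look like the following. It assumes the getHtml and getWeather functions defined above; the URL is a placeholder, and the regular expression must match the structure of the page you actually scrape:

# Hypothetical URL; the real target page must contain the <a title>/<span>/<b> structure the regex expects
html = getHtml("http://example.com/weather")
for city, low, high in getWeather(html):
    print("%s: %s ~ %s" % (city, low, high))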
There are two things worth noting about the regular expression reg.
One is "(.*?)". The content inside the parentheses is what we want to capture; if there are several pairs of parentheses, each result returned by findall contains the contents of all of them. There are three pairs above, corresponding to the city, the lowest temperature and the highest temperature.
The other is ".*?". Python's regular expression matching is greedy by default, meaning it matches as long a string as possible. Adding a question mark makes it non-greedy, so it matches as little as possible. Because there are several cities whose information needs to be matched, non-greedy mode is required here; otherwise findall returns only a single, incorrect result.
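The difference is easy to see in a small self-contained sketch (the HTML fragment below is made up for illustration): the greedy ".*" swallows everything up to the last closing tag, while the non-greedy ".*?" stops at the first one, so each city is captured separately:

import re

sample = '<b>Beijing</b><b>Shanghai</b>'    # made-up HTML fragment for illustration
print(re.findall(r'<b>(.*)</b>', sample))   # greedy: ['Beijing</b><b>Shanghai'] -- one wrong match
print(re.findall(r'<b>(.*?)</b>', sample))  # non-greedy: ['Beijing', 'Shanghai'] -- one match per tag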
  
Python is really handy to use.