Using regular expressions in Python to write a web crawler

Source: Internet
Author: User

"""

Text processing is one of the main tasks computers perform today: finding useful information in text,

and mining it out, is what a large share of programs now do. Python is a lightweight, compact language that nevertheless ships with many text-processing libraries,

and these libraries are portable and work well across platforms.

In Python, the re module provides a number of advanced text pattern matching features, as well as the ability to search for and replace matched strings.

"""
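As a small illustration of the two capabilities just mentioned, searching and replacing, here is a minimal sketch (Python 3 syntax; the sample strings are invented for the example):

```python
import re

# Search for a phone-number-like pattern in some sample text.
m = re.search(r'\d{3,4}-?\d{8}', 'call 010-12345678 today')
print(m.group())  # -> 010-12345678

# Replace every run of digits with '#' using re.sub.
print(re.sub(r'\d+', '#', 'room 42, floor 7'))  # -> room #, floor #
```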


"""

Regular expression symbols and special characters

re1|re2 -----> matches regular expression re1 or re2

. -----> matches any character, except for a line break

^ -----> matches the start of a string   ^Dear

$ -----> matches the end of a string   /bin/.*sh$

* -----> matches the preceding regular expression zero or more times   [a-zA-Z0-9]*

+ -----> matches the preceding regular expression one or more times   [a-zA-Z0-9]+   [a-z]+\.com

? -----> matches the preceding regular expression zero or one time   goo?

{N} -----> matches the preceding regular expression exactly N times   [0-9]{N}

{M,N} -----> matches the preceding regular expression M to N times   [0-9]{3,5}

[...] -----> matches any single character that appears in the character set   [aeiou]

[..x-y..] -----> matches any single character in the range x to y   [0-9], [a-zA-Z]

[^...] -----> matches any single character NOT in the character set, ranges included   [^a-zA-Z0-9_]

(*|+|?|{})? -----> "non-greedy" version of any of the repetition symbols above   .*?[a-z]

(...) -----> matches the expression enclosed in parentheses and saves it as a group   ([0-9]{3})?, f(oo|u)bar

"""


import re

p = re.compile('ab*')
print p

r1 = r'\d{3,4}-?\d{8}'
print re.findall(r1, '010-12345678')
print re.findall(r1, '010-00000000')

r2 = re.compile(r'[Cc][Ss][Vv][Tt]')  # case-insensitive match spelled out by hand
r3 = re.compile(r'csvt', re.I)        # the same thing, using the re.I flag
print r3.findall('CSvT')

test_r = r'(abc/[0-9]{8,8}$)'
print re.findall(test_r, 'abc/12345678')
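The last repetition entry in the table above describes the "non-greedy" forms. A minimal sketch of the difference (Python 3 syntax; the sample HTML string is invented for the example):

```python
import re

html = '<b>one</b> and <b>two</b>'

# Greedy: .* grabs as much as possible, spanning across both tags.
print(re.findall(r'<b>(.*)</b>', html))   # -> ['one</b> and <b>two']

# Non-greedy: .*? stops at the first closing tag it can.
print(re.findall(r'<b>(.*?)</b>', html))  # -> ['one', 'two']
```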

"""

Use regular expressions to make a web crawler

"""


import re
import time
import urllib2
import cookielib

headers = {'Connection': 'keep-alive',
           'Accept': 'text/html, application/xhtml+xml, */*',
           'Accept-Language': 'en-US,en;q=0.8,zh-Hans-CN;q=0.5,zh-Hans;q=0.3',
           'User-Agent': 'Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko'}

# Fetch the blog index page.
url = 'http://blog.csdn.net/berguiliu'
req = urllib2.Request(url)
req.add_header('User-Agent', 'Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko')
browser = urllib2.urlopen(req)
data = browser.read()

# Collect the article links (each article id is 8 digits).
re_blog_list = re.compile(r'href="(/berguiliu/article/details/[0-9]{8,8})"')
url_list = re.findall(re_blog_list, data)

def makeMyOpener(head):
    # Build an opener that keeps cookies and sends our custom headers.
    cj = cookielib.CookieJar()
    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
    header = []
    for key, value in head.items():
        elem = (key, value)
        header.append(elem)
    opener.addheaders = header
    return opener

oper = makeMyOpener(headers)
uop = oper.open('http://www.baidu.com/', timeout=1000)
data = uop.read()
print data

# Visit each article in turn, pausing between requests to be polite to the server.
for suburl in url_list:
    new_url = 'http://blog.csdn.net' + suburl
    print new_url
    oper = makeMyOpener(headers)
    uop = oper.open(new_url, timeout=1000)
    data = uop.read()
    time.sleep(3)

This program works by crawling web pages, analyzing the returned HTML, and using regular expressions to pick out the parts of interest. Once you can locate the information you want, you can perform whatever operations you need on it.
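The crawler above is written for Python 2 (urllib2 and cookielib no longer exist in Python 3). As a hedged sketch of the same idea in Python 3, the link-extraction step can be factored out and exercised offline, without any network access (the sample HTML snippet and the helper names make_opener and extract_links are invented for this example):

```python
import re
import urllib.request
import http.cookiejar

def make_opener(head):
    # Python 3 equivalent of makeMyOpener: keep cookies, send our headers.
    cj = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
    opener.addheaders = list(head.items())
    return opener

# Same article-link pattern as the original code (8-digit article ids).
re_blog_list = re.compile(r'href="(/berguiliu/article/details/[0-9]{8})"')

def extract_links(html):
    # Turn each relative article path into an absolute URL.
    return ['http://blog.csdn.net' + sub for sub in re_blog_list.findall(html)]

# Offline demonstration on a sample snippet:
sample = '<a href="/berguiliu/article/details/12345678">post</a>'
print(extract_links(sample))  # -> ['http://blog.csdn.net/berguiliu/article/details/12345678']
```

For real pages, an opener from make_opener would fetch the HTML (opener.open(url).read()) before it is passed to extract_links.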


