Python crawler: scraping proxy IPs and verifying their availability

This article shows a Python crawler that scrapes proxy IPs and then verifies whether each one actually works. It may be useful as a reference; anyone who needs it is welcome to use it.

If you write crawlers regularly, sooner or later your IP gets blocked by the target site, and a single IP is certainly not enough. As a thrifty programmer who doesn't want to spend money, the answer is to go find free proxies. So this time I wrote a crawler to grab IPs from a proxy-list site, but that site has anti-crawling measures too!

As for how to deal with that, I think adding a delay between requests and testing again should help; I was probably crawling too frequently, which is why my IP got blocked.
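For example, a minimal throttling sketch (the delay range here is my own guess, not something the proxy site documents): sleep a random interval before each page fetch so the requests look less bursty.

import random
import time

def polite_sleep(min_seconds=2, max_seconds=5):
    # Wait a random interval between requests so the crawl is less bursty.
    time.sleep(random.uniform(min_seconds, max_seconds))

# Example: call polite_sleep() before each page fetch in the loop below.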

However, you can still go try IP Bus or other proxy-list sites; all roads lead to Rome, so there is no need to hang yourself on one tree.

Enough talk, here is the code.

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import urllib2
import time
import sys
from bs4 import BeautifulSoup

reload(sys)
sys.setdefaultencoding("utf-8")

# Request headers used when fetching the proxy-list pages.
req_header = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11',
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    # 'Accept-Language': 'en-US,en;q=0.8,zh-Hans-CN;q=0.5,zh-Hans;q=0.3',
    'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.3',
    'Accept-Encoding': 'en-us',
    'Connection': 'keep-alive',
    'Referer': 'http://www.baidu.com/'
}

req_timeout = 5                      # seconds
testUrl = "http://www.baidu.com/"    # page fetched through each proxy to test it
testStr = "wahaha"                   # string expected in the test page; use one that really appears in testUrl
file1 = open('proxy.txt', 'w')       # verified proxies are written here

cookies = urllib2.HTTPCookieProcessor()
checked_num = 0   # proxies that passed the test
grasp_num = 0     # proxies scraped from the list

for page in range(1):  # only the first page; widen the range to grab more
    req = urllib2.Request('http://www.xici.net.co/nn/' + str(page), None, req_header)
    html_doc = urllib2.urlopen(req, None, req_timeout).read()
    soup = BeautifulSoup(html_doc)
    trs = soup.find('table', id='ip_list').find_all('tr')
    for tr in trs[1:]:               # skip the table header row
        tds = tr.find_all('td')
        ip = tds[1].text.strip()
        port = tds[2].text.strip()
        protocol = tds[5].text.strip()
        if protocol == 'HTTP' or protocol == 'HTTPS':
            print '%s=%s:%s' % (protocol, ip, port)
            grasp_num += 1
            # Only an http proxy handler is configured, even for HTTPS entries.
            proxyHandler = urllib2.ProxyHandler({"http": r'http://%s:%s' % (ip, port)})
            opener = urllib2.build_opener(cookies, proxyHandler)
            opener.addheaders = [('User-Agent',
                                  'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.63 Safari/537.36')]
            t1 = time.time()
            try:
                req = opener.open(testUrl, timeout=req_timeout)
                result = req.read()
                timeused = time.time() - t1   # response time through this proxy
                pos = result.find(testStr)
                if pos > 1:                   # the test string was found, so the proxy works
                    file1.write(protocol + "\t" + ip + "\t" + port + "\n")
                    checked_num += 1
                    print checked_num, grasp_num
                else:
                    continue
            except Exception, e:
                continue

file1.close()
print checked_num, grasp_num
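For completeness, here is a minimal sketch of how the proxies saved in proxy.txt could be reused by a later crawler. The helper names are just illustrative, not part of the original script; the file format is the one written above (protocol, IP, and port separated by tabs).

import random
import urllib2

def load_proxies(path='proxy.txt'):
    # Each line is "PROTOCOL<TAB>IP<TAB>PORT".
    proxies = []
    for line in open(path):
        parts = line.strip().split('\t')
        if len(parts) == 3:
            protocol, ip, port = parts
            proxies.append('%s:%s' % (ip, port))
    return proxies

def open_with_random_proxy(url, proxies, timeout=5):
    # Pick one verified proxy at random and fetch the URL through it.
    proxy = random.choice(proxies)
    opener = urllib2.build_opener(urllib2.ProxyHandler({'http': 'http://' + proxy}))
    return opener.open(url, timeout=timeout).read()

# Example:
# proxies = load_proxies()
# html = open_with_random_proxy('http://www.baidu.com/', proxies)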

Personally, I feel the code is not too complicated and only lightly commented; I believe you can basically follow it. If there are any problems, please point them out so we can improve together!
