Detailed description of using Python to crawl soft exam questions with automatic IP proxying

I recently planned to crawl soft exam questions online in order to prepare for the exam, and I ran into some problems during the crawl. The following article describes in detail how to use Python with automatic IP proxying to crawl the soft exam questions; let's take a look.

Preface

I recently signed up for the software professional grade examination, hereinafter referred to as the soft exam. In order to review and prepare better, I planned to crawl the soft exam questions from www.rkpass.cn.

First, let me talk about how I crawled (and stumbled through) the soft exam questions. At this point I can automatically capture all the questions of a given module, for example:

Another way is to break through the anti-crawler mechanism and continue high-frequency crawling by setting proxy IP addresses and similar means. However, this requires multiple stable proxy IP addresses, as sketched below.
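Before the full script, here is a minimal sketch (my own illustration, not from the original article) of the underlying mechanism: the requests library accepts a proxies dictionary, so the target site sees the proxy's IP instead of yours. The proxy address below is a placeholder, not a real proxy.

import requests

# Placeholder proxy address; replace it with a proxy you have actually collected
proxies = {'http': 'http://1.2.3.4:8080'}

# The request is routed through the proxy, so www.rkpass.cn sees the proxy's IP
response = requests.get('http://www.rkpass.cn/', proxies=proxies, timeout=10)
print(response.status_code)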

Without further ado, here is the code:

# The IP addresses are taken from the domestic Xici high-anonymity proxy IP site: http://www.xicidaili.com/nn/
# Crawling just the first page of IPs is generally enough for ordinary use.
from bs4 import BeautifulSoup
import requests
import random

# Get the list of proxy IPs on the given page
def get_ip_list(url, headers):
    web_data = requests.get(url, headers=headers)
    soup = BeautifulSoup(web_data.text, 'html.parser')
    ips = soup.find_all('tr')
    ip_list = []
    for i in range(1, len(ips)):
        ip_info = ips[i]
        tds = ip_info.find_all('td')
        ip_list.append(tds[1].text + ':' + tds[2].text)
    return ip_list

# Pick one IP at random from the captured list
def get_random_ip(ip_list):
    proxy_list = []
    for ip in ip_list:
        proxy_list.append('http://' + ip)
    proxy_ip = random.choice(proxy_list)
    proxies = {'http': proxy_ip}
    return proxies

# Domestic high-anonymity proxy IP site
url = 'http://www.xicidaili.com/nn/'
# Request headers
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36'}
# Counter used to crawl the list page by page
num = 0
# Array that stores the captured IP addresses
ip_array = []
while num < 1537:
    num += 1
    ip_list = get_ip_list(url + str(num), headers=headers)
    ip_array.append(ip_list)
for ip in ip_array:
    print(ip)
# Pick an IP at random from the captured list
# proxies = get_random_ip(ip_list)
# print(proxies)
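For reference, here is a short usage sketch (not part of the original script) showing how the two helpers above fit together; the page index 1 is just the first page of the Xici list, and the printed value is illustrative.

# Hedged usage sketch: grab the proxies from the first page of the Xici list
# and pick one at random with the helpers defined above.
ip_list = get_ip_list('http://www.xicidaili.com/nn/1', headers=headers)
proxies = get_random_ip(ip_list)
print(proxies)  # e.g. {'http': 'http://123.56.74.13:8080'} (illustrative value)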

Running result:

In this way, by setting a randomly chosen proxy IP on each crawling request, you can effectively evade the simple anti-crawler strategy of blocking a fixed IP address.
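Since free proxies are often short-lived, a natural extension (my own hedged sketch, not from the original article) is to pick a new random proxy and retry whenever a request fails. The function name fetch_with_rotating_proxy and the page_url parameter are hypothetical; get_random_ip, ip_list, and headers come from the code above.

import requests

def fetch_with_rotating_proxy(page_url, ip_list, headers, max_retries=5):
    for attempt in range(max_retries):
        proxies = get_random_ip(ip_list)  # pick a fresh random proxy on each attempt
        try:
            return requests.get(page_url, headers=headers, proxies=proxies, timeout=10)
        except requests.RequestException:
            continue  # this proxy failed; try another one
    return None  # all attempts failed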

The above is a detailed description of how to crawl soft exam questions with Python using automatic IP proxying. I hope it is helpful.
