Basic Knowledge of Python Crawlers

Source: Internet
Author: User


Crawler Overview

According to Baidu Baike's definition, a web crawler (also known as a web spider or web robot, and in the FOAF community often called a web page chaser) is a program or script that automatically fetches information from the World Wide Web according to certain rules. Less common names include ant, automatic indexer, emulator, and worm.

With the continued growth of big data, crawler technology has gradually come into the public eye. It can be said that crawlers are a product of the big data era; at the very least, it was through big data that I came to learn about crawler technology.

As data volumes increase, we often need to select the data we want from the Internet for our own research, analysis, and experiments, and this is where crawler technology comes in. Let's take a first look at Python crawlers!

I. Request-Response

When implementing a crawler in Python 2, the urllib and urllib2 libraries are the main tools. First, a short piece of code to illustrate:

    import urllib
    import urllib2

    url = "http://www.baidu.com"
    request = urllib2.Request(url)
    response = urllib2.urlopen(request)
    print response.read()

We know that a webpage consists of HTML as the skeleton, JavaScript as the muscle, and CSS as the clothing. The code above fetches the source code of the Baidu homepage to the local machine.

Here, url is the address of the page to crawl; Request builds the request, and response is the reply received after the request is sent. Finally, the read() function outputs the source code of the Baidu page.
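Note that urllib2 exists only in Python 2; in Python 3 the same request/response flow lives in urllib.request. A minimal sketch of the equivalent (same Baidu URL as above; the network fetch itself is commented out so the structure can be inspected offline):

```python
# Python 3 equivalent: urllib2 was merged into urllib.request.
from urllib.request import Request, urlopen

url = "http://www.baidu.com"
request = Request(url)

# The Request object records the target URL and the HTTP method
# before anything is sent over the network.
print(request.full_url)      # http://www.baidu.com
print(request.get_method())  # GET (no data attached, so GET is used)

# Uncomment to actually fetch the page source, as in the Python 2 code:
# response = urlopen(request)
# print(response.read())
```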

II. GET-POST

Both methods transmit data to a web page. The most important difference is that GET accesses the page directly through a link, and that link contains all the parameters, including any password, so it is an insecure choice; on the other hand, you can see intuitively what you have submitted.

POST does not display the parameters in the URL, but it is less convenient if you want to see directly what was submitted. Choose whichever suits the situation.
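This difference shows up directly in how the request is built. A sketch using Python 3's urllib.parse and urllib.request (in Python 2, urlencode lives in urllib instead); the URL and credentials here are placeholders:

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholder credentials for illustration only.
values = {'username': 'user@example.com', 'password': 'XXXX'}
data = urlencode(values)  # 'username=user%40example.com&password=XXXX'

# GET: parameters are appended to the URL itself, so they are visible.
get_request = Request("http://example.com/login?" + data)
print(get_request.get_method())   # GET
print(get_request.full_url)       # parameters appear in the URL

# POST: parameters travel in the request body, not in the URL.
post_request = Request("http://example.com/login", data.encode('utf-8'))
print(post_request.get_method())  # POST
print(post_request.full_url)      # no parameters in the URL
```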

POST method:

    import urllib
    import urllib2

    values = {'username': '2680559065@qq.com', 'Password': 'XXXX'}
    data = urllib.urlencode(values)
    url = 'https://passport.csdn.net/account/login?from=http://my.csdn.net/my/mycsdn'
    request = urllib2.Request(url, data)
    response = urllib2.urlopen(request)
    print response.read()

GET method:

    import urllib
    import urllib2

    values = {'username': '2680559065@qq.com', 'Password': 'XXXX'}
    data = urllib.urlencode(values)
    url = "http://passport.csdn.net/account/login"
    geturl = url + "?" + data
    request = urllib2.Request(geturl)
    response = urllib2.urlopen(request)
    print response.read()

III. Exception Handling

The try-except statement is used to handle exceptions.

    import urllib2

    try:
        response = urllib2.urlopen("http://www.xxx.com")
    except urllib2.URLError, e:
        print e.reason
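The same pattern in Python 3 uses urllib.error.URLError and the `except ... as e` syntax. A sketch that triggers the error locally with a file:// URL pointing at a path assumed not to exist, so no network access is needed:

```python
from urllib.request import urlopen
from urllib.error import URLError

reason = None
try:
    # A file:// URL to a path that should not exist; urlopen wraps
    # the underlying OSError in a URLError, just like a failed
    # HTTP request would be.
    urlopen("file:///nonexistent_path_for_demo")
except URLError as e:
    reason = e.reason
    print(reason)  # e.g. a "No such file or directory" message
```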

Summary

This article has covered the basic knowledge of Python crawlers. I hope it helps you. If you have any questions, please leave a message and I will reply in a timely manner. Thank you very much for your support!
