Python Crawler Development (1): The requests Module



requests functions (you can browse the full API in PyCharm): requests.get(), requests.post(), requests.put(), requests.delete(), ... and the general-purpose requests.request()

Parameters of requests.request():

- method: the HTTP method to use

- url: the address to request

- params: the parameters passed in the URL (the GET query string)


requests.request(
    method="GET",
    url="",
    params={"k1": "v1", "k2": "v2"},
)
# the params are appended to the URL: http://...?k1=v1&k2=v2
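As a sketch of what params does, the query-string encoding can be reproduced with the standard library (the base URL here is a placeholder, since the example above leaves the real address blank):

```python
from urllib.parse import urlencode

# hypothetical base URL for illustration
base = "http://example.com/index"
query = urlencode({"k1": "v1", "k2": "v2"})
full_url = f"{base}?{query}"
print(full_url)  # http://example.com/index?k1=v1&k2=v2
```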

- data: data passed in the request body (form-encoded)

requests.request(
    method="GET",
    url="",
    params={"k1": "v1", "k2": "v2"},
    data={'user': 'alex', 'pwd': '123'},
)

# equivalent to converting the dictionary into a form-encoded string:
requests.request(
    method="GET",
    url="",
    params={"k1": "v1", "k2": "v2"},
    data='user=alex&pwd=123',
)

- json: data passed in the request body, serialized as a JSON string

requests.request(
    method='GET',
    url='',
    params={'k1': 'v1', 'k2': 'v2'},
    json={'user': 'alex', 'pwd': '123'},
)
# internally converted to the string: '{"user": "alex", "pwd": "123"}'
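The internal conversion mentioned in the comment can be sketched with the standard json module (key names taken from the example above):

```python
import json

payload = {'user': 'alex', 'pwd': '123'}
body = json.dumps(payload)
print(body)  # {"user": "alex", "pwd": "123"}
```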

# Consider how sites build anti-crawling mechanisms: the Referer header records the page you came from

- headers: the request headers

requests.request(
    method='GET',
    url='',
    params={'k1': 'v1', 'k2': 'v2'},
    json={'user': 'alex', 'pwd': '123'},
    headers={
        'Referer': '',
        'User-Agent': '-------',  # tells the server which browser is making the request
    },
)
# internally, this forges the "last visited page" record

- cookies: cookies sent to the server

Cookies travel in the request headers: put them there and they are sent along with the request.
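A minimal sketch of carrying cookies in the headers; the cookie names and values here are hypothetical:

```python
# hypothetical cookie values for illustration
cookies = {'gpsd': 'abc123', 'sessionid': 'xyz'}

# the Cookie header is a "; "-joined list of name=value pairs
cookie_header = '; '.join(f'{k}={v}' for k, v in cookies.items())
headers = {'Cookie': cookie_header}
print(headers['Cookie'])  # gpsd=abc123; sessionid=xyz
```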

The above are the most important parameters of the requests functions!

More parameters

Using a Session to obtain cookies and have the server authorize the gpsd cookie

import requests

session = requests.Session()

# First, visit any page to obtain the initial cookies
i1 = session.get(url="")

# Log in, carrying the cookies from the previous request; the server then
# authorizes the gpsd value inside the cookie
i2 = session.post(
    url="http://",
    data={
        'phone': 'xxx',
        'password': 'jshfkaj',
        'oneMonth': '1',
    },
)

i3 = session.get(url='')
print(i3.text)
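To see that a Session persists cookies across requests without hitting the network, the session's cookie jar can be inspected directly. This is an illustrative sketch: the gpsd name comes from the example above, the value is made up, and requests must be installed:

```python
import requests

session = requests.Session()

# simulate the server setting the gpsd cookie on the first response
session.cookies.set('gpsd', 'abc123')

# every subsequent request made through this session sends the cookie back
print(session.cookies.get_dict())  # {'gpsd': 'abc123'}
```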
