After updating the requests package, I found that Charles on my computer could no longer capture its traffic, and searching Baidu for ages turned up no explanation. I then turned to Google and found the answer in Charles' latest documentation: you have to set a proxy explicitly, otherwise the traffic no longer goes through the system proxy that Charles captures by default, the way it used to. Amazing.
The details were on Stack Overflow; here is the original address first: http://stackoverflow.com/questions/8287628/proxies-with-python-requests-module
The content is reposted below. I tried the second approach, exporting the proxy address as an environment variable, but it did not seem to work; setting the proxy in code does. Specifically:
The proxies dict syntax is {"protocol": "ip:port", ...}. With it you can specify different (or the same) proxie(s) for requests using the http, https, and ftp protocols:

http_proxy  = "http://10.10.1.10:3128"
https_proxy = "https://10.10.1.11:1080"
ftp_proxy   = "ftp://10.10.1.10:3128"

proxyDict = {
    "http":  http_proxy,
    "https": https_proxy,
    "ftp":   ftp_proxy,
}

r = requests.get(url, headers=headers, proxies=proxyDict)
Deduced from the requests documentation:

Parameters:
method – method for the new Request object.
url – URL for the new Request object.
...
proxies – (optional) Dictionary mapping protocol to the URL of the proxy.
...
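As a small illustration of the documented parameters above (the URL and proxy address here are placeholders, not from the original post):

import requests

# Placeholder proxy address; requests.get(...) is a shortcut for the
# same call with method="GET".
proxies = {"http": "http://10.10.1.10:3128"}
r = requests.request("GET", "http://example.com/", proxies=proxies)
print(r.status_code)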
On Linux you can also do this via the HTTP_PROXY, HTTPS_PROXY, and FTP_PROXY environment variables:
export HTTP_PROXY=10.10.1.10:3128
export HTTPS_PROXY=10.10.1.11:1080
export FTP_PROXY=10.10.1.10:3128
On Windows:
set http_proxy=10.10.1.10:3128
set https_proxy=10.10.1.11:1080
set ftp_proxy=10.10.1.10:3128
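If the environment-variable route does not seem to take effect (it didn't for me), one thing worth checking is whether the process actually sees the variables and whether they include a scheme; a small sketch, reusing the 10.10.1.10:3128 proxy address from the examples above:

import os
import requests

# Set the variables for this process only. On requests 2.x the value should
# include the scheme ("http://10.10.1.10:3128"), not just "ip:port".
os.environ["HTTP_PROXY"] = "http://10.10.1.10:3128"
os.environ["HTTPS_PROXY"] = "http://10.10.1.10:3128"

# requests reads these variables unless trust_env is disabled on the Session.
r = requests.get("http://httpbin.org/ip")
print(r.text)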
Thanks, Jay for pointing this out:
The syntax changed with requests 2.0.0.
You'll need to add a scheme to the URL: http://docs.python-requests.org/en/latest/user/advanced/#proxies
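A sketch of what that 2.0.0 change means in practice, again with the placeholder proxy address from above:

import requests

# Before requests 2.0.0 a bare "ip:port" proxy value was accepted:
#   proxies = {"http": "10.10.1.10:3128"}
# From 2.0.0 on, the proxy URL needs an explicit scheme:
proxies = {
    "http": "http://10.10.1.10:3128",
    "https": "http://10.10.1.10:3128",  # the proxy itself is reached over HTTP
}

r = requests.get("http://example.com/", proxies=proxies)
print(r.status_code)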
This resolves the question of setting up a proxy with the new version of the Python requests package.