Python Crawler Tutorial -09-error module
Today's protagonist is the error. Crawling can easily go wrong, so we have to handle errors in our code where common mistakes occur. This article is about urllib.error.
URLError
- Causes of URLError:
- 1. No network connection
- 2. Server connection failure
- 3. The specified server could not be found
- 4. URLError is a subclass of OSError
- Case V9 File: https://xpwi.github.io/py/py%E7%88%AC%E8%99%AB/py09error.py
```python
# Case v9
# Using URLError
from urllib import request, error

if __name__ == '__main__':
    url = "http://www.baiiiiiiiiiiidu.com/"
    try:
        req = request.Request(url)
        rsp = request.urlopen(req)
        html = rsp.read().decode()
        print(html)
    except error.URLError as e:
        print("URLError:{0}".format(e.reason))
        print("URLError:{0}".format(e))
    except Exception as e:
        print(e)
```
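As point 4 above notes, URLError is a subclass of OSError, so a plain `except OSError` clause would also catch it. A quick check of the class hierarchy:

```python
from urllib.error import URLError

# URLError inherits from OSError, so catching OSError also catches URLError
print(issubclass(URLError, OSError))  # prints True
```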
HTTPError
- 1. HTTPError is a subclass of URLError
The difference between URLError and HTTPError:
- HTTPError corresponds to an error status code in the HTTP response: if the server returns a status code of 400 or above, an HTTPError is raised
- URLError corresponds to a network problem, including problems with the URL itself
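Because HTTPError is a subclass of URLError, the `except error.HTTPError` clause has to come first, otherwise `except error.URLError` would swallow it. A minimal sketch of handling both, assuming the URL `http://httpbin.org/status/404` (not from the original tutorial) as a server that returns a 404 status:

```python
from urllib import request, error

# Hypothetical test URL: httpbin returns whatever status code is in the path
url = "http://httpbin.org/status/404"

try:
    rsp = request.urlopen(url)
    html = rsp.read().decode()
    print(html)
except error.HTTPError as e:
    # Raised for HTTP status codes of 400 and above; e.code holds the status
    print("HTTPError: status code {0}".format(e.code))
except error.URLError as e:
    # Raised for network-level failures (no connection, unknown host, bad URL)
    print("URLError: {0}".format(e.reason))
```

Note that HTTPError also carries the response headers (`e.headers`) and can be read like a response object, which is useful when the server sends an error page you still want to inspect.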
- This note may not be reprinted by any person or organization