Python requests vs aiohttp speed comparison

Environment: CentOS 7, Python 3.6
Test URL: www.baidu.com
Test method: Crawl Baidu 100 times
Results:
aio:      10.702147483825684 s
Requests: 12.404678583145142 s
Even in this sequential test, the asynchronous framework is noticeably faster.
The benchmark code:
import asyncio
import time

import aiohttp
import requests


def test_requests():
    """Request Baidu 100 times with requests (no Session: a new connection per call)."""
    start = time.time()
    url = "https://www.baidu.com"
    for _ in range(100):
        requests.get(url)
    end = time.time()
    print("Requests:")
    print(end - start)


async def aio_download(url):
    """Download one page with aiohttp (a fresh ClientSession per request, mirroring requests.get)."""
    async with aiohttp.ClientSession() as session:
        await session.get(url)


async def test_aio():
    """Request Baidu 100 times with aiohttp, awaiting each request in turn."""
    url = "https://www.baidu.com"
    start = time.time()
    for _ in range(100):
        await aio_download(url)
    end = time.time()
    print("aio:")
    print(end - start)


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(test_aio())
    test_requests()
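Note that test_aio still awaits each download one at a time, so it barely exploits the event loop. Below is a minimal sketch of a concurrent variant (fetch and test_aio_concurrent are names introduced here, not part of the original test) that fires all 100 requests through one shared session with asyncio.gather; on the same setup this should widen the gap further, though I have not timed it:

import asyncio
import time

import aiohttp


async def fetch(session, url):
    # Read the body so the connection is released back to the pool.
    async with session.get(url) as resp:
        await resp.read()


async def test_aio_concurrent():
    # Launch all 100 requests at once over one shared session,
    # letting the event loop overlap the network waits.
    url = "https://www.baidu.com"
    start = time.time()
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(*(fetch(session, url) for _ in range(100)))
    print("aio concurrent:")
    print(time.time() - start)


if __name__ == "__main__":
    asyncio.get_event_loop().run_until_complete(test_aio_concurrent())

Sharing one ClientSession also avoids the per-request session setup that aio_download pays in the benchmark above.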
Tips:
The requests test deliberately avoids a Session when crawling the same site repeatedly: with a Session, from the second request onward the response is effectively served from the reused (cached) connection, so whether you crawl 50 times, 100 times, or more, the total time stays within 1 s.
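For reference, here is a minimal sketch of the Session variant the tip warns against (test_requests_session is a hypothetical name, not from the original code); with the connection reused across calls, the timing no longer reflects per-request cost:

import time

import requests


def test_requests_session():
    # Reusing one Session keeps the connection alive between calls,
    # so repeated hits on the same site finish far faster than the
    # no-Session loop above, which would skew the comparison.
    url = "https://www.baidu.com"
    start = time.time()
    with requests.Session() as session:
        for _ in range(100):
            session.get(url)
    print("Requests (Session):")
    print(time.time() - start)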