Multi-Threaded Download

Multi-threaded downloading relies primarily on two HTTP headers:
- Content-Length: the total length of the resource in bytes, which makes it easy to plan how much work to assign to each thread.
- Range: bytes=beg1-end1, beg2-end2, ..., which requests only a portion of the resource (multiple ranges are separated by commas). Note that each beg-end pair is a closed interval: both endpoints are included.
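A minimal sketch of how the two headers fit together, assuming the server answers HEAD requests and honors Range (the URL and the 0-1023 range are placeholders, not from the original script):

```python
import requests

url = "http://example.com/big.mp4"  # placeholder URL

# HEAD request: read the total size from Content-Length
head = requests.head(url, allow_redirects=True)
total = int(head.headers["Content-Length"])

# Range request: 0-1023 is a closed interval, so exactly 1024 bytes
# should come back with status code 206 (Partial Content)
part = requests.get(url, headers={"Range": "bytes=0-1023"})
print(part.status_code, len(part.content), total)
```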
When the downloaded fragments are small, individual requests fail easily and need to be retried. The retry module lets you add retries with a single decorator, which is very convenient.
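A small sketch of that decorator usage, assuming the third-party retry package (pip install retry); download_chunk and the tries/delay values are illustrative, not part of the original script:

```python
import requests
from retry import retry

# Retry up to 5 times with a 1-second pause between attempts
@retry(tries=5, delay=1)
def download_chunk(url, beg, end):
    # Closed interval: bytes beg..end inclusive
    resp = requests.get(url, headers={"Range": "bytes=%d-%d" % (beg, end)})
    if resp.status_code not in (200, 206) or len(resp.content) != end - beg + 1:
        raise Exception("bad fragment, will be retried")
    return resp.content
```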
Python's multithreading does not show an obvious advantage here. The strength of a chain depends on its weakest link, the capacity of a cask depends on its shortest plank, and the throughput of a system depends on the component with the least concurrency. Multithreading probably fails to help because of bandwidth: if a single thread can already saturate the network connection, adding more threads is useless.
```python
import os
import threading
import time

import requests
import retry

url = 'http://mp4.vjshi.com/2017-12-18/422ded2944a95d6ca09752e04f687dd6.mp4'


def one_thread():  # 37.86 seconds
    begtime = time.time()
    resp = requests.get(url)
    with open("haha.mp4", "wb") as f:
        f.write(resp.content)
    endtime = time.time()
    print(endtime - begtime)


def multi_thread():
    per_thread_min = 1024 * 1024  # minimum download volume per thread (1 MB here)
    max_thread_count = 20         # maximum number of threads
    temp_folder = "dow"           # temp folder for the fragments
    target_file_name = "mul.mp4"  # merged output file
    if not os.path.exists(temp_folder):
        os.mkdir(temp_folder)
    begtime = time.time()
    resp = requests.get(url, stream=True)
    sz = int(resp.headers['Content-Length'])
    block_sz = max(sz // max_thread_count, per_thread_min)

    # Split [0, sz) into tasks of roughly block_sz bytes; the last task
    # absorbs the remainder so no fragment is smaller than per_thread_min.
    task = []
    cnt = 0
    for i in range(0, sz, block_sz):
        now_sz = sz - i if sz - i - block_sz < per_thread_min else block_sz
        it = {'beg': i,
              'end': i + now_sz,
              'path': os.path.join(temp_folder, str(cnt)),
              'last': i + now_sz == sz}
        task.append(it)
        cnt += 1
        if it['last']:
            break

    lock = threading.Lock()

    def merge():
        # Concatenate the fragments in order into the target file
        with open(target_file_name, "wb") as f:
            for i in task:
                with open(i['path'], 'rb') as ff:
                    f.write(ff.read(i['end'] - i['beg']))
        endtime = time.time()
        print(endtime - begtime)

    @retry.retry(tries=10)
    def go(it):
        nonlocal cnt
        print(it)
        # The Range header is a closed interval, hence end - 1
        resp = requests.get(url, headers={'Range': "bytes=%d-%d" % (it['beg'], it['end'] - 1)})
        if resp.status_code not in [200, 206]:
            print(it, resp.status_code, 'crawler failure')
            raise Exception("crawler failure")
        if len(resp.content) != it['end'] - it['beg']:
            print("wrong length")
            raise Exception("wrong length")
        with open(it['path'], 'wb') as f:
            f.write(resp.content)
        print(it, it['end'] - it['beg'], len(resp.content), 'over', resp.status_code)
        with lock:
            cnt -= 1
            if cnt == 0:
                merge()

    def start_threading():
        for i in task:
            threading.Thread(target=go, args=(i,)).start()

    start_threading()


# one_thread()
multi_thread()
```
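Before running the multi-threaded path it is worth checking that the server actually supports range requests. This small check is my addition, not part of the original script; it assumes the server answers HEAD requests and exposes the Accept-Ranges and Content-Length headers:

```python
import os
import requests

url = 'http://mp4.vjshi.com/2017-12-18/422ded2944a95d6ca09752e04f687dd6.mp4'

head = requests.head(url, allow_redirects=True)
if head.headers.get('Accept-Ranges', 'none') == 'none':
    print("server does not advertise range support; fall back to one_thread()")

# After multi_thread() finishes, the merged file should match Content-Length
expected = int(head.headers['Content-Length'])
if os.path.exists("mul.mp4") and os.path.getsize("mul.mp4") == expected:
    print("merged file size matches Content-Length")
```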