Resumable upload (breakpoint continuation) of large files based on Node.js

Then "grilled a grilled nodejs formidable onpart" and "also said file Upload compatibility IE789 progress bar---lost flash"; the front has completed a compatible IE789 of large file upload: No flash of the low version of the progress bar, the high version of the multipart upload, and has been a good cushion for the continuation of the breakpoint;

With that groundwork in place, I originally thought the Node.js side needed no changes and only the front end had work to do: use HTML5 to listen for the abort event and record which chunk the interrupted upload had reached (resuming is only worthwhile for larger files, so for now it is enabled above 50 MB), and key the breakpoint information on a hash of the file content plus the upload's unique token; on the next upload, check whether that hash and token exist to decide whether to resume. It did not sound difficult, yet it still wasted the better part of my day and killed a lot of brain cells. Clearly, staying at the thinking stage is no good; you have to actually get moving.
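
The article does not show that first version's code; the following is only my rough sketch of the idea as described above, with hypothetical names (sendChunk, chunkIndex, uploadToken, fileHash, CHUNK_SIZE) and an XMLHttpRequest-based chunk uploader assumed:

// Rough sketch of the original idea (not the article's code); hypothetical names.
// On abort, remember locally how far the upload got.
var CHUNK_SIZE = 2 * 1024 * 1024;
var chunkIndex = 0;      // chunk currently being sent
var uploadToken = null;  // unique token issued for this upload
var fileHash = null;     // MD5 of (part of) the file content

function sendChunk(file, xhr) {
  xhr.addEventListener('abort', function () {
    // The bookkeeping that later proved unreliable: the in-flight chunk may or
    // may not have been written on the server when this event fires.
    localStorage.setItem('fileUploadInfos', JSON.stringify({
      hash: fileHash,
      __token: uploadToken,
      index: chunkIndex
    }));
  });
  xhr.send(file.slice(chunkIndex * CHUNK_SIZE, (chunkIndex + 1) * CHUNK_SIZE));
}

The weak spot is already visible here: when abort fires, the chunk currently in flight may or may not have reached the server, so the locally recorded index can drift by one.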

Thinking along those lines is roughly fine, but test after test the same problem kept surfacing: the front end and back end occasionally fail to line up. For example, the back-end console shows that 5 chunks have been uploaded, while what my abort listener heard was 4, and sometimes 6. This mismatch tortured me for a long time, and the more interruptions there were, the more often it happened; repeated adjustments never fully fixed it. The day before yesterday I was still calling this "good groundwork"; the moment I actually started, it began teaching me a lesson.

Lately I have not had the chance to learn Angular, yet articles and tutorials about React are suddenly everywhere, which makes me a little uneasy: is my enthusiasm for Angular fading under React's strong push? Vue, too, keeps gaining favor for its lightness. In fact I have been watching these newer frameworks all along, but my current projects are in a traditional industry and still have to cope with IE7/8/9, so I only follow them and dip a toe in. React has made me restless recently; once this large file upload settles into a stable version, I plan to start learning React systematically.

But I digress. Today's topic is resumable upload; I am taking advantage of the holiday to tidy it up a little and share it briefly.

In view of the bug above, the approach changes: stop relying on the abort listener, because its bookkeeping problem is fatal. The new flow: before uploading, check whether the file exceeds 50 MB. If it does, compute an MD5 of the file content, then check whether that MD5 is already saved locally. If a matching record exists, set the upload token to the saved token for that MD5; if not, add a new record containing the file's MD5, a status flag for whether the upload has completed, the file's token, and the index used to distinguish multiple files. When the upload starts, this information is saved (or updated) in localStorage; when the upload completes, the callback clears it; on the next upload, if the record is still there, the upload is treated as a resume. Note that the index of the uploaded chunks is deliberately not saved locally, precisely because that is where the front end and back end fell out of sync. The adjustment now is that the back end saves (or updates) the chunk index after it has written the chunk's content; if the front end decides this is a resume, it first requests the current index from the server and only then continues uploading. This turned out to work. Facts speak louder than words, so less nonsense; here is the code:

postFiles: function () {
  var $self = this;
  // Resumable upload only kicks in for files larger than 50 MB
  if (this.files.size > 50 * 1024 * 1024) {

    var fileReader = new FileReader(),
        spark = new SparkMD5.ArrayBuffer();

    fileReader.onload = function (e) {
      spark.append(e.target.result);
      $self.hash = spark.end();
      window.__hash__ = $self.hash;

      var stored = localStorage.getItem('fileUploadInfos');
      var oData = utils.url.query.other_data,
          ins = 0;
      if (oData) {
        ins = JSON.parse(oData).index;
      }

      if (stored && JSON.parse(stored).length) {
        var c = 0, tk = 0;
        // Look for a saved record with the same hash (and, when other_data
        // is present, the same iframe index)
        JSON.parse(stored).forEach(function (sd) {
          if (oData) {
            if (sd && sd.hash == $self.hash && ins == sd.ins) {
              tk = sd.__token;
              c++;
            }
          } else {
            if (sd && sd.hash == $self.hash) {
              tk = sd.__token;
              c++;
            }
          }
        });

        if (c) {
          // Resume: ask the server which chunk to continue from
          var _data = {
            token: tk,
            getfileinfo: 1
          };
          if (oData) {
            _data.ins = ins;
          }
          $.post('/components/uploader', _data).then(function (data) {
            if (data.mes == 1) {
              $self.index = data.index * 1 + 1;
            }
            $self.__token__ = tk;   // reuse the saved token
            $self.postSlice();
          });
        } else {
          // Records exist, but not for this file: append a new one
          var willload = {
            __token: $self.__token__,
            status: 'would',
            hash: $self.hash
          };
          if (oData) {
            willload.ins = ins;
          }
          var lodd = JSON.parse(stored);
          lodd.push(willload);
          localStorage.setItem('fileUploadInfos', JSON.stringify(lodd));
          $self.postSlice();
        }
      } else {
        // Nothing saved yet: create the first record
        var willload = {
          __token: $self.__token__,
          status: 'would',
          hash: $self.hash
        };
        if (oData) {
          willload.ins = ins;
        }
        localStorage.setItem('fileUploadInfos', JSON.stringify([willload]));
        $self.postSlice();
      }
    };

    // Only the first 10 KB of the file is hashed -- see the note on spark-md5 below
    fileReader.readAsArrayBuffer(this.files.slice(0, 10240));
  } else {
    this.postSlice();
  }
}
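
The code above is only the browser side. For completeness, here is a minimal sketch of what the Node.js side of '/components/uploader' could look like; it is my own illustration under stated assumptions, not code from the original project. It assumes Express and formidable, an in-memory object standing in for whatever store the real service uses, and field names (token, index, getfileinfo, file) matching the front-end request above. The key point from the article is that the last chunk index is only recorded after the chunk has actually been written, so the server stays the source of truth for resuming.

// Minimal sketch only; assumes Express + formidable and hypothetical field names.
var express = require('express');
var formidable = require('formidable');

var app = express();
app.use(express.urlencoded({ extended: false }));

// token -> index of the last chunk that was fully written to disk
var lastWrittenIndex = {};

app.post('/components/uploader', function (req, res) {
  // 1) Resume query: the front end asks "which chunk should I continue from?"
  if (req.body && req.body.getfileinfo) {
    var token = req.body.token;
    if (token in lastWrittenIndex) {
      return res.json({ mes: 1, index: lastWrittenIndex[token] });
    }
    return res.json({ mes: 0 });
  }

  // 2) Normal chunk upload: parse the multipart body, write the chunk,
  //    and only then record its index -- the server is the source of truth.
  var form = new formidable.IncomingForm();
  form.parse(req, function (err, fields, files) {
    if (err) { return res.status(500).json({ mes: 0 }); }
    var token = fields.token;
    var index = parseInt(fields.index, 10);
    // ... append files.file to the target file on disk here ...
    lastWrittenIndex[token] = index;   // saved (or updated) after the write
    res.json({ mes: 1, index: index });
  });
});

app.listen(3000);

Persisting lastWrittenIndex somewhere durable (a file or a small database) rather than in memory would survive a server restart, but the flow stays the same.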

spark-md5.js is used here to compute an MD5 of the file content, and slice(start, end) lets you choose where to start reading and how much to read; hashing the entire content of a large file is very time-consuming (try it, it is quite interesting), which is why only a slice is hashed. See the spark-md5 project on GitHub for details. Compared with the earlier groundwork, the differences are the extra request $.post('/components/uploader', _data), which asks the server for the resume index, and the callback that checks whether the upload succeeded and decides whether to clear the information saved before the upload; the callback follows after the short sketch below.
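
As a side note, SparkMD5.ArrayBuffer can also digest several sampled slices incrementally rather than only the first 10 KB, which lowers the chance of two different files colliding on the same hash while staying fast. This is only my own sketch, not code from the article; hashSlices and the chosen slice positions are made up for illustration, and file is assumed to be a File object from an <input type="file">.

// Hash a few sampled slices instead of the whole file -- purely illustrative.
function hashSlices(file, done) {
  var spark = new SparkMD5.ArrayBuffer();
  var ranges = [
    [0, 10240],                                                      // first 10 KB
    [Math.floor(file.size / 2), Math.floor(file.size / 2) + 10240],  // middle 10 KB
    [Math.max(0, file.size - 10240), file.size]                      // last 10 KB
  ];
  var i = 0;
  (function next() {
    if (i >= ranges.length) { return done(spark.end()); }
    var reader = new FileReader();
    reader.onload = function (e) {
      spark.append(e.target.result);   // feed each ArrayBuffer into the hash
      i++;
      next();
    };
    reader.readAsArrayBuffer(file.slice(ranges[i][0], ranges[i][1]));
  })();
}

And here is that callback: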

window.callback = function (data) {
  if (localStorage) {
    var oData = utils.url.query.other_data,
        ins = 0;
    if (oData) {
      ins = JSON.parse(oData).index;
    }
    var storedInfo = localStorage.getItem('fileUploadInfos');
    var _hash = location.hash;
    if (window.__hash__) {
      storedInfo = JSON.parse(storedInfo);
      // Remove the record that matches this upload's hash (and iframe index)
      for (var i = 0; i < storedInfo.length; i++) {
        if (oData) {
          if (storedInfo[i].hash == window.__hash__ && ins == storedInfo[i].ins) {
            storedInfo.splice(i, 1);   // remove just this record
            i--;
          }
        } else {
          if (storedInfo[i].hash == window.__hash__) {
            storedInfo.splice(i, 1);
            i--;
          }
        }
      }
      localStorage.removeItem('fileUploadInfos');
      localStorage.setItem('fileUploadInfos', JSON.stringify(storedInfo));
    }
  }
  // ......
  // ...
};

The saved record also carries an index, taken from the optional other_data in the URL. This value exists for extensibility: if later you need to attach extra values, pass other data back through the callback, or upload through multiple iframes and tell the uploads apart, you just add fields to other_data. It is a JSON-formatted value, and the important part is that multiple iframes can share a single callback through other_data.
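
For instance (my own illustration; the page URL, the uploadIframe variable, and the exact shape of other_data are made up here, and utils.url.query is assumed to be the project's query-string helper), the parent page could give each upload iframe its own index:

// Parent page: give every upload iframe a distinct other_data value so the
// shared window.callback can tell the uploads apart. JSON shape is an example.
var otherData = encodeURIComponent(JSON.stringify({ index: 0 }));
uploadIframe.src = '/components/uploader-page?other_data=' + otherData;

// Inside the iframe: recover the per-iframe index that is stored with each
// record in localStorage and sent along with resume requests.
var oData = utils.url.query.other_data;
var ins = oData ? JSON.parse(oData).index : 0;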

To make sure a resumed upload does not keep creating new files on the server (repeated uploads of even the same file would otherwise end up under different names, which causes quite a few problems), you cannot think of it too simply: you have to check not only the MD5 of the file content but also the token, because the token is refreshed on every new upload attempt, so remember to reset it to the saved one when resuming; and you also have to work out how to distinguish uploads coming from multiple iframes.

A cheap way to test, by the way: since the upload runs inside an iframe and is managed as a standalone page, you can open several iframes uploading the same large file at the same time, and resuming still works for each of them.

I am not sure whether I have explained this clearly; if you have other approaches, you are welcome to share them with me.

Finally, to round out the details of large file upload, HTML5 drag-and-drop could also be added if possible. Since the uploader is managed as a standalone page and serves static resources, cross-domain issues are likely to come up; to be continued...

Originally from: Flower House (http://www.famanoder.com/bokes/575ace3880e446c026079232)
