Node.js Study: File Upload

  Recently I needed to build an image upload feature. The server-side developer was still away on Spring Festival leave, so I decided to hack on it myself first and got a rough version working, with Node.js on the backend. When I started looking for information online, most of what I found implemented the upload with the node-formidable plugin. But I wanted to do it by hand, so I started tinkering. This blog post is a record of that.

First, a rough outline of the idea: the goal is to upload an image from the page without a refresh and then display it (later this was extended to uploading any kind of file, but the approach is exactly the same).

On the front end, the first thing that comes to mind for "no refresh" is Ajax, but Ajax can't upload files, so I fell back to an honest form upload. To use a form and still keep the page from refreshing, an iframe does the trick. So the front end needs two pages: the main page index.html that the user interacts with, and a dedicated upload page upload.html. The HTML is as follows:

index.html:

<body>
    What you have uploaded is:<br><br>
    <div class="data">(None)</div>
    <br>
    <button class="choose">Upload something</button>
    <iframe src="upl" frameborder="0" id="upl"></iframe>
</body>

upload.html:

<body>
    <form action="/upload" method="post" enctype="multipart/form-data" accept-charset="utf-8">
        <input type="file" id="data" name="data" />
        <input type="submit" value="Upload" id="sub" />
    </form>
</body>

  When the upload button on index.html is clicked, the JS triggers the click event of the file input inside the iframe's upload page, which brings up the file picker. Once a file is chosen, the change event triggers a click on the upload page's submit button and the file starts uploading. After the upload succeeds, the backend returns a piece of HTML containing a link to the file. index.html reads that link; if it is an image, the image is displayed, otherwise a download link is shown. The JS in index.html is as follows:

window.onload = function () {
    var frame = $("#upl")[0];
    var cd;
    frameinit();
    frame.onload = function () {
        frameinit();
        if ($(cd).find("#path").length > 0) {
            var path = $(cd).find("#path")[0].innerHTML;
            if (/png|gif|jpg/g.test(path)) {
                $(".data").html("<img src='" + path + "'/><br>");
            } else {
                $(".data").html("<a href='" + path + "' target='_blank'>" + path + "</a><br>");
            }
            frame.src = "upl";
        }
    };
    $(".choose").click(function () {
        $(cd).find("#data").click();
    });
    function frameinit() {
        cd = frame.contentDocument.body;
        var img = $(cd).find("#data")[0];
        if (img) {
            img.onchange = function () {
                $(cd).find("#sub").click();
            };
        }
    }
};

  The link returned by the backend is read via the iframe's onload event. The code above is fairly straightforward, so I won't go through it in detail.
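For reference, the HTML the backend writes back into the iframe (it appears in the backend code further down) is just a single element along the lines of <div id="path">./databox/yourfile.png</div> (the file name here is only illustrative), which is why the onload handler above looks for #path.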

Next comes the backend implementation.

First, build an HTTP server. Since there are two pages, plus file downloads and so on, start with the simplest possible routing:

var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
    var imaps = req.url.split("/");
    var maps = [];
    imaps.forEach(function (m) {
        if (m) { maps.push(m); }
    });
    switch (maps[0] || "index") {
        case "index":
            var str = fs.readFileSync("./index.html");
            res.writeHead(200, { 'Content-Type': 'text/html' });
            res.end(str, "utf-8");
            break;
        case "upl":
            var str = fs.readFileSync("./upload.html");
            res.writeHead(200, { 'Content-Type': 'text/html' });
            res.end(str, "utf-8");
            break;
        case "upload":
            break;
        default:
            var path = maps.join("/");
            var value = "";
            var filename = maps[maps.length - 1];
            var checkReg = /^.+\.(gif|png|jpg|css|js)$/;
            if (maps[0] == "databox") {
                checkReg = /.*/;
            }
            if (checkReg.test(filename)) {
                try {
                    value = fs.readFileSync(path);
                } catch (e) {}
            }
            if (value) {
                res.end(value);
            } else {
                res.writeHead(404);
                res.end('');
            }
            break;
    }
}).listen(9010);

  This code is also very simple: the index route points to index.html, upl points to upload.html, and for anything else, links that don't point into databox may only access image, CSS, and JS files, while links into databox may access everything; databox is the folder where uploaded files are stored. The upload route in the code above is the address the file upload is submitted to, so that's where the uploaded file gets handled.
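To try it out, a possible layout would be something like the following; the server file name app.js is my own choice, the rest comes from the code above, and the databox folder has to exist beforehand:

app.js          // the HTTP server and routing code above
index.html      // the main page
upload.html     // the page loaded inside the iframe
databox/        // uploaded files are written here

Running node app.js and opening http://127.0.0.1:9010/index (or just http://127.0.0.1:9010/) in a browser should then show the main page.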

The usual way to collect the data coming in from a POST is:

var chunks = [];
var size = 0;
req.on('data', function (chunk) {
    chunks.push(chunk);
    size += chunk.length;
});
req.on("end", function () {
    var buffer = Buffer.concat(chunks, size);
});

  That buffer holds all the data from the POST, and if we console.log(buffer.toString()) we can see the format of the data that came in:
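Roughly, for a single file field the body looks like this; the boundary string and file name below are only illustrative, the real ones are generated by the browser and taken from the chosen file:

------WebKitFormBoundaryAbCdEf123456
Content-Disposition: form-data; name="data"; filename="test.png"
Content-Type: image/png

...raw file bytes, which show up as garbled characters in the console...
------WebKitFormBoundaryAbCdEf123456--

Every line break in there is actually a \r\n pair.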

  The garbled section in the middle is the actual file data, and everything in front of it is the file's information header. To get at the file data, the non-file parts have to be filtered out, and judging from the console output that filtering is simple: split on \r\n. The file data starts after the fourth \r\n, and at the end the trailing \r\n--WebKitFormBoundary...--\r\n has to be stripped, which can also be located via \r\n. So the code above, filled out, becomes the following:

var chunks = [];
var size = 0;
req.on('data', function (chunk) {
    chunks.push(chunk);
    size += chunk.length;
});
req.on("end", function () {
    var buffer = Buffer.concat(chunks, size);
    if (!size) {
        res.writeHead(404);
        res.end('');
        return;
    }
    var rems = [];
    // locate every \r\n (byte pair 13, 10) separating the headers from the data
    for (var i = 0; i < buffer.length; i++) {
        var v = buffer[i];
        var v2 = buffer[i + 1];
        if (v == 13 && v2 == 10) {
            rems.push(i);
        }
    }
    // file information header
    var picmsg_1 = buffer.slice(rems[0] + 2, rems[1]).toString();
    var filename = picmsg_1.match(/filename=".*"/g)[0].split('"')[1];
    // file data
    var nbuf = buffer.slice(rems[3] + 2, rems[rems.length - 2]);
    var path = './databox/' + filename;
    fs.writeFileSync(path, nbuf);
    console.log("save " + filename + " success");
    res.writeHead(200, { 'Content-Type': 'text/html;charset=utf-8' });
    res.end('<div id="path">' + path + '</div>');
});

  The filtering here analyzes the Buffer directly. At first I converted the buffer to a string to analyze it, but then a problem appeared: after filtering, the data has to be turned back into a Buffer before being written to the file, and what ended up in the file was corrupted. I tried all sorts of encoding changes without luck, and after a long struggle finally found the combination that works: convert with buffer.toString("binary") and, after filtering, turn it back with new Buffer(str, 'binary'). That works, but from the docs it seems the 'binary' encoding for Buffer is deprecated, or at least not recommended. So I decided not to go through a string at all and to analyze the Buffer directly. A quick look at the ASCII table shows that \r\n is the byte pair 13, 10, which a for loop can easily find. Problem solved.
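As a side note (this is just a sketch of an alternative, not what the code above does): newer Node versions also provide buffer.indexOf(), which searches the raw bytes directly, so the same \r\n positions could be collected without the manual loop or any string round-trip, reusing the buffer variable from the snippet above:

// Alternative sketch: collect every CRLF position with Buffer#indexOf.
// It matches on raw bytes, so there are no encoding problems to worry about.
var rems = [];
var pos = buffer.indexOf('\r\n');
while (pos !== -1) {
    rems.push(pos);
    pos = buffer.indexOf('\r\n', pos + 2);
}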

It works.

  With that, file upload seems to be working, but thinking it over there are still quite a few problems. Since the goal is general file upload, not just image upload, what happens if I upload hundreds of MB of data? Reading it all into one buffer and only then processing it would not just be slow; the file data alone would pretty much eat up the memory. So this approach of receiving everything before processing isn't great; it's better to process the data as it comes in and not let it all pile up in memory. So I switched to a stream.

So the processing code is rewritten to handle the data as it is received:

var imgsays = [];
var num = 0;
var isstart = false;
var ws;
var filename;
var path;
req.on('data', function (chunk) {
    var start = 0;
    var end = chunk.length;
    var rems = [];
    for (var i = 0; i < chunk.length; i++) {
        if (chunk[i] == 13 && chunk[i + 1] == 10) {
            num++;
            rems.push(i);
            if (num == 4) {
                start = i + 2;
                isstart = true;
                var str = (new Buffer(imgsays)).toString();
                filename = str.match(/filename=".*"/g)[0].split('"')[1];
                path = './databox/' + filename;
                ws = fs.createWriteStream(path);
            } else if (i == chunk.length - 2) {
                // a \r\n right at the end of the chunk marks the end of the data
                end = rems[rems.length - 2];
                break;
            }
        }
        if (num < 4) {
            imgsays.push(chunk[i]);
        }
    }
    if (isstart) {
        ws.write(chunk.slice(start, end));
    }
});
req.on("end", function () {
    ws.end();
    console.log("save " + filename + " success");
    res.writeHead(200, { 'Content-Type': 'text/html;charset=utf-8' });
    res.end('<div id="path">' + path + '</div>');
});

 The principle is much the same: each chunk is examined as it is received. Once four \r\n sequences have passed, the file header collected so far is parsed to get the file name, a write stream is created, and writing starts. There is also a check for the end of the data: the tail of the data is followed by a \r\n and the closing boundary, so when the tail is detected that trailing information is filtered out.

As a result, an uploaded file can no longer be so big that it blows up the memory.

The code is up on GitHub at https://github.com/whxaxes/upload-test; anyone interested can pull it down from there.

I'm still a front-end novice, so if anything here is wrong, please point it out.
