Node.js Stream Copy File


Transferred from: http://segmentfault.com/a/1190000000519006

The Node.js fs module does not provide a copy method, but we can easily implement one, for example:

var fs = require('fs');
var source = fs.readFileSync('/path/to/source', {encoding: 'utf8'});
fs.writeFileSync('/path/to/dest', source);

This approach reads the entire file into memory and then writes it out. For small text files this is not a big problem; grunt-file-copy, for example, works this way. For large binary files, however, such as audio or video files that are often several GB in size, this method can easily exhaust memory. The ideal approach is to read a part and write a part: no matter how big the file is, as long as time allows, the copy will eventually complete. This is where the concept of a stream comes in.

Picture two buckets connected by a pipe: the file is a bucket of water and the water is the file's contents. The pipe lets the water flow from one bucket into the other, little by little, which is exactly how a large file is copied with streams.

Streams in Node.js are built on top of EventEmitter, and there are many stream implementations, such as (see the sketch after this list):

    • HTTP requests and responses
    • fs read and write streams
    • zlib streams
    • TCP sockets
    • child process stdout and stderr
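To show how these fit together, here is a minimal sketch (not from the original post; the port and file path are assumptions) of an HTTP server that chains an fs read stream, a zlib stream, and an HTTP response with pipe:

var fs = require('fs');
var http = require('http');
var zlib = require('zlib');

// serve a file gzip-compressed: fs read stream -> zlib stream -> HTTP response
http.createServer(function(req, res) {
    res.writeHead(200, { 'Content-Encoding': 'gzip' }); // tell the client the body is gzipped
    fs.createReadStream('/path/to/source').pipe(zlib.createGzip()).pipe(res);
}).listen(8080); // assumed port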

Using streams, the file copy above can be implemented like this:

var fs = require('fs');
var readStream = fs.createReadStream('/path/to/source');
var writeStream = fs.createWriteStream('/path/to/dest');
readStream.on('data', function(chunk) { // when data flows out, write it
    writeStream.write(chunk);
});
readStream.on('end', function() { // when there is no more data, close the write stream
    writeStream.end();
});

There is a problem with the code above: if the write speed cannot keep up with the read speed, unwritten data piles up in the write buffer and memory usage keeps growing. The correct behavior is: after writing a chunk, check whether the write buffer is full; if it is, pause the read stream, and resume reading once the buffer has drained. The code can be modified as follows:

var fs = require('fs');
var readStream = fs.createReadStream('/path/to/source');
var writeStream = fs.createWriteStream('/path/to/dest');
readStream.on('data', function(chunk) { // when data flows out, write it
    if (writeStream.write(chunk) === false) { // if the write buffer is full, pause the read stream
        readStream.pause();
    }
});
writeStream.on('drain', function() { // once the buffer has drained, resume reading
    readStream.resume();
});
readStream.on('end', function() { // when there is no more data, close the write stream
    writeStream.end();
});

Or, more directly, use pipe:

// pipe automatically handles the 'data', 'end' and other events
fs.createReadStream('/path/to/source').pipe(fs.createWriteStream('/path/to/dest'));
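The one-liner above does not handle errors. A minimal sketch (not from the original post; the paths are placeholders) of how you might listen for error and completion events when using pipe:

var fs = require('fs');

var readStream = fs.createReadStream('/path/to/source');
var writeStream = fs.createWriteStream('/path/to/dest');

readStream.on('error', function(err) {   // e.g. the source file does not exist
    console.error('read error: ' + err.message);
});
writeStream.on('error', function(err) {  // e.g. the destination is not writable
    console.error('write error: ' + err.message);
});
writeStream.on('finish', function() {    // all data has been flushed to the destination
    console.log('copy finished');
});

readStream.pipe(writeStream);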

The following is a more complete file-copying example that also reports progress:

var fs = require('fs'),
    path = require('path'),
    out = process.stdout;

var filePath = '/users/chen/movies/game.of.thrones.s04e07.1080p.hdtv.x264-batv.mkv';

var readStream = fs.createReadStream(filePath);
var writeStream = fs.createWriteStream('file.mkv');

var stat = fs.statSync(filePath);
var totalSize = stat.size;
var passedLength = 0;
var lastSize = 0;
var startTime = Date.now();

readStream.on('data', function(chunk) {
    passedLength += chunk.length;
    if (writeStream.write(chunk) === false) {
        readStream.pause();
    }
});

readStream.on('end', function() {
    writeStream.end();
});

writeStream.on('drain', function() {
    readStream.resume();
});

setTimeout(function show() {
    var percent = Math.ceil((passedLength / totalSize) * 100);
    var size = Math.ceil(passedLength / 1000000);
    var diff = size - lastSize;
    lastSize = size;
    out.clearLine();
    out.cursorTo(0);
    out.write('completed ' + size + 'MB, ' + percent + '%, speed: ' + diff * 2 + 'MB/s');
    if (passedLength < totalSize) {
        setTimeout(show, 500);
    } else {
        var endTime = Date.now();
        console.log();
        console.log('total: ' + (endTime - startTime) / 1000 + ' seconds.');
    }
}, 500);

You can save the above code as copy.js and test it.

We use a recursive setTimeout (a setInterval would also work) as an observer: every 500 ms it writes the completed size, percentage, and copy speed to the console, and when the copy is complete it prints the total time spent.

We copied the seventh episode of the fourth season of Game of Thrones in 1080p, about 3.78 GB in size. Because an SSD was used, you can see the speed is quite good, haha~
When the copy is complete, the total time spent is displayed.

Combined with Node.js modules such as readline and process.argv, we can add overwrite prompts, forced overwrites, dynamically specified file paths, and so on, to build a complete copy command. Interested readers can implement it themselves. Once it is done, you can run:

ln -s /path/to/copy.js /usr/local/bin/mycopy

This allows you to use your own mycopy command in place of the system cp command.
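A minimal sketch of such a mycopy wrapper (hypothetical, not from the original post; the argument handling and prompt text are assumptions):

#!/usr/bin/env node
// Hypothetical mycopy sketch: reads source/dest from process.argv and asks
// before overwriting an existing destination via readline.
var fs = require('fs');
var readline = require('readline');

var source = process.argv[2];
var dest = process.argv[3];

if (!source || !dest) {
    console.log('usage: mycopy <source> <dest>');
    process.exit(1);
}

function doCopy() {
    fs.createReadStream(source).pipe(fs.createWriteStream(dest));
}

if (fs.existsSync(dest)) {
    // the destination already exists: ask for confirmation before overwriting
    var rl = readline.createInterface({ input: process.stdin, output: process.stdout });
    rl.question(dest + ' exists, overwrite? (y/n) ', function(answer) {
        rl.close();
        if (answer.toLowerCase() === 'y') doCopy();
    });
} else {
    doCopy();
}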

