This article describes the use of the Stream API in Node.js in detail. If you are interested in Node.js streams, read on.
Basic Introduction
In Node.js, there are two ways to read a file: fs.readFile and fs.createReadStream.
fs.readFile is familiar to every Node.js user: it is easy to understand and easy to use. Its drawback is that it reads all of the data into memory first, so it becomes very inefficient on large files.
fs.createReadStream reads the data through a stream: it splits the file (data) into small chunks and emits specific events for them. We can listen for these events and write handler functions for them. This approach takes more effort to use, but it is very efficient.
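As a minimal sketch of the difference (big.file is a hypothetical file name, not from the original examples), the two approaches look like this:

var fs = require('fs');

// fs.readFile buffers the entire file in memory before the callback fires
fs.readFile('./big.file', function (err, data) {
    if (err) throw err;
    console.log('readFile returned ' + data.length + ' bytes at once');
});

// fs.createReadStream hands the file over in small chunks instead
var rs = fs.createReadStream('./big.file');
rs.on('data', function (chunk) {
    console.log('stream delivered a chunk of ' + chunk.length + ' bytes');
});
rs.on('end', function () {
    console.log('stream finished');
});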
In fact, streams in Node.js are not limited to file handling; they show up elsewhere too: process.stdin/stdout, HTTP, TCP sockets, zlib, and crypto all use them.
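For example, here is a minimal sketch of gzipping a file by piping it through the Transform stream that the built-in zlib module provides (input.txt is an assumed file name):

var fs = require('fs');
var zlib = require('zlib');

// pipe the file through a gzip Transform stream into a new file
fs.createReadStream('./input.txt')
    .pipe(zlib.createGzip())
    .pipe(fs.createWriteStream('./input.txt.gz'));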
This article is a summary of the Stream API in Node.js. I hope it will be useful to you.
Features
Event-based communication
Streams can be connected with the pipe method.
Types
Readable Stream: a readable data stream
Writable Stream: a writable data stream
Duplex Stream: a bidirectional data stream that can be read from and written to at the same time
Transform Stream: a transform data stream, both readable and writable, that can also transform (process) the data passing through it (see the sketch below)
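As a minimal sketch of a custom Transform stream (the Upper class is a made-up example; it assumes only Node's built-in stream and util modules):

var stream = require('stream');
var util = require('util');

// a Transform stream that upper-cases everything passing through it
function Upper(options) {
    stream.Transform.call(this, options);
}
util.inherits(Upper, stream.Transform);

Upper.prototype._transform = function (chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
};

// readable on one side, writable on the other
process.stdin.pipe(new Upper()).pipe(process.stdout);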
Events
Readable stream events
data: triggered when data is read out of the stream. For a stream that has not been explicitly paused, attaching a data event listener switches it into flowing mode, and data is supplied as soon as it becomes available.
end: triggered when there is no more data to read. Note that it should not be confused with writableStream.end(); a writable stream has no end event, only an .end() method.
close: triggered when the underlying data source is closed.
error: triggered when an error occurs while reading data.
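A minimal sketch that listens for all four events on a readable stream (reusing the test1.txt file from the read/write example further below):

var fs = require('fs');
var rs = fs.createReadStream('./test1.txt');

rs.on('data', function (chunk) {
    // attaching this listener also switches the stream into flowing mode
    console.log('got ' + chunk.length + ' bytes');
});
rs.on('end', function () {
    console.log('no more data');
});
rs.on('close', function () {
    console.log('underlying file closed');
});
rs.on('error', function (err) {
    console.error('read failed: ' + err.message);
});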
Writable stream events
drain: after writable.write(chunk) has returned false, this event is triggered once all buffered data has been written out, signalling that it is safe to write again.
finish: triggered after the .end() method has been called and all buffered data has been flushed. Like the end event on a readable stream, it signals that writing has finished.
pipe: triggered when the stream is made the destination of a pipe call.
unpipe: triggered when the stream is removed as the destination of a pipe call.
error: triggered when an error occurs while writing data.
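The drain event is the basis of back-pressure handling. A minimal sketch (the file name and the line count are arbitrary):

var fs = require('fs');
var ws = fs.createWriteStream('./test2.txt');

var i = 0;
function writeSome() {
    var ok = true;
    while (i < 1000000 && ok) {
        // write() returns false once the internal buffer is full
        ok = ws.write('line ' + i + '\n');
        i++;
    }
    if (i < 1000000) {
        // wait for the buffer to flush before writing more
        ws.once('drain', writeSome);
    } else {
        ws.end();
    }
}

ws.on('finish', function () {
    console.log('all data flushed');
});
writeSome();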
States
A readable stream has two states: flowing and paused. The state can be changed in the following ways:
Paused -> flowing
Attach a data event listener
Call the resume method
Call the pipe method
Note: if the stream switches to flowing mode while there is neither a data event listener nor a pipe destination, the data will be lost.
Flowing -> paused
If the pipe method has no destination, call the pause method.
If the pipe method has destinations, remove all data event listeners and call the unpipe method to remove every pipe destination.
Note: removing the data event listeners alone does not automatically pause the stream. Also, when pause is called while pipe destinations exist, the stream is not guaranteed to stay paused; once a destination requests data, the stream may continue supplying it.
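A minimal sketch of switching between the two states (the one-second delay is arbitrary):

var fs = require('fs');
var rs = fs.createReadStream('./test1.txt');

rs.on('data', function (chunk) {
    console.log('read ' + chunk.length + ' bytes, pausing for one second');
    rs.pause();  // flowing -> paused
    setTimeout(function () {
        rs.resume();  // paused -> flowing
    }, 1000);
});

rs.on('end', function () {
    console.log('done');
});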
Usage
Read/write files
var fs = require('fs');

// create a readable stream
var rs = fs.createReadStream('./test1.txt');

// create a writable stream
var ws = fs.createWriteStream('./test2.txt');

// listen for the readable stream's end event
rs.on('end', function () {
    console.log('read text1.txt successfully!');
});

// listen for the writable stream's finish event
ws.on('finish', function () {
    console.log('write text2.txt successfully!');
});

// switch the readable stream into flowing mode and pipe it into the writable stream
rs.pipe(ws);

Read a CSV file and upload the data (I have used this in production):

var fs = require('fs');
var es = require('event-stream');
var csv = require('csv');

var parser = csv.parse();
var transformer = csv.transform(function (record) {
    return record.join(',');
});

var data = fs.createReadStream('./demo.csv');

data.pipe(parser)
    .pipe(transformer)
    // process the data handed down by the previous stream
    .pipe(es.map(function (data, callback) {
        upload(data, function (err) {
            callback(err);
        });
    }))
    // wait for the previous stream to end
    .pipe(es.wait(function (err, body) {
        process.stdout.write('done!');
    }));
More Methods
You can refer to https://github.com/jeresig/node-stream-playground; open the sample site and click "add stream" to see the results.
Common pitfalls
Writing to a file with rs.pipe(ws) does not append the contents of rs to the end of ws; instead, the contents of rs directly overwrite the original contents of ws (a sketch of appending instead follows this list).
A stream that has ended or closed cannot be reused; you must recreate it.
The pipe method returns the target stream. For example, a.pipe(b) returns b, so when attaching an event listener, make sure the object you are listening on is the one you intend.
If you want to listen on several streams while chaining them with the pipe method, write it like this:
data
    .on('end', function () {
        console.log('data end');
    })
    .pipe(a)
    .on('end', function () {
        console.log('a end');
    })
    .pipe(b)
    .on('end', function () {
        console.log('b end');
    });
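If appending is what you want, a minimal sketch is to open the target with the 'a' flag (reusing the file names from the read/write example above):

var fs = require('fs');
var rs = fs.createReadStream('./test1.txt');

// the 'a' flag opens the file for appending instead of truncating it
var ws = fs.createWriteStream('./test2.txt', { flags: 'a' });
rs.pipe(ws);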
Common Class Libraries
event-stream: working with it has a functional-programming feel.
awesome-nodejs#streams: I have not used any other stream libraries myself, so if you need one, look here.
That covers the use of the Stream API in Node.js. I hope you find it helpful.