Node.js Stream: Basics


Bingbin · 2016-07-08 11:51

Background

When building a more complex system, it is often decomposed into several functionally independent parts. The interfaces of these parts follow certain conventions and are connected in some way to accomplish more complex tasks together. For example, the shell connects parts through the pipe operator (|), and its input and output conventions are text streams.

In Node.js, the built-in stream module implements similar functionality, with the parts connected via .pipe().

Given that systematic articles on stream are still relatively scarce, while more and more open source tools are built on it, this series of articles introduces the relevant content from the following aspects:

    1. Basic types of streams, and basic usage of stream modules
    2. How streaming and back pressure work
    3. How to develop a streaming program, including an analysis of Gulp and browserify, and a practical example.

This article is the first in the series.

Four types of streams

Stream provides the following four types of streams:

var Stream = require('stream')

var Readable = Stream.Readable
var Writable = Stream.Writable
var Duplex = Stream.Duplex
var Transform = Stream.Transform

Using a stream enables streaming data processing, such as:

var fs = require('fs')

// `fs.createReadStream` creates a `Readable` object to read the contents
// of `bigFile` and output them to standard output.
// Using `fs.readFile` instead might fail because the file is too large.
fs.createReadStream(bigFile).pipe(process.stdout)
Readable

Create a readable stream.

Example: Streaming consumes data from an iterator.

'use strict'

const Readable = require('stream').Readable

class ToReadable extends Readable {
  constructor (iterator) {
    super()
    this.iterator = iterator
  }

  // Subclasses need to implement this method.
  // This is the logic for producing data.
  _read () {
    const res = this.iterator.next()
    if (res.done) {
      // The data source is exhausted; call `push(null)` to signal the end of the stream
      return this.push(null)
    }
    setTimeout(() => {
      // Add data to the stream via the `push` method
      this.push(res.value + '\n')
    }, 0)
  }
}

module.exports = ToReadable

In actual use, new ToReadable(iterator) returns a readable stream, and downstream consumers can stream the data held in the iterator.

const iterator = function (limit) {
  return {
    next: function () {
      if (limit--) {
        return { done: false, value: limit + Math.random() }
      }
      return { done: true }
    }
  }
}(1e10)

const readable = new ToReadable(iterator)

// Listen for the `data` event to get one chunk of data at a time
readable.on('data', data => process.stdout.write(data))

// Fired when all the data has been read out
readable.on('end', () => process.stdout.write('DONE'))

Executing the above code continuously writes 10 billion random numbers to the standard output stream.

To create a readable stream this way, you inherit from Readable and implement the _read method.

    • The _read method contains the logic for reading data from the underlying source, i.e. the logic for producing data.
    • Inside _read, call push(data) to place data into the readable stream for downstream consumption.
    • Inside _read, push(data) can be called synchronously or asynchronously.
    • When all the data has been produced, push(null) must be called to end the readable stream.
    • Once the stream has ended, push(data) can no longer be called to add data.

You can consume a readable stream by listening for data events.

    • After a data listener is attached for the first time, the readable stream keeps invoking _read() and emits the resulting data through data events.
    • The first data event is triggered on the next tick, so it is safe to place logic that must run before the first data output after the listener is attached (within the same tick).
    • The end event is triggered once all the data has been consumed.

In the example above, process.stdout represents the standard output stream and is actually a writable stream. The use of writable streams is described in the next section.

Writable

Creates a writable stream.

Like readable streams, writable stream classes can be created through inheritance; the difference is that you implement a _write(data, enc, next) method instead of _read().

In some simple cases you don't need to create a stream class; a single stream object is enough, which can be created as follows:

const Writable = require('stream').Writable

const writable = Writable()

// Implement the `_write` method.
// This is the logic for writing data to the underlying destination.
writable._write = function (data, enc, next) {
  // Write the data in the stream to the underlying destination
  process.stdout.write(data.toString().toUpperCase())
  // When the write is complete, call `next()` to signal that the next chunk can be passed in
  process.nextTick(next)
}

// Fired when all data has been written to the underlying destination
writable.on('finish', () => process.stdout.write('DONE'))

// Write chunks to the stream
writable.write('a' + '\n')
writable.write('b' + '\n')
writable.write('c' + '\n')

// Call the `end` method when no more data will be written to the stream
writable.end()
    • Upstream writes data into a writable stream by calling writable.write(data). write() invokes _write() to write data to the underlying destination.
    • In _write, once the data has been successfully written to the underlying destination, next(err) must be called to tell the stream to start processing the next chunk.
    • next can be called synchronously or asynchronously.
    • Upstream must call writable.end(data) to end the writable stream; the data argument is optional. After that, write can no longer be called to add new data.
    • After end is called, the finish event is triggered once all underlying writes have completed.
Duplex

Creates a stream that is both readable and writable.

Duplex is actually a class of stream that inherits from both Readable and Writable.
Therefore, a Duplex object can be used as a readable stream (which requires implementing the _read method) as well as a writable stream (which requires implementing the _write method).

var Duplex = require('stream').Duplex

var duplex = Duplex()

// Underlying read logic of the readable side
duplex._read = function () {
  this._readNum = this._readNum || 0
  if (this._readNum > 1) {
    this.push(null)
  } else {
    this.push('' + (this._readNum++))
  }
}

// Underlying write logic of the writable side
duplex._write = function (buf, enc, next) {
  // a, b
  process.stdout.write('_write ' + buf.toString() + '\n')
  next()
}

// 0, 1
duplex.on('data', data => console.log('ondata', data.toString()))
duplex.write('a')
duplex.write('b')
duplex.end()

Because the code above implements the _read method, you can listen for data events to consume the data the Duplex produces.
At the same time, because it implements the _write method, it can also be used as a downstream consumer of data.

Because it is both readable and writable, a Duplex is said to have two ends: the writable end and the readable end.
The interface of the writable end is consistent with Writable and serves as the downstream; the interface of the readable end is consistent with Readable and serves as the upstream.

Transform

In the example above, the data in the readable stream (0, 1) is entirely separate from the data written on the writable side ('a', 'b'). With Transform, however, the data written on the writable side is transformed and then automatically added to the readable side.
Transform inherits from Duplex and already implements the _read and _write methods; the user only needs to implement a _transform method.

'use strict'

const Transform = require('stream').Transform

class Rotate extends Transform {
  constructor (n) {
    super()
    // Rotate each letter by `n` positions
    this.offset = (n || 13) % 26
  }

  // Transform the data written to the writable side and add it to the readable side
  _transform (buf, enc, next) {
    var res = buf.toString().split('').map(c => {
      var code = c.charCodeAt(0)
      if (c >= 'a' && c <= 'z') {
        code += this.offset
        if (code > 'z'.charCodeAt(0)) {
          code -= 26
        }
      } else if (c >= 'A' && c <= 'Z') {
        code += this.offset
        if (code > 'Z'.charCodeAt(0)) {
          code -= 26
        }
      }
      return String.fromCharCode(code)
    }).join('')

    // Call the `push` method to add the transformed data to the readable side
    this.push(res)
    // Call the `next` method to get ready to process the next chunk
    next()
  }
}

var transform = new Rotate(3)
transform.on('data', data => process.stdout.write(data))
transform.write('hello, ')
transform.write('world!')
transform.end()

// khoor, zruog!
objectMode

In the examples in the previous sections, you often see calls to data.toString(). Is this toString() call necessary?
This section describes how to control the type of the data in a stream, which naturally answers that question.

In the shell, a pipe (|) connects the upstream and the downstream. The upstream output is a text stream (standard output), and the downstream input is also a text stream (standard input). The streams described in this article behave the same way by default.

For a readable stream, the data passed to push(data) can only be of type String or Buffer, while the data emitted by data events is of type Buffer. For a writable stream, the data passed to write(data) can likewise only be of type String or Buffer, while the data passed to the _write(data) call is of type Buffer.

That is, data in a stream defaults to the Buffer type. Produced data is converted to a Buffer as soon as it is put into the stream, before being consumed; written data is likewise converted to a Buffer before being passed to the underlying write logic.

But each constructor accepts a configuration object, and once the objectMode option is set to true, the stream follows "what you put in is what you get out".

Readable without objectMode set:

const Readable = require('stream').Readable

const readable = Readable()

readable.push('a')
readable.push('b')
readable.push(null)

readable.on('data', data => console.log(data))

Output:

<Buffer 61>
<Buffer 62>

Readable with objectMode set:

const Readable = require('stream').Readable

const readable = Readable({ objectMode: true })

readable.push('a')
readable.push('b')
readable.push({})
readable.push(null)

readable.on('data', data => console.log(data))

Output:

a
b
{}

As you can see, with objectMode set, the data passed to push(data) is output as-is. At this point, data of any type can be produced.

Preview

The stream series consists of three articles in total:

    • Part one: the basics, introducing the fundamental usage of the stream interfaces.
    • Part two: advanced topics, focusing on how stream supports streaming data processing under the hood, and on its back-pressure mechanism.
    • Part three: practice, describing how to design programs with stream; it distills two design patterns from browserify and gulp, and builds, on top of stream, an example application that automatically generates a changelog for a git repository.

