Notes on Writing Stream-Based Utilities in Node.js

By default, a Node.js readable/writable stream transfers data as Buffer objects, unless you set another encoding on it. Streams in Node.js are very powerful: they support processing potentially huge files and abstract away data handling and transfer in many scenarios. Because they are so convenient, in practice we often write utility functions or libraries on top of them, but it is easy to overlook some characteristics of streams themselves, so the resulting functions or libraries fail to behave as expected in certain cases, or quietly plant landmines. This article presents two stream-based utilities I use in my own work.

1. Guard against EventEmitter memory leaks

In a function that may be called many times, you sometimes need to add event listeners to a stream to perform some operation. In that case, be careful about the memory leak caused by repeatedly adding listeners:

'use strict';

const fs = require('fs');
const co = require('co');

function getSomeDataFromStream (stream) {
  let data = stream.read();
  if (data) return Promise.resolve(data);
  if (!stream.readable) return Promise.resolve(null);
  return new Promise((resolve, reject) => {
    stream.once('readable', () => resolve(stream.read()));
    stream.on('error', reject);
    stream.on('end', resolve);
  });
}

let stream = fs.createReadStream('/Path/to/a/big/file');

co(function *() {
  let chunk;
  while ((chunk = yield getSomeDataFromStream(stream)) !== null) {
    console.log(chunk);
  }
}).catch(console.error);

In the code above, the getSomeDataFromStream function listens for the error and end events so that the Promise settles when the stream errors out or runs out of data. However, when we run this code we soon see a warning in the console: (node) warning: possible EventEmitter memory leak detected. 11 error listeners added. Use emitter.setMaxListeners() to increase limit. Every call to this function adds one more error listener and one more end listener to the stream passed in. To avoid this potential memory leak, we must ensure that when each invocation completes, it removes all the listeners it added for that call, keeping the function pollution-free:

function getSomeDataFromStream (stream) {
  let data = stream.read();
  if (data) return Promise.resolve(data);
  if (!stream.readable) return Promise.resolve(null);
  return new Promise((resolve, reject) => {
    stream.once('readable', onData);
    stream.on('error', onError);
    stream.on('end', onEnd);

    function onData () {
      done();
      resolve(stream.read());
    }

    function onError (err) {
      done();
      reject(err);
    }

    function onEnd () {
      done();
      resolve(null);
    }

    function done () {
      stream.removeListener('readable', onData);
      stream.removeListener('error', onError);
      stream.removeListener('end', onEnd);
    }
  });
}

2. Ensure the utility's callback is invoked only after all data has been processed

A utility function often takes a callback parameter and invokes it with some value once all the data in the stream has been handled. The usual place to invoke the callback is the stream's end event, but if the per-chunk handler performs a time-consuming asynchronous operation, the callback may fire before all the data has actually been processed:

'use strict';

const fs = require('fs');

let stream = fs.createReadStream('/Path/to/a/big/file');

function processSomeData (stream, callback) {
  stream.on('data', (data) => {
    // perform some asynchronous, time-consuming operation on the data
    setTimeout(() => console.log(data), 2000);
  });
  stream.on('end', () => {
    // ...
    callback();
  });
}

processSomeData(stream, () => console.log('end'));

Here the callback may run before the data has been fully processed, because the stream's end event fires as soon as the data has been read, not when it has been handled. We therefore need to check for ourselves whether all the data has been processed:

function processSomeData (stream, callback) {
  let count = 0;
  let finished = 0;
  let isEnd = false;

  stream.on('data', (data) => {
    count++;
    // perform some asynchronous, time-consuming operation on the data
    setTimeout(() => {
      console.log(data);
      finished++;
      check();
    }, 2000);
  });

  stream.on('end', () => {
    isEnd = true;
    // ...
    check();
  });

  function check () {
    if (count === finished && isEnd) callback();
  }
}

In this way, the callback will be triggered after all data is processed.

