Details about the Node.js File System (fs) and Streams


I. Introduction

This article describes the parameters and usage of some APIs of the Node.js File System (fs) module and of streams.

II. Contents

The file system section covers the following methods:

1. fs.readFile

2. fs.writeFile

3. fs.open

4. fs.read

5. fs.stat

6. fs.close

7. fs.mkdir

8. fs.rmdir

9. fs.readdir

10. fs.unlink

The stream section covers the four stream types (Readable, Writable, Duplex, Transform) and the common events of the stream object.

III. Introduction to the file system (fs)

1. fs.readFile

The readFile method reads a file's contents asynchronously.

var fs = require('fs');
fs.readFile('a.txt', function (err, data) {
    if (err) {
        return console.error(err);
    } else {
        console.log("Asynchronous read: " + data.toString());
    }
});

2. fs.writeFile

writeFile writes a file asynchronously. Syntax: fs.writeFile(file, data[, options], callback)

var fs = require('fs');
console.log("Preparing to write the file");
fs.writeFile('input.txt', "Content to be written", function (err) {
    if (err) {
        return console.error(err);
    } else {
        console.log("Written successfully");
    }
});

3. fs.open()

Opens a file asynchronously. Syntax: fs.open(path, flags[, mode], callback)

var fs = require("fs");

// Open the file asynchronously
console.log("Preparing to open the file");
fs.open('a.txt', 'r+', function (err, fd) {
    // 'r+' opens the file in read/write mode; fd is the returned file descriptor
    if (err) {
        return console.error(err);
    }
    console.log("File opened successfully!");
});

4. fs.read()

This method reads a file asynchronously. Syntax: fs.read(fd, buffer, offset, length, position, callback)

var fs = require("fs");
var buf = new Buffer(1024);

console.log("Preparing to open the file!");
fs.open('at.txt', 'r+', function (err, fd) {
    if (err) {
        return console.error(err);
    }
    fs.read(fd, buf, 0, buf.length, 0, function (err, bytes) {
        if (err) {
            console.log(err);
        }
        // Only output the bytes that were actually read
        if (bytes > 0) {
            console.log(buf.slice(0, bytes).toString());
        }
    });
});

5. fs.stat()

This method obtains file information asynchronously. Syntax: fs.stat(path, callback)

var fs = require("fs");
fs.stat('fs.js', function (err, stats) {
    console.log(stats.isFile());    // true
});

The Stats instance passed to the callback has many methods, such as stats.isFile() to check whether the path is a file, stats.isDirectory() to check whether it is a directory, and so on.

6. fs.close()

fs.close() closes a file asynchronously. Syntax: fs.close(fd, callback). The parameters are as follows:

fd: the file descriptor returned by the fs.open() method.

callback: callback function; it receives no arguments other than a possible error.
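A minimal sketch of closing a file descriptor, assuming the a.txt file used in the earlier examples exists:

var fs = require("fs");

fs.open('a.txt', 'r', function (err, fd) {
    if (err) {
        return console.error(err);
    }
    console.log("File opened; now closing it");
    // Close the file descriptor returned by fs.open()
    fs.close(fd, function (err) {
        if (err) {
            return console.error(err);
        }
        console.log("File closed successfully");
    });
});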

7. fs.mkdir()

This method creates a directory. Syntax: fs.mkdir(path[, mode], callback). The parameters are as follows:

path: the directory path.

mode: directory permissions. The default value is 0777.

callback: callback function.

var fs = require("fs");

console.log("Creating directory /test/");
fs.mkdir("/test/", function (err) {
    if (err) {
        return console.error(err);
    }
    console.log("The /test directory was created successfully.");
});

8. fs.rmdir()

Deletes a directory. Syntax: fs.rmdir(path, callback)
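A minimal sketch, assuming the /test directory created in the previous example exists and is empty:

var fs = require("fs");

console.log("Removing directory /test/");
// Remove the (empty) directory created earlier with fs.mkdir()
fs.rmdir("/test/", function (err) {
    if (err) {
        return console.error(err);
    }
    console.log("The /test directory was removed successfully.");
});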

9. fs.readdir()

This method reads the contents of a directory. Syntax: fs.readdir(path, callback). The callback has two parameters: the first is err, and the second is files, an array of the file names in the directory.

var fs = require("fs");

console.log("Listing the /tmp directory");
fs.readdir("/tmp/", function (err, files) {
    if (err) {
        return console.error(err);
    }
    files.forEach(function (file) {
        console.log(file);
    });
});

10. fs.unlink()

This method deletes a file. Syntax: fs.unlink(path, callback)

var fs = require("fs");

console.log("About to delete the file!");
fs.unlink('input.txt', function (err) {
    if (err) {
        return console.error(err);
    }
    console.log("The file was deleted successfully!");
});

IV. Stream types and events

1. Stream: a stream is an abstract interface. There are four stream types:

  1. Readable: a stream you can read from;
  2. Writable: a stream you can write to;
  3. Duplex: both readable and writable;
  4. Transform: data is written in, transformed, and the result is read back out (see the sketch after the event list below).

All stream objects are EventEmitter instances. Common events include:

  1. data: emitted when there is data available to read;
  2. end: emitted when there is no more data to read;
  3. error: emitted when an error occurs;
  4. finish: emitted when all data has been flushed to the underlying system.
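Of the four types, Transform is the only one not demonstrated directly in the examples below (the zlib streams used later are Transform streams), so here is a minimal sketch of a custom Transform stream that upper-cases whatever is written to it; this example and its behavior are illustrative assumptions, not part of the original article:

var stream = require('stream');

// Illustrative Transform sketch: data written in is upper-cased, then read back out
var upperCase = new stream.Transform({
    transform: function (chunk, encoding, callback) {
        this.push(chunk.toString().toUpperCase());
        callback();
    }
});

upperCase.on('data', function (chunk) {
    console.log(chunk.toString());    // HELLO TRANSFORM
});

upperCase.write('hello transform');
upperCase.end();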

2. Reading data from a stream

var fs = require("fs");
var data = '';

// Create a readable stream
var readerStream = fs.createReadStream('input.txt');

// Set the encoding to utf8
readerStream.setEncoding('utf8');

// Handle the stream events --> data, end, and error
readerStream.on('data', function (chunk) {
    data += chunk;
});

readerStream.on('end', function () {
    console.log(data);
});

readerStream.on('error', function (err) {
    console.log(err.stack);
});

console.log("Program execution completed");

3. Writing to a stream

var fs = require("fs");
var data = 'Write stream data';

// Create a writable stream that writes to the file output.txt
var writerStream = fs.createWriteStream('output.txt');

// Write the data using utf8 encoding
writerStream.write(data, 'utf8');

// Mark the end of the file
writerStream.end();

// Handle the stream events --> finish and error
writerStream.on('finish', function () {
    console.log("Write complete.");
});

writerStream.on('error', function (err) {
    console.log(err.stack);
});

console.log("Program execution completed");

4. Pipes (pipe)

var fs = require("fs");

// Create a readable stream
var readerStream = fs.createReadStream('input.txt');

// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');

// Pipe the read and write operations:
// read the contents of input.txt and write them into output.txt
readerStream.pipe(writerStream);

console.log("Program execution completed");

5. Chained streams

Chaining is a mechanism for connecting the output of one stream to another stream, building up a chain of multiple stream operations. Chained streams are generally used with pipes.

Next we will use pipelines and chains to compress and decompress files.

// Compress
var fs = require("fs");
var zlib = require('zlib');

// Compress the input.txt file into input.txt.gz
fs.createReadStream('input.txt')
    .pipe(zlib.createGzip())
    .pipe(fs.createWriteStream('input.txt.gz'));

console.log("File compression completed.");

// Decompress
var fs = require("fs");
var zlib = require('zlib');

// Decompress input.txt.gz to input.txt
fs.createReadStream('input.txt.gz')
    .pipe(zlib.createGunzip())
    .pipe(fs.createWriteStream('input.txt'));

console.log("File decompression completed.");

V. Summary

That is all the content of this article. I hope it is helpful for your learning.
