1. Grunt-Gulp
Any early mention of build tools inevitably recalls a long history: Make, Ant, and later Maven, which made building Java projects more convenient. Node spawned a number of automation tools such as Bower, Yeoman, and Grunt, and today any mention of front-end build tools naturally brings Grunt to mind. In the Java world, Maven provides powerful package dependency management and build lifecycle management.
In the JavaScript world, Grunt.js is an automated task runner based on Node.js; Grunt v0.4.0 was released on February 18, 2013. Fractal, a company that has actively contributed to several popular Node.js modules, released a new build system, gulp, last year, hoping to distill Grunt's essence and replace it as the most popular JavaScript task runner.
2. Features of Grunt
Grunt has a complete community and a rich set of plugins.
It is easy to learn: you can install plugins and configure them as you please.
It requires no advanced concepts and no special experience.
Mature
– Grunt plugin data: according to community statistics, there are 3,439 plugins in total, 49 of them official.
Easy to use
– Grunt is rich in plugins: many common tasks already have ready-made Grunt plugins, and there are many third-party plugins as well, such as CoffeeScript, Handlebars, Jade, JSHint, Less, RequireJS, Sass, and Stylus, all of which can be configured by following their documentation.
3. Similarities and differences between Gulp and Grunt
Easy to use: By preferring code over configuration, gulp keeps simple things simple and makes complex tasks manageable.
Efficient: By harnessing the power of Node.js streams, builds run faster without writing intermediate files to disk.
High quality: Gulp's strict plugin guidelines ensure plugins stay simple and work the way you expect.
Easy to learn: With a minimal API surface, you can pick up gulp in very little time. A build works just as you would imagine it: a series of stream pipelines.
Easy to use
Gulp is more concise than Grunt. It follows a code-over-configuration strategy, so maintaining a gulpfile feels more like writing code.
Efficient
Gulp's core design is based on the Unix concept of streams: tasks are connected by pipes, without writing intermediate files.
High quality
Each gulp plugin performs only a single function, one of the Unix design principles, and functions are composed through streams to accomplish complex tasks. For example, the Grunt imagemin plugin not only compresses images but also includes caching; in gulp, caching is a separate plugin that other plugins can use, which makes plugins more reusable. There are currently 673 plugins in the official list.
Easy to learn
Gulp's core API consists of only five methods; master those five and you have learned gulp, after which you can combine them with pipeline streams into the tasks you want.
4. The Yeoman team's discussion
Last December, the Yeoman team opened an issue on GitHub to discuss using gulp instead of Grunt. They point out that gulp is a new stream-based pipeline build system that requires very little configuration and is faster.
5. Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      'dist/all.js': ['src/*.js']
    },
    uglify: {
      'dist/all.min.js': ['dist/all.js']
    },
    jshint: {
      files: ['gruntfile.js', 'src/*.js']
    },
    watch: {
      files: ['gruntfile.js', 'src/*.js'],
      tasks: ['jshint', 'concat', 'uglify']
    }
  });

  // Load our plugins
  grunt.loadNpmTasks('grunt-contrib-jshint');
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-contrib-watch');

  // Register default task
  grunt.registerTask('default', ['jshint', 'concat', 'uglify']);
};
6. Gulpfile.js
var gulp = require('gulp');
var jshint = require('gulp-jshint');
var concat = require('gulp-concat');
var rename = require('gulp-rename');
var uglify = require('gulp-uglify');

// Lint JS
gulp.task('lint', function () {
  return gulp.src('src/*.js')
    .pipe(jshint())
    .pipe(jshint.reporter('default'));
});

// Concat & minify JS
gulp.task('minify', function () {
  return gulp.src('src/*.js')
    .pipe(concat('all.js'))
    .pipe(gulp.dest('dist'))
    .pipe(rename('all.min.js'))
    .pipe(uglify())
    .pipe(gulp.dest('dist'));
});

// Watch our files
gulp.task('watch', function () {
  gulp.watch('src/*.js', ['lint', 'minify']);
});

// Default
gulp.task('default', ['lint', 'minify', 'watch']);
7. Differences
Streams: gulp is a stream-based build system that follows a code-over-configuration strategy.
Plugins: gulp plugins are purer and single-purpose; each plugin insists on doing only one thing.
Code over configuration: maintaining a gulpfile is more like writing code, and since gulp follows the CommonJS specification, it is no different from writing an ordinary Node program.
No intermediate files are produced.
8. Differences in I/O processes
When Grunt performs I/O, intermediate temporary files are produced: some tasks generate temporary files, other tasks then process those temporary files, and only afterwards is the final build output written.
The advantage of gulp is that files are processed as streams, with multiple tasks and operations connected through pipes, so there is only a single I/O pass and the process is clearer and purer, as the sketch below illustrates.
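A minimal sketch of that single I/O pass (the plugin names gulp-less and gulp-minify-css are just illustrative choices, not part of the article's example): the chain reads from disk once at the start, writes once at the end, and everything in between happens in memory as a stream.

var gulp = require('gulp');
var less = require('gulp-less');             // assumed plugin, for illustration
var minifyCss = require('gulp-minify-css'); // assumed plugin, for illustration

gulp.task('styles', function () {
  return gulp.src('src/*.less')   // single read from disk
    .pipe(less())                 // compiled in memory, no temporary file
    .pipe(minifyCss())            // minified in memory, no temporary file
    .pipe(gulp.dest('dist'));     // single write to disk
});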
9. The core of gulp: streams
Gulp simplifies task authoring through streams and a code-over-configuration strategy. It looks a lot like the "jQuery" style: operations are chained together to form a build task. Streams existed back in the early days of Unix, and they play an equally important role in the Node.js ecosystem: just as *nix abstracts almost every device as a file, Node abstracts almost every I/O operation as a stream operation. So writing tasks with gulp can also be seen simply as writing Node.js code. Because it uses streams, gulp eliminates intermediate files and writes only the final output to disk, which makes the whole process faster.
Doug McIlroy, then head of the Bell Labs Computing Science Research Center and the inventor of the Unix pipe, summarized the Unix philosophy as follows:
This is the Unix philosophy: Write programs that do one thing and do it well. Write programs to work together. Write programs to handle text streams, because that is a universal interface.
Features of stream-based modules:
Write modules that do one thing and do it well.
Write modules that work together.
Write modules that handle events and streams.
An example of a Unix pipeline:
tput setaf 88; whoami | figlet | tr _ ... | tr \\ ' | tr \| ¡ | tr / √
10. Why should I use a stream?
I/O operations in Node are asynchronous, so disk reads and writes, as well as network operations, all require a callback function to be passed in.
var http = require('http');
var fs = require('fs');

var server = http.createServer(function (req, res) {
  fs.readFile(__dirname + '/data.txt', function (err, data) {
    res.end(data);
  });
});
server.listen(8000);
This Node.js application is simple; presumably everyone who has studied Node has done this kind of exercise, and it could be called Node's "hello world". The code has no functional problems: it runs normally under Node, and a browser or any other HTTP client can access port 8000 on the host and read the data.txt file. But this approach hides a potential problem: Node buffers the entire data.txt file in memory before responding to the client. As client requests increase, the memory consumption becomes staggering, and the client has to wait a long transfer time before getting a result. Let's look at another way, using a stream:
var http = require('http');
var fs = require('fs');

var server = http.createServer(function (req, res) {
  var stream = fs.createReadStream(__dirname + '/data.txt');
  stream.pipe(res);
});
server.listen(8000);
The big change here is using fs.createReadStream to create a stream and the stream's pipe method to respond to the client's request. With a stream, Node can start sending a response to the client as soon as a chunk of data.txt has been read, without the server buffering the whole file and without the client waiting.
Types of streams in Node:
Readable
Writable
Duplex
Transform (a duplex stream that transforms the data passing through it)
A stream can be readable, writable, or both (duplex). All streams are instances of EventEmitter, but each also has its own methods and properties depending on whether it is Readable, Writable, or Duplex.
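As a minimal sketch of the last type (not taken from the article), a Transform stream built with Node's stream module reads data on one side, modifies it, and emits it on the other:

var Transform = require('stream').Transform;

// A transform stream that upper-cases whatever passes through it.
var upperCaser = new Transform();
upperCaser._transform = function (chunk, encoding, callback) {
  this.push(chunk.toString().toUpperCase());
  callback();
};

// readable | transform | writable, just like a Unix pipeline
process.stdin.pipe(upperCaser).pipe(process.stdout);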
Dependencies
Liftoff
Through2
Vinyl, vinyl-fs
Orchestrator
Liftoff
The problem Liftoff solves is that of a CLI tool installed globally that must support multiple projects, each with its own configuration file: if the current directory has no configuration file, it can walk up to parent directories to find one, and when the command is run outside the project directory a configuration file path can be specified explicitly. It is on top of Liftoff that gulp supports multiple projects with multiple gulpfiles, and that gulp can be run with an explicitly specified gulpfile path.
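A minimal sketch of that behavior, modeled on Liftoff's documented usage (the tool name "hacker" and the logged fields are illustrative, not part of gulp itself):

var Liftoff = require('liftoff');

// A CLI named 'hacker' looks for a hackerfile.js, walking up
// parent directories from the current working directory if necessary.
var cli = new Liftoff({ name: 'hacker' });

cli.launch({}, function (env) {
  // env.configPath: the configuration file that was found (if any)
  // env.modulePath: the locally installed module to hand control to
  console.log(env.configPath, env.modulePath);
});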
Through2
Through2 is a thin wrapper around Node's streams2 Transform, written to avoid the annoyance of subclassing. Creating a stream from a single function is easier than the tedious setup of the prototype chain: extending the Transform class, calling its constructor so that the buffering settings are initialized correctly, and implementing _transform and _flush.
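A minimal sketch (not from the article) of how through2 turns a single function into a transform stream:

var through2 = require('through2');

// Equivalent to subclassing stream.Transform, but with just a function.
var upperCaser = through2(function (chunk, enc, callback) {
  this.push(chunk.toString().toUpperCase());
  callback();
});

process.stdin.pipe(upperCaser).pipe(process.stdout);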
Vinyl
Vinyl is a very simple metadata object used to describe a file. A vinyl object has two main properties: path and contents. A file is not necessarily something on your hard drive; it may be something you host on S3 or FTP, or even on Dropbox, and vinyl can describe files from all of these sources. It gives you a concise way to describe a file, but if you need to access files on the local file system you also need a so-called vinyl adapter. An adapter exposes methods such as .src(globs), .dest(folder), and .watch(globs, fn), where globs is a glob pattern used to match paths.
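A minimal sketch of a vinyl file object (the path and contents are made up for illustration):

var File = require('vinyl');

var file = new File({
  cwd: '/',
  base: '/src/',
  path: '/src/hello.js',
  contents: new Buffer('console.log("hello");')
});

console.log(file.relative);         // 'hello.js' (path relative to base)
console.log(String(file.contents)); // the file contents as a string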
Orchestrator
Orchestrator is a Node module responsible for defining, sequencing, and executing task dependencies, much like the AMD module loaders we use today, and by default it runs tasks with as much parallelism as possible.
In fact gulp does not implement its task-running system itself; it uses Orchestrator directly. In the gulp source code you can see that Gulp inherits from Orchestrator, and gulp.task is simply an alias for Orchestrator.add:
// gulp source code
var util = require('util');
var Orchestrator = require('orchestrator');

function Gulp() {
  Orchestrator.call(this);
}
util.inherits(Gulp, Orchestrator);

Gulp.prototype.task = Gulp.prototype.add;
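A minimal sketch of Orchestrator used on its own (the task names are made up), showing the add/start API that gulp builds on:

var Orchestrator = require('orchestrator');
var orchestrator = new Orchestrator();

// add(name, [deps], fn) is exactly what gulp.task aliases
orchestrator.add('thing1', function (callback) {
  callback(); // signal completion
});
orchestrator.add('thing2', ['thing1'], function (callback) {
  callback();
});

// dependencies run first (in parallel where possible), then 'thing2'
orchestrator.start('thing2', function (err) {
  if (err) { console.error(err); }
});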
11. Gulp's API
gulp.task
gulp.run
gulp.watch
gulp.src
gulp.dest
gulp.task
Orchestrator provides three ways for a task to signal completion and thus resolve task dependencies:
Return a stream from the task's function; the task ends when the stream's end event fires.
Return a Promise object from the task's function; the task ends when the Promise resolves.
Accept a callback parameter in the task's function; the task ends when callback() is invoked.
A gulp script can use any of these three methods to express task dependencies, but since most gulp tasks operate on data streams, the first approach dominates. The sketch below shows all three.
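A sketch of the three completion styles; Q is used only as one possible promise library, chosen for illustration (any then-able would do):

var gulp = require('gulp');
var Q = require('q'); // illustrative choice of promise library

// 1. Return a stream: the task ends when the stream emits 'end'
gulp.task('stream-task', function () {
  return gulp.src('src/*.js').pipe(gulp.dest('dist'));
});

// 2. Return a promise: the task ends when the promise resolves
gulp.task('promise-task', function () {
  var deferred = Q.defer();
  setTimeout(function () { deferred.resolve(); }, 100);
  return deferred.promise;
});

// 3. Take a callback: the task ends when callback() is invoked
gulp.task('callback-task', function (callback) {
  setTimeout(callback, 100);
});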
12. Gulp plugin development
All gulp.js plugins are essentially through streams (the word "transform" is no longer used): each one is both a consumer, receiving the data handed down from gulp.src() and processing it, and a producer, passing the processed data on. Using gulp.js and developing plugins for it are both very simple; of course there are plenty of details, for which see the official gulp.js documentation.
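A minimal sketch of such a plugin, modeled on the pattern in gulp's plugin documentation (the name gulpPrefixer and the prefixing behavior are illustrative only):

var through = require('through2');

module.exports = function gulpPrefixer(prefixText) {
  // through.obj() creates an object-mode stream: each chunk is a vinyl file
  return through.obj(function (file, enc, callback) {
    if (file.isBuffer()) {
      // prepend the prefix to the file's contents
      file.contents = Buffer.concat([new Buffer(prefixText), file.contents]);
    }
    // pass the file along so downstream plugins (or gulp.dest) receive it
    callback(null, file);
  });
};

In a gulpfile this would be used as gulp.src('src/*.js').pipe(gulpPrefixer('/* banner */')).pipe(gulp.dest('dist')).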
Published January 16, 2015. Original address: 1190000002491282
Front-end engineering build tools compared: Gulp vs. Grunt