Example of using Node.js to implement a simple Webhook


Node.js has been around for quite a while now, and it feels like a front-end developer who has never dabbled in it is a little behind the times. I first got in touch with it during last year's summer vacation, picking it up along the way while writing something else. Although the npm ecosystem gets mixed reviews online, after using it for this long I still find Node.js a good tool. This article has two parts: the first half introduces Node.js, and the second half is a small project written with it: a simple WebHook.

Although this is an introductory piece, you should still be familiar with basic JavaScript syntax, its asynchronous model, and some database queries and command-line operations. In addition, the example below uses Coding, so you should also understand its basic operations.

What is Node.js?
If you use the Chrome browser, you have probably noticed that it feels faster than other browsers; one reason is that Chrome has an engine called V8 that executes JavaScript very efficiently. In fact, the author of Node.js originally planned to write a local runtime platform in Ruby, but found that Ruby did not perform well enough, so he turned to the V8 engine, made many modifications, and Node.js was born.

So what is Node.js? To put it bluntly, it is essentially a local JavaScript runtime. Strictly speaking, "interpreter" is not quite right, because V8 compiles JavaScript into native machine code (IA-32, x86-64, ARM, MIPS, etc.) and uses techniques such as inline caching to improve performance. It is said that with V8's help, JavaScript's execution speed approaches that of native binary programs. Compared with bare V8, though, Node.js offers more capabilities, such as direct file system access and binary data handling.

When people hear about Node.js, many assume it is only for writing servers. Think a bit more broadly: didn't I just say it can directly access the file system and process binary data? That means you can write all kinds of local tools in JavaScript. The most famous are the front-end build tools: Webpack, Gulp, Grunt... Speaking of which, let me insert a little front-end story.

A long time ago in a galaxy far, far away...

Front-end work meant nothing more than HTML, CSS, and JavaScript. Page styling and interaction were not as complex as they are now, so you only needed basic styling and simple data handling.

As time went by...

Increasingly complex pages appeared one after another, along with large frameworks such as Angular and React. To speed up page loading, front-end developers have to concatenate and compress all their files before publishing, saving both traffic and requests.

Any of the three tools mentioned above can meet this need. Once configured, a single grunt build on the command line concatenates the scattered code files, compresses them, and even compresses the images; a gulp serve starts a small local server so you can preview the result.
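To give a rough idea of what such a configuration looks like, here is a minimal gulpfile sketch of my own, assuming gulp 3 with the gulp-concat and gulp-uglify plugins installed; it is an illustration only, not part of this article's project:

// gulpfile.js -- hypothetical example, not part of this article's project
var gulp = require('gulp');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');

// Concatenate all source scripts into one file, minify it, and write it to dist/
gulp.task('build', function () {
    return gulp.src('src/**/*.js')
        .pipe(concat('all.js'))
        .pipe(uglify())
        .pipe(gulp.dest('dist'));
});

Running gulp build on the command line then produces dist/all.js.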

A good helper for Node.js: NPM
In fact, Node.js programmers seldom type node commands directly; the most frequently used command is npm. So what is NPM? To answer that, we need two concepts: package and dependency.

If you have used Linux, these two concepts will be familiar. For example, to install Ruby you must first install libreadline and libruby, because Ruby depends on them to run. Why does the dependency concept seem absent on Windows? Because Windows installers bundle their dependencies and install them automatically, although there are exceptions: to run a large game, you often have to install the VC++ runtime and the DirectX runtime first.

Remember the "use grunt build to compress images" mentioned earlier? The compression itself is not done by Grunt but by a tool called imagemin. To install it by hand, you could download its code from GitHub, then download the code of the 36 projects it depends on (gulp-imagemin, node-atlas, cropshop...), and then the dependencies of those projects...

Ah!

Fortunately, we have NPM. We only need npm install -g imagemin, and NPM will read imagemin's dependencies from the configured registry (the official one by default), then the dependencies of those dependencies..., work out an installation order by topological sorting, and automatically install everything you need. If -g is included in the command, it is a global installation, and the tool can then be run just like a native command-line tool. You can also remove it all with a single command.

A tool is a package. The full name of NPM is Node Package Manager.

Write a Node.js program
A while ago, one of my teams had a Webhook written in PHP, but when the network was slow and execution took too long, PHP would forcibly kill it. Of course, one workaround is this: the web side only receives Webhook requests and saves them to the database, and a separate back-end daemon keeps polling the database to see whether any project needs a pull/deploy. However, JavaScript is built around a single-threaded event queue and can react to events in real time without hogging resources, so I decided to try writing a Webhook program with Node.js.

My requirements are simple: the configuration of every project added to the Webhook lives in a configuration file, the Webhook's run records are stored in a database, and the web side listens on a specific port and only needs to expose a few APIs.

First, create a project directory, run npm init, fill in the requested information, and a package.json file is generated. Note that our program is started with node index.js; you can bind that to a command, npm start. In fact, we can define even more commands for npm, as sketched below.
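For reference, a minimal package.json along these lines might look like the sketch below; the project name is hypothetical, and the exact fields depend on what you entered in npm init. Entries under "scripts" other than start are run with npm run <name>.

{
  "name": "simple-webhook",
  "version": "1.0.0",
  "description": "A simple WebHook for Coding",
  "scripts": {
    "start": "node index.js"
  }
}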

Then we can start writing the project in this directory! The configuration file is easy to write:

// Listening port
var port = 9091;
// Project configuration
var projects = {
    mall: {
        path: '/data/www',
        url: 'git@git.coding.net:Click_04/mall.git'
    },
    lib: {
        path: '/store',
        url: 'git@git.coding.net:Click_04/lib.git'
    }
    // More projects...
};
// Database configuration
var db = {
    host: 'localhost',
    user: 'root',
    password: 'root',
    database: 'webhook'
};
module.exports = {
    projects: projects,
    port: port,
    db: db
};
module.exports is part of Node.js's module system; Node.js largely follows the CommonJS standard. However, that is beyond the scope of this article.

So how can we write a server that listens on a port? It is actually very simple, because Node.js ships with the http module; we only need to do this:

var http = require('http');
var config = require('./config.js');
var server = http.createServer(function (req, res) {
    // Receive POST data. If the request method is not POST, the variable stays an empty string.
    var POST = '';
    req.on('data', function (chunk) { POST += chunk; });
    req.on('end', function () {
        // Back-end logic goes here
    });
});
server.listen(config.port);
console.log('Server running at port: ' + config.port + '.');
The callback passed to http.createServer is what runs after the server is created. The http model keeps a single thread and listens for various events on req: a data event means data is arriving, and the end event means the current request's data has been fully received. The data here is POST data; things like headers live directly on the req variable (try console.log(req) to dump it to the terminal). We can then send data back through the methods provided by res.
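As a tiny illustration of my own, reusing the server skeleton above, the callback can inspect the request and reply right away:

var http = require('http');
var server = http.createServer(function (req, res) {
    // Request line and headers are available directly on req
    console.log(req.method, req.url);
    console.log(req.headers['user-agent']);
    // res is used to send data back
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('hello');
});
server.listen(9091);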

The next question is how to connect to the database. Node.js does not ship with a MySQL driver, so we have to install one manually:

npm install --save mysql
The --save option adds this library to package.json, so that anyone who gets the code can simply run npm install to pull in all dependencies. The mysql package is used like this:

var mysql = require('mysql');
// config is the config module above
var pool = mysql.createPool(config.db);
pool.getConnection(function (err, conn) {
    if (err) throw err;
    // conn can now be used to run queries
});
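Inside that callback, conn behaves like an ordinary connection. Here is a small sketch of my own, assuming the mysql package above and a log table like the one used later in this article:

pool.getConnection(function (err, conn) {
    if (err) throw err;
    // Run a parameterized query, then return the connection to the pool
    conn.query('SELECT * FROM `log` WHERE `project_name` = ?', ['mall'], function (err, results) {
        if (err) throw err;
        console.log(results.length + ' rows');
        conn.release();
    });
});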
The last problem is how to run git commands on the command line. Node.js also ships with this:

var exec = require('child_process').exec;
// cmd_str is the command string to run
exec(cmd_str, function (err, stdout, stderr) {
    var status = err ? -1 : 1,
        cmd_result = err ? stderr : stdout;
    // The error, standard output, and standard error are all available here; handle them as needed.
});
All the technical pieces are in place, so now we can put them together.

First, I analyzed the data sent by the Coding Webhook. It is a JSON string. If the zen attribute exists, it is a test request; if the commits attribute exists, it is a normal push. Based on the JSON structure, we can pull out the data we need and insert it into the database:

data = (POST === '') ? {} : JSON.parse(POST);
if (data.commits) {
    // Extract the data we need
    var project_name = data.repository.name,
        trigger_user = data.user.global_key,
        commit_user = data.commits[0].committer.name,
        commit_user_email = data.commits[0].committer.email,
        commit_message = data.commits[0].short_message;
    if (!config.projects[project_name]) {
        return;
    }
    // Insert a log row
    conn.query('INSERT INTO `log` (`project_name`, `trigger_user`, `commit_user`, `commit_user_email`, `commit_message`) VALUES (?, ?, ?, ?, ?)',
        [project_name, trigger_user, commit_user, commit_user_email, commit_message],
        function (err, results) {
            if (err) throw err;
            // Build the git command string
            var cmd_str = 'cd ' + config.projects[project_name].path + '/' + project_name + ' && git pull origin master',
                log_id = results.insertId;
            // Run the command
            exec(cmd_str, function (err, stdout, stderr) {
                var status = err ? -1 : 1,
                    cmd_result = err ? stderr : stdout;
                // Update the log row with the result
                conn.query('UPDATE `log` SET `status` = ?, `cmd_result` = ? WHERE `log_id` = ?', [status, cmd_result, log_id], function (err, results) {
                    // Finish writing the response
                    res.end();
                });
            });
        });
}
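For reference, the incoming data assumed by the code above looks roughly like this, reduced to only the fields the code reads; this is an illustrative sketch with made-up values, not Coding's complete payload:

{
  "repository": { "name": "mall" },
  "user": { "global_key": "Click_04" },
  "commits": [
    {
      "committer": { "name": "someone", "email": "someone@example.com" },
      "short_message": "fix typo"
    }
  ]
}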
JSON.parse is a built-in JavaScript method that converts a JSON string into an object. Now we only need to set the Webhook address on Coding to http://ip:9091/ (or forward a port through Nginx or another program), and we can see the Webhook in action!

Most of the code is easy to follow; only res.end is a little awkward. In most languages, once execution finishes, the runtime automatically stops writing to the response body and tells the browser "I'm done, stop waiting", but Node.js's http module does not: you must call res.end() yourself, and if you don't, the browser keeps waiting. Node.js frameworks such as Express let you focus on back-end logic without worrying about such details.
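For instance, here is a minimal sketch of my own with Express (assuming Express 4, installed with npm install --save express), where res.json both writes and ends the response for you:

// Minimal Express sketch -- illustration only, not part of this article's project
var express = require('express');
var config = require('./config.js');
var app = express();

// res.json() serializes the object and ends the response automatically
app.get('/project', function (req, res) {
    res.json(config.projects);
});

app.listen(config.port);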

Notice that query -> exec -> query gives us three levels of nested callbacks. This is a notorious pitfall of JavaScript. We could switch to Promises; it would not change much functionally, but it does make the code more pleasant to write. How to structure programs around asynchronous thinking is an interesting (and sometimes headache-inducing) topic, and there are many ways to avoid callback hell, but this project is small enough that none of them is really necessary.
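As a sketch of the Promise alternative (my own rewrite, assuming a Node version with built-in Promises, 0.12 or later, and reusing conn, exec, cmd_str, project_name, and res from the code above), the chain reads top to bottom instead of nesting:

// Hypothetical helpers that wrap conn.query and exec in Promises
function queryAsync(sql, values) {
    return new Promise(function (resolve, reject) {
        conn.query(sql, values, function (err, results) {
            if (err) { reject(err); } else { resolve(results); }
        });
    });
}
function execAsync(cmd) {
    return new Promise(function (resolve) {
        exec(cmd, function (err, stdout, stderr) {
            resolve({ status: err ? -1 : 1, output: err ? stderr : stdout });
        });
    });
}

// The query -> exec -> query chain, flattened
queryAsync('INSERT INTO `log` (`project_name`) VALUES (?)', [project_name])
    .then(function (results) {
        return execAsync(cmd_str).then(function (r) {
            return queryAsync('UPDATE `log` SET `status` = ?, `cmd_result` = ? WHERE `log_id` = ?',
                [r.status, r.output, results.insertId]);
        });
    })
    .then(function () { res.end(); })
    .catch(function (err) { console.error(err); res.end(); });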

In fact, this is already enough for a Webhook, but I wanted to do a bit more: show the log directly on a web page, and list all the projects that have been added to the Webhook. We can extend the if statement in the previous code with an else branch:

else {
    // Handle GET requests, or POST requests with an empty body
    res.writeHeader(200, {'Content-Type': 'application/json'});
    // Try to determine the request type from the URL
    var match = '';
    // Show the log
    if (req.url === '/log') {
        conn.query('SELECT * FROM `log` ORDER BY `log_id` DESC LIMIT 30', [], function (err, results) {
            if (err) throw err;
            res.write(JSON.stringify(results));
            res.end();
        });
    }
    // Show all projects added to the Webhook
    else if (req.url === '/project') {
        res.write(JSON.stringify(config.projects));
        res.end();
    }
    // Manually pull/clone a project
    else if (match = req.url.match(/\/(pull|clone)\/(.+)/i)) {
        if (!config.projects[match[2]]) {
            res.end();
            return;
        }
        conn.query('INSERT INTO `log` (`project_name`) VALUES (?)', [match[2]], function (err, results) {
            if (err) throw err;
            var cmd_str = '';
            if (match[1] === 'clone') {
                cmd_str = 'cd ' + config.projects[match[2]].path + ' && git clone ' + config.projects[match[2]].url;
            }
            else if (match[1] === 'pull') {
                cmd_str = 'cd ' + config.projects[match[2]].path + '/' + match[2] + ' && git pull origin master';
            }
            var log_id = results.insertId;
            exec(cmd_str, function (err, stdout, stderr) {
                var status = err ? -1 : 1,
                    cmd_result = err ? stderr : stdout;
                conn.query('UPDATE `log` SET `status` = ?, `cmd_result` = ? WHERE `log_id` = ?', [status, cmd_result, log_id], function (err, results) {});
            });
        });
        res.end();
    }
}
We send the header with res.writeHeader and write a piece of text with res.write. JSON.stringify is a built-in JavaScript method that converts an object into a JSON string. Because a manual pull/clone is triggered by hand, only the project name is available and no commit information can be shown (it could be obtained through git commands, but that is troublesome); the automatic trigger above is a request from Coding, which carries the complete information.

Finally, I used supervisor to keep the Node.js process alive and Nginx for port forwarding, but those are beyond the scope of this article.

Let's look at the result: push to a project, or manually trigger a pull/clone, and then check the log on the server. For convenience, I wrote a page that requests the logs via AJAX and displays the data in a table (screenshot omitted).
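The page itself can be very small; here is a sketch of the idea (my own, assuming the /log endpoint above), using XMLHttpRequest to fetch the JSON and build a table:

<!-- log.html -- minimal viewer sketch, not the page from the article -->
<div id="log"></div>
<script>
var xhr = new XMLHttpRequest();
xhr.open('GET', '/log');
xhr.onload = function () {
    var rows = JSON.parse(xhr.responseText);
    var html = '<table><tr><th>project</th><th>commit</th><th>status</th></tr>';
    rows.forEach(function (row) {
        html += '<tr><td>' + row.project_name + '</td><td>' +
                (row.commit_message || '') + '</td><td>' + row.status + '</td></tr>';
    });
    html += '</table>';
    document.getElementById('log').innerHTML = html;
};
xhr.send();
</script>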
