Automating website deployment with GitHub webhooks
Reposted from my original blog: using GitHub webhooks to automate site deployment
I write my blog with MWeb, but instead of serving it straight from GitHub's gh-pages hosting, I deploy it to my own server.
Publishing has therefore been a three-step routine: 1. use MWeb to generate the static site; 2. push it to GitHub; 3. log in to the server and pull the latest content.
Yesterday it occurred to me that this could be simplified: once an article is pushed to GitHub, the server should pull it and deploy the new content automatically. Say it, then do it.
1. Goal
The server automatically pulls and deploys whenever a new article is pushed to GitHub.
2. Ideas
Idea one: periodically check the repo's latest commits and, when there is an update, start the deployment process. (Active polling)
Idea two: find out whether GitHub supports event notification, either natively or through a third-party service. (Passive wake-up, i.e. message push)
3. Weighing the two
Active polling wastes CPU time and bandwidth, and inevitably leaves a synchronization gap between the server and GitHub.
Passive wake-up consumes no unnecessary resources; if it is supported, it is clearly the first choice.
4. Research (feasibility analysis)
GitHub supports webhooks and a large number of third-party services, which can respond nicely to pushes and other repo events.
What do webhooks do?
When an event happens on a repo, GitHub sends a POST request describing that event to a URL you specify.
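It helps to know roughly what that POST body contains. Below is a heavily abbreviated sketch of a push event payload, written as a JavaScript object; the values are invented for illustration, and only the fields the listener later in this post actually reads are shown.

// Abbreviated sketch of the JSON body GitHub POSTs for a push event.
// Values are hypothetical; only payload.ref and payload.repository.name
// (the fields referenced later in deploy.js) are shown here.
var examplePushPayload = {
  ref: 'refs/heads/gh-pages',   // the branch that was pushed
  repository: {
    name: 'my-blog'             // hypothetical repository name
  }
}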
5. Implementation plan (summary)
Register a webhook on the repo that points at an endpoint on my server; the server parses the payload and, if the event is a push, runs the deployment.
6. Implementation
6.1 The deployment script:
deploy.sh
#!/bin/bash
log_file="/var/log/blog_deploy.log"
date >> "$log_file"
echo "Start Deployment" >> "$log_file"
cd /path/need/be/deployed/
echo "Pulling source code ..." >> "$log_file"
git checkout gh-pages
git pull origin gh-pages
echo "finished." >> "$log_file"
echo >> "$log_file"
Execute the above script whenever a POST request for a push event arrives.
6.2 Handling the POST request
Note: the following Node.js material is excerpted from Zeng's blog, Dust Settles.
Next we need a script that accepts the POST request at http://dev.lovelucy.info/incoming. Since Node is already running on my machine, I found a Node.js middleware called github-webhook-handler. If you are deploying a PHP site, look for an equivalent in the world's best language, PHP, or write one yourself; all it has to do is read $_POST properly, nothing fancy. (｀ω´)
$ npm install -g github-webhook-handler
Given how painfully slowly npm pulls packages from inside China, we can use Alibaba's mirror instead, which is said to sync with the official registry every 10 minutes. (:3」∠)
$ npm install -g cnpm --registry=http://r.cnpmjs.org
$ cnpm install -g github-webhook-handler
OK, everything is in place. Here is the Node.js listener, deploy.js:
var http = require('http')
var createHandler = require('github-webhook-handler')
var handler = createHandler({ path: '/incoming', secret: 'myhashsecret' })
// The secret above must stay consistent with the one set in the GitHub webhook settings.

// Run a shell command and pass its stdout to the callback when it finishes.
function run_cmd(cmd, args, callback) {
  var spawn = require('child_process').spawn;
  var child = spawn(cmd, args);
  var resp = "";

  child.stdout.on('data', function (buffer) { resp += buffer.toString(); });
  child.stdout.on('end', function () { callback(resp) });
}

http.createServer(function (req, res) {
  handler(req, res, function (err) {
    res.statusCode = 404
    res.end('no such location')
  })
}).listen(7777)

handler.on('error', function (err) {
  console.error('Error:', err.message)
})

handler.on('push', function (event) {
  console.log('Received a push event for %s to %s',
    event.payload.repository.name,
    event.payload.ref)
  run_cmd('sh', ['./deploy.sh'], function (text) { console.log(text) })
})

/*
handler.on('issues', function (event) {
  console.log('Received an issue event for %s action=%s: #%d %s',
    event.payload.repository.name,
    event.payload.action,
    event.payload.issue.number,
    event.payload.issue.title)
})
*/
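A small refinement of my own, not part of Zeng's original code: deploy.sh only pulls gh-pages, but the push handler above fires on a push to any branch. If that bothers you, the handler can check event.payload.ref first. A minimal sketch, replacing the push handler above:

handler.on('push', function (event) {
  // Ignore pushes to branches other than gh-pages (my own addition).
  if (event.payload.ref !== 'refs/heads/gh-pages') return
  run_cmd('sh', ['./deploy.sh'], function (text) { console.log(text) })
})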
Then you can run the server.
$ nodejs deploy.js
To keep the service from dying when the shell closes, there are many options; I chose the system's own nohup.
$ nohup nodejs deploy.js &
Zeng used Node.js's forever, or Python's supervisor, for this instead.
His blog, Dust Settles, has a related write-up.
6.3 Configuring the webhook
Point the Payload URL at the endpoint on your own server. On GitHub this is done under the repo's Settings → Webhooks; with the setup above the Payload URL would be something like http://your-server-ip:7777/incoming (hypothetical address), the Content type should be application/json, and the Secret should be the same value used in deploy.js.
var handler = createHandler({ path: '/incoming', secret: 'myhashsecret' })

http.createServer(function (req, res) {
  handler(req, res, function (err) {
    res.statusCode = 404
    res.end('no such location')
  })
}).listen(7777)
This is the key code for deploy.js.
listen(7777) means the server listens on port 7777.
path: '/incoming' means POST requests are received at ip:7777/incoming.
secret: 'myhashsecret' must match the Secret field configured on GitHub, otherwise the server rejects the request because the signatures do not match. It is mainly there to keep third parties from sending their own requests to this port.
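For the curious, this is roughly what that check amounts to. GitHub signs each delivery by putting an HMAC-SHA1 of the raw request body, keyed with the secret, into the X-Hub-Signature header, and github-webhook-handler recomputes and compares it on arrival. A minimal sketch of the idea, my own illustration rather than the library's actual code (which also uses a timing-safe comparison):

var crypto = require('crypto')

// Returns true when the signature GitHub sent matches the one we compute
// ourselves from the raw body and our local secret.
function signatureMatches(secret, rawBody, signatureHeader) {
  var expected = 'sha1=' + crypto.createHmac('sha1', secret)
                                 .update(rawBody)
                                 .digest('hex')
  return expected === signatureHeader   // header looks like "sha1=<hex digest>"
}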
7. Final recap
6.3 is the one who knows when someone has pushed an article, and it tells 6.2.
6.2 takes the message from 6.3, checks whether your password (the secret) matches mine, and if it does, passes the news on to 6.1.
6.1 then runs off to GitHub, pulls down the latest data, and the deployment is done.
Clarification
A friend told me that this post is largely copy-and-paste, and that even with a reprint notice that is not great.
Let me clarify:
Technical articles on the web tend to be: plentiful, miscellaneous, exhaustive.
Classic examples you can actually use: few.
Producing an example from scratch: time-consuming.
A fully original, high-quality article requires: thinking + an original example + writing it all up + iterating on the three.
But for a learner, thinking + an example + the line of reasoning already covers 80% of the need.
So I believe a technical article worth learning from does not have to be entirely original.
A clear line of thought + a classic example provided by a predecessor + personal reflection, communicated well, is enough.
Of course, if a predecessor has already summed something up well and you simply repost it verbatim, that is nothing but creating internet garbage.
Building a blog with clear ideas, one that focuses on collecting, reorganizing, and rewriting technical articles, is the whole point of this blog. I am not a producer of garbage; I am nature's scavenger.
You are welcome to follow my personal Weibo for technical and non-technical exchange alike.