A docker-based Jenkins pipeline workflow.


1. In our project's traditional development model, the product team raises requirements, the development team researches them and decides on a development plan, and a development cycle begins. After each module is finished, the modules are integrated, the result is packaged and handed over to the test team, which runs system tests or automated tests and files bugs; the development team fixes the bugs, and the cycle repeats.

2. The traditional model carries many uncertainties: differences between the development, compilation, test, and production environments; the uncertainty introduced by manual packaging; and the lack of unit tests and automated integration tests. As a result, the develop-test-fix cycle is long, even though many of the small problems could have been caught entirely by unit tests.

Continuous delivery is not a specific piece of software but an outcome. It requires the team to be able to release a correct new version at any time, and it requires automated testing during the build-and-release process, so that bugs are detected and located promptly and fixed before the build is quickly released to the test environment, where the test team can test it directly. The difference from the traditional model is that continuous delivery finds bugs earlier and fixes them quickly, without waiting for the testers to get involved. Continuous delivery breaks down into "continuous" and "delivery". Continuous: a release can be produced accurately whenever it is needed; achieving that requires attention to the following key points.

"Continuous" should be periodic: the trigger can be a fixed time each day, a code commit, or a manual action. Builds therefore cannot be done by hand; they must be automated, and every step of the automated build must run as a script: code checkout, code build, per-module unit tests, integration tests, UI automation tests, and so on.

A released version must not be delivered as modules compiled individually in developers' own environments; it requires a clean build environment.

The build process should be pinned down as much as possible: the operating-system version, the build-environment versions, the dependencies, and so on.

Avoid fetching files from the network during the build. This is especially important for projects developed or compiled with Node.js: installing node dependency packages is always a long process, and even with a domestic mirror a typical project still needs a couple of minutes for its node dependencies, which is at odds with fast builds.

Delivery: automation during continuous builds already avoids most errors. However, manual system testing is still needed; after all, automated tests can only cover about 70% of cases.

Based on how well this approach has worked within our team, continuous integration really does make our work much easier: every code build and automated test run lets us spot bugs in time. A good working pattern also needs to be upheld by the team members, who should actively embrace this way of working and do the following.

Use a version-control tool such as Git. Git has powerful history and rollback capabilities; members should commit each time a small feature point is completed and merge into the master branch, and the continuous-delivery tool should be configured to trigger on code updates. A team member should wait until the continuous-delivery run finishes and verify that compilation and the automated tests pass before submitting the next change, so that bugs are easy to locate and do not affect the work of other team members.

Bugs in the main branch should not be left for long; otherwise other issues will be introduced when other members merge their code.

Practice test-driven development: write the unit-test script before developing any new feature, and keep the automated test scripts up to date. Embrace the tests; even if you know a failing test does not point to a large systemic problem, fix it rather than trying to skip the automated test.

Do not commit code right before leaving work, mainly so that you can still comply with the second point above (a bug introduced then would sit in the main branch overnight).

Docker has become increasingly popular, and CI/CD and DevOps are important scenarios for it. Using Docker in continuous delivery has several advantages.

Docker's strong environment isolation lets the environment and the program be packaged, tested, and run together. Operations staff do not need to know how our program is configured; a single docker command is enough to run it, which makes continuous deployment easier.

It reduces pollution of the build environment. Because of Docker's natural isolation, it also avoids the traditional difficulty of configuring several build environments on one machine. With Docker-based continuous delivery we can build different versions of a Java project and different versions of a Python project on the same host without any extra configuration; the images are simply pulled from Docker Hub.
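As a rough sketch of what this looks like in a pipeline (the image tags and build commands below are only examples):

    node {
        // build a Java project with one toolchain image...
        docker.image('maven:3.6-jdk-8').inside {
            sh 'mvn -B -DskipTests package'
        }
        // ...and a Python project with another, on the same host
        docker.image('python:3.7').inside {
            sh 'python -m pytest'
        }
    }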

Continuous integration

For continuous integration we chose Jenkins. Jenkins is open-source software with a wealth of excellent plugins; relying on them, we can automate many periodic, tedious, and complex tasks, such as the continuous releases we are sharing today. But although Jenkins solves those tedious, complex, periodic operations, it does not by itself address the need to build in multiple environments, and that is exactly where Docker's strengths lie. Through a Jenkins pipeline we can automate code checkout, unit testing, compilation, building, publishing, and testing, and finally use the Jenkins Docker plugin to produce an image as the output, which makes deployment to a Docker environment easy.

Continuous deployment

Continuous integration turns our new code into images that have gone through unit testing and automated testing but have not yet been rigorously tested by the test team. Jenkins is a powerful continuous-integration tool, but continuous deployment is not its strength, even with its many powerful plugins. Since the output of our continuous integration is an image, continuous deployment only requires running that image, or calling the API of a third-party container-management platform to deploy it.

Deploying locally to Docker: deployment to local Docker containers can use Jenkins's Docker plugin, as described below.

Deploying to a remote host running Docker or AppSoar: both Docker and AppSoar support open API calls, so with those APIs we can run the image version we just built and continuously deploy the latest version.

Deploying to Kubernetes: deployments can be created or updated either by configuring kubectl in Jenkins or through API calls.
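A minimal sketch of the kubectl approach, assuming kubectl and a kubeconfig are available on the Jenkins node and that a deployment named demo already exists (the names and the registry address are placeholders):

    node {
        stage('Deploy to Kubernetes') {
            // update the running deployment to the image just built
            sh "kubectl set image deployment/demo demo=registry.example.com/demo:${env.BUILD_NUMBER}"
        }
    }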

Running Jenkins: it is easy to deploy Jenkins itself with Docker; below we show how to run Jenkins in a container.
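A typical command for this, assuming the official jenkins/jenkins:lts image and the standard docker paths on the host (adjust them for your environment):

    docker run -d --name jenkins \
      -p 8080:8080 -p 50000:50000 \
      -u root \
      -v /var/run/docker.sock:/var/run/docker.sock \
      -v /usr/bin/docker:/usr/bin/docker \
      -v /var/jenkins_home:/var/jenkins_home \
      jenkins/jenkins:lts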


Here, docker.sock and the docker executable are mounted into the Jenkins container so that we can use Docker from inside the container.

The default user in the Jenkins container is jenkins; because we need to use Docker, we run as the root user instead.

Mounting /var/jenkins_home is optional. jenkins_home stores all the tasks, logs, credentials, plugins, and other files produced while Jenkins runs, and can be used for data recovery.

Configuring Jenkins

1. Unlock Jenkins: the unlock password can be read from the container's log or directly from the file under jenkins_home, for example:
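With the container started as above (container name jenkins, jenkins_home mounted at /var/jenkins_home on the host):

    # the password is printed in the startup log
    docker logs jenkins
    # or read it from the file under jenkins_home
    cat /var/jenkins_home/secrets/initialAdminPassword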


2. Select plugins.


Creating a pipeline: below we create a Jenkins pipeline that completes a simple CI/CD process.

Create a new task and select Pipeline as the type.

Under Credentials on the left, add credentials for the Git repository and the image registry.

Configure the pipeline's triggers, such as a timed trigger, a code-update trigger, or a webhook trigger.

In the Pipeline script field, fill in the following demo.

What follows is pseudo-code; it only illustrates the idea.
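A minimal sketch of such a script in scripted-pipeline syntax, using the Git and Docker Pipeline plugins; the repository URL, credential IDs, image names, and ./script/build.sh are placeholders to adapt:

    node {
        stage('Checkout') {
            // pull the latest code into the Jenkins workspace
            git url: 'https://example.com/demo.git', branch: 'master', credentialsId: 'git-cred'
        }
        stage('Build & Test') {
            // compile and run unit tests inside a build container;
            // the workspace is mounted into the container
            docker.image('golang:1.12').inside {
                sh './script/build.sh'
            }
        }
        stage('Build & Push Image') {
            // the Dockerfile lives in the code directory
            def image = docker.build("registry.example.com/demo:${env.BUILD_NUMBER}")
            docker.withRegistry('https://registry.example.com', 'registry-cred') {
                image.push()
            }
        }
        stage('Deploy') {
            // remove the old container and start the new version
            sh 'docker rm -f demo || true'
            sh "docker run -d --name demo registry.example.com/demo:${env.BUILD_NUMBER}"
        }
    }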




The Jenkins pipeline scripting syntax is Groovy; the docker and git steps are capabilities provided by plugins. The code executes as follows:


The Git plugin pulls the latest code into the Jenkins workspace, for example /var/jenkins_home/workspace/pipelinedemo.

docker.image().inside is where our code is compiled; the corresponding log can be seen in the Jenkins console.

The Docker plugin provides the ability to build images; the Dockerfile is stored in the code directory. After the image is built it is pushed to the image registry (a private registry has to be configured separately).

Once the image is built, the old version can be removed and a new version started.


Readers familiar with Docker commands should find this easy to understand: when docker.image().inside starts, it mounts the current directory into the container and executes ./script/build.sh there, so we can run unit tests or builds using the environment that already exists in the container.
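Under the hood this is roughly equivalent to a docker run of the following shape (simplified; the image tag is an example, the workspace path is the one mentioned above):

    docker run --rm \
      -v /var/jenkins_home/workspace/pipelinedemo:/var/jenkins_home/workspace/pipelinedemo \
      -w /var/jenkins_home/workspace/pipelinedemo \
      golang:1.12 ./script/build.sh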

This simple example shows the convenience and power that combining Jenkins and Docker brings to CI/CD. All we need to prepare is a build script, and that script can use any environment and any version.

Pipeline Introduction

Jenkins tasks come in two major types.

Freestyle tasks are simply automated scripts, with shell as the script type. All the scripts run on one machine, the required environment has to be prepared in advance, and the configuration is not centralized and gets messy; but in general it is enough.

Pipeline is the task type introduced in Jenkins 2. It is based on Groovy scripts and combines the different parts of a build into a single pipeline through a series of stages, and steps allow asynchronous operations. Because it is Groovy-based it is more programmable and powerful, and the script can be stored in the source code, so script changes do not have to be made directly in Jenkins.

Some experience and tips for pipeline

There is not much documentation for Jenkins pipelines, and the official website does not have much to read, but the built-in Pipeline Syntax page in Jenkins has a snippet generator for common commands that can meet most requirements.

Once the pipeline script has been debugged, it should be stored as a file in the source directory, which makes it easy to modify, and multiple branches that need to be built are then isolated from each other.

Instead of solving every problem by running a script with sh, look for the appropriate plugin.

The jenkins_home directory should be mounted out of the container so that data can be recovered promptly if Jenkins crashes.

A scheduled pipeline should be created to clean up the generated images and reduce disk usage, as sketched below.
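A minimal sketch of such a cleanup pipeline (the schedule and the retention filter are only examples):

    // run every night and remove unused images older than a week to free disk space
    properties([pipelineTriggers([cron('H 3 * * *')])])

    node {
        stage('Clean images') {
            sh 'docker image prune -af --filter "until=168h"'
        }
    }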

When a pipeline created in the web page is deleted from the page, the corresponding project files under jenkins_home/workspace are not deleted.

Q: How does Kubernetes work with Jenkins for continuous integration?

A: Deploy to Kubernetes. Deployments can be created or updated either by configuring kubectl in Jenkins or through API calls.

Q: Does a Jenkins pipeline have to build the code into a Docker image?

A: Not necessarily. Docker is used mainly for convenient isolation of the build environment. You can also configure NFS and copy the output to a fixed server after the build completes; we generally call that the artifact repository.

Q: Docker's official private registry currently does not provide image deletion. How do you manage image versions?

A: AppHouse is our company's image-registry product based on the Docker registry; we have extended it with deletion, replication, and other functions. If you are interested, you can get AppHouse from our company's website.

Q: How can a pipeline deploy Docker containers to different nodes? How do you handle rolling back to a previous version after a release?

A: As mentioned earlier, Jenkins is better suited to continuous integration; deployment and rollback are not its strengths, and rollback in particular is hard to solve perfectly with Jenkins alone. However, deployment to different Docker nodes can be done through a third-party management platform, such as the APIs provided by AppSoar or Kubernetes; Jenkins simply runs curl to call the API provided by the container-management platform.

Q: How do you quickly get a report for each stage of the pipeline, such as static code checks, builds, and test reports?

A: http://jenkins:8080/job/clearimages/86/wfapi/

Through the Jenkins API you can get status and timing information. For detailed static code checks, each language has its own checkers that you need to configure yourself; for the details you still need to look at the output log.

Q: How do I trigger a workflow?

A: A Jenkins pipeline offers three ways (plugins with SCM support may add others), all found on the pipeline's settings page: a webhook ("Trigger builds remotely", for example from a script), a timed trigger ("Build periodically"), and a code-update trigger ("Poll SCM").
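Besides the settings page, the same triggers can also be declared from the pipeline script itself, for example:

    // example trigger declarations in a scripted pipeline
    properties([pipelineTriggers([
        cron('H 2 * * *'),      // build periodically
        pollSCM('H/5 * * * *')  // poll SCM for code updates
    ])])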

Q: How do you handle the Jenkins build environment, given that users' build requirements and environments differ?

A: Users need to know the basics of the build environment they use, for example for a Go build environment: where GOPATH is inside the container, and where to ln their code so that it sits in the directory to be compiled. You need to study how to use the build environment yourself; a sketch follows.
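For example, a build script run inside a golang image might look roughly like this (the import path example.com/demo is a placeholder):

    #!/bin/sh
    # link the workspace into GOPATH, then test and build
    set -e
    mkdir -p "$GOPATH/src/example.com"
    ln -sfn "$PWD" "$GOPATH/src/example.com/demo"
    cd "$GOPATH/src/example.com/demo"
    go test ./...
    go build -o bin/demo .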

Q: Regarding user rights management in Jenkins: how does your company's CI/CD achieve user isolation, so that each user only sees their own projects?

A: Jenkins itself does not provide this kind of user permission. In our company's product there is a virtual concept called a user group, which corresponds to one or more namespaces in Kubernetes. Administrators add member users to a user group, and the resources created by members of the group (pipelines, clusters, services, and so on) are visible within the group. User groups provide logical isolation.


Q: How do Jenkins and Kubernetes work together in your company? What does the deployment look like?

A: I see many friends asking how Jenkins deploys across hosts or to Kubernetes clusters, and how to roll back. Jenkins is weak in this area and can only call kube-apiserver, and it is hard to meet the requirement with Jenkins alone, so we have a module dedicated to deploying to Kubernetes, an App Store module, and a uflow module that encapsulates Jenkins. The uflow module gets a template from the store, replaces the image tag in the template with the one built by the current build, and hands it to the deploy module for creation. Rollback and upgrade are the responsibility of the deploy module. This keeps the modules separate, each doing its own job.





Content shared from the PTZ Docker technology group.
