1. Background
Agile development has been popular for a long time, and more and more enterprises are now practicing the people-centric, iterative, incremental development approach it advocates. The first goal of introducing Docker in this scenario is to use the virtualization Docker provides to create a reusable development environment for the team: the environment can be shared with every member of the project as an image, which simplifies environment setup. However, even before Docker appeared, technologies such as Vagrant's development-environment distribution already allowed developers to reproduce a similar environment configuration process, so in the development context the advantages of Docker are not fully brought into play. In my view, Docker's real advantage is that it simplifies the building of CI (continuous integration) and CD (continuous delivery) pipelines, allowing developers to devote more effort to development itself.
Each company has its own development technology stack, and we need to keep improving it in light of our actual situation to optimize our build process. Before taking the first step, we should establish a blueprint, so that we know where we are going and the work that follows can be carried out quickly.
[Figure: 0912000.png]
This sequence diagram outlines every aspect of the current agile development process. Based on the blueprint framework shown in the diagram above, the focus of this article is to share practical experience with introducing Docker technology at each stage.
2. Creating a continuous release team
When a development team introduces Docker, the biggest problem is that there is no industry standard to follow. We often treat "best practice" as a slogan and bring in a variety of tool chains, which leaves the Docker adoption without focus: it turns into endless tool selection around Docker, with a great deal of time spent learning tools rather than choosing the right ones to build a sustainable product development team. In this situation, "ease of use" should be the criterion for selecting the tools that accompany Docker adoption. When introducing Docker, the first thing the team needs to do is let every member master the Docker command line as soon as possible. After becoming familiar with the Docker CLI, the team needs to address several key issues:
1) Base image selection, such as phusion/baseimage
2) Selection of a configuration management tool for Docker images, such as Ansible, Chef, or Puppet
3) Host OS selection, such as CoreOS, Atomic, or Ubuntu
A base image contains the minimal set of operating-system command-line tools and libraries; once it is chosen, every application image is built on top of it. Ubuntu, the default in official examples, is the most readily available version, but it is not particularly optimized, so consider a third-party alternative such as phusion/baseimage. If you select a base image from the RHEL/CentOS branch, note that technologies such as the SELinux security framework and the device-mapper block-level storage driver are handled differently from the Ubuntu branch. Also keep in mind that different operating-system branches are trimmed down in completely different ways, so choose your operating system carefully.
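As a concrete illustration, here is a minimal sketch of building a shared team image on top of phusion/baseimage; the tag 0.9.18 and the image name myteam/app-base are assumptions for the example, not values from this article:

cat > Dockerfile <<'EOF'
# build the team's application base on top of phusion/baseimage (assumed tag)
FROM phusion/baseimage:0.9.18
# use baseimage's own init system as the entry point
CMD ["/sbin/my_init"]
# install whatever the team's applications need on top of the base
RUN apt-get update && apt-get install -y nginx \
    && rm -rf /var/lib/apt/lists/*
EOF
docker build -t myteam/app-base .
# share the image with the whole team through a registry (requires access to one)
docker push myteam/app-base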
A configuration management tool for Docker images is mainly used to manage the creation of images from a Dockerfile. We should take the team's current situation into account and pick a tool the team is already familiar with as the common tool. There are many options; among them Ansible, a rising star, is very simple and easy to use for configuration management and is recommended as a reference.
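For example, with Ansible a team could describe the image build declaratively. The following is only a sketch, assuming the docker_image module of your Ansible version accepts a Dockerfile path (module parameter names vary across versions) and reusing the hypothetical image name from the previous example:

cat > build-image.yml <<'EOF'
- hosts: localhost
  connection: local
  tasks:
    - name: build the team base image from the local Dockerfile
      docker_image:
        name: myteam/app-base   # hypothetical image name
        path: .                 # directory containing the Dockerfile
EOF
ansible-playbook build-image.yml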
The host system is the runtime environment for the Docker daemon. From the development point of view it is just an ordinary single-purpose OS: we deploy only the Docker daemon and cluster tools on it, so the lower its overhead the better. The host system recommended here is CoreOS, currently the leanest host OS. Red Hat's open-source Atomic host system, with branches based on Fedora, CentOS, and RHEL, is also a good candidate. Another option is to start from a minimal OS installation and customize the host system yourself; if your team has that capability, it is worth considering.
3. Continuous integration build system
When the development team pushes code to the Git repository, every developer surely wants a system that automatically deploys the application to the application server and saves unnecessary manual work. However, complex application deployment scenarios make this idea difficult to realize.
First, we need a Docker-enabled build system; Jenkins is recommended here. Its main strengths are that it is open source, easy to customize, and easy to use. Jenkins can easily install a variety of third-party plug-ins, making it quick and simple to integrate third-party applications.
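Jenkins itself can also be run as a container, which keeps the build system consistent with the rest of the toolchain. A minimal sketch, with the host path chosen here purely as an assumption:

# persist job configuration on the host under /data/jenkins_home (assumed path)
docker run -d --name jenkins \
    -p 8080:8080 \
    -v /data/jenkins_home:/var/jenkins_home \
    jenkins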
[Figure: 0912001.png]
With Jenkins's job trigger mechanism, we can easily create all kinds of integration jobs. However, without a uniform standard for how jobs are used, project jobs quickly become confusing and hard to maintain, which prevents the development team from enjoying the benefits of an integrated build system; this is not the result we expect. Agile practice therefore introduces the concept of the deployment pipeline for continuous delivery. With Docker technology, this approach becomes easy to understand and implement.
Jenkins's pipeline deployment visualizes the delivery process as a long pipeline: each segment is a node, that is, a job, and only when a job completes does the process move on to the next one. It looks like this:
[Figure: 0912002.png (image source: Google Image Search)]
As you can see, after introducing Docker technology each node on the board can be modularized with Docker: a purpose-built image is created for the task, and a container then runs exactly that task. Each task image can be created in the developer's own environment; a similar scenario is shown below:
[Figure: 0912003.png (image source: Google Image Search)]
So, after adopting Docker, the modularity of each task falls out naturally. The pipeline diagram shows the execution time of each step, and developers can define strict performance criteria for each task as needed, which then serve as a reference baseline for subsequent testing work.
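For illustration, the shell steps behind such pipeline nodes might look like the sketch below; the registry address, image name, and test script are assumptions, while BUILD_NUMBER is supplied by Jenkins:

set -e                                            # any failing stage stops the pipeline
IMAGE=registry.example.com/myapp:${BUILD_NUMBER}
docker build -t "$IMAGE" .                        # build stage: create the task image
docker run --rm "$IMAGE" ./run-unit-tests.sh      # test stage: run the tests inside the image
docker push "$IMAGE"                              # publish stage: hand the image over for deployment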
4. The best release environment
Once the application has been tested, we need to release it to the test and production environments. How to use Docker sensibly in this phase is also a challenge: the development team needs to consider how to build a scalable release environment. In practice this environment is a private, Docker-based cloud, and what we really want is a PaaS cloud service that exposes an API. To build such a PaaS, here are a few of the most popular tools you can use to assemble an enterprise-private PaaS service.
1) Apache Mesos + Marathon
Apache Mesos is a cluster resource management and scheduling system; using it in production makes application clustering possible. It is an Apache open-source project that originated at Twitter. In a Mesos cluster we can use ZooKeeper to start three Mesos master services; the three masters exchange information through ZooKeeper and elect a leader, and requests sent to the other two masters are forwarded to the leader. When a Mesos slave starts, it reports its memory, storage space, and CPU resources to the master. Mesos frameworks were originally designed to run jobs such as data analysis; Mesos does not by itself run long-lived services such as an Nginx web server, so we need Marathon to support that requirement. Marathon exposes its own REST API; we can create the following configuration file Docker.json:
{"Container": {"type": "Docker", "Docker": {"image": "Libmesos/ubuntu"}}, "id": "Ubuntu", "Instances": "1", "CPUs": "0 .5 "," mem ":" + "," URIs ": []," cmd ":" While sleep 10; Do date-u +%t; Done "}
and then call:
curl -X POST -H "Content-Type: application/json" http://<master>:8080/v2/apps -d@Docker.json
This creates a web service on the Mesos cluster. For concrete Marathon use cases, refer to the official examples.
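As a quick check (a sketch; "ubuntu" is the id field from Docker.json above), the same REST API can be queried to see whether the application has been scheduled:

curl http://<master>:8080/v2/apps/ubuntu   # returns the app definition and its running tasks as JSON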
[Figure: 0912004.png (image source: Google Image Search)]
2) Google Kubernetes
Kubernetes is Google's container cluster management tool. It introduces two concepts:
Pods: each pod is a collection of containers deployed on the same host, sharing an IP address and storage space, for example combining Apache and Redis into one group of containers.
Labels: service labels that make it easy for pods to find and cooperate with one another (see the sketch below).
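A minimal sketch of these two concepts, assuming the standard httpd and redis images and a hypothetical pod name:

cat > web-pod.yaml <<'EOF'
apiVersion: v1
kind: Pod
metadata:
  name: web
  labels:
    app: web            # label that other components use to select this pod
spec:
  containers:
  - name: apache
    image: httpd
    ports:
    - containerPort: 80
  - name: redis
    image: redis
EOF
kubectl create -f web-pod.yaml   # both containers share the pod's IP address and volumes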
The official architecture design documents give a fuller introduction to the design ideas behind each component. Kubernetes is the only open-source container solution launched on the basis of experience with production-scale deployments, and it is foreseeable that it will become the industry reference standard for container management systems.
[Figure: 0912005.png (image source: Google Image Search)]
3) Panamax
[Figure: 0912006.png]
5. Conclusion
Docker-based integrated deployment is a flexible and simple toolset solution. It overcomes the complexity and difficulty of earlier cluster tools by deploying software with the single, unified concept of the Docker application container. By introducing Docker technology, a development team facing a complex production environment can tailor a release plan that fits its own infrastructure, based on the team's actual situation.
Original article: The Road to Docker (4): Docker Integration Testing and Deployment
This article is from the "Free Tutorials" blog; please keep the source link: http://tutorialonfree.blog.51cto.com/11272822/1747894