1. Simplified configuration
This is the main use case that Docker itself advertises. The biggest benefit of a virtual machine is the ability to run a wide variety of platforms (software, systems) on your own hardware infrastructure, and Docker provides the same capability with far less overhead. It lets you put the runtime environment and its configuration into code and deploy it, and the same Docker configuration can be reused across different environments, which reduces the coupling between hardware requirements and the application environment.
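As a rough sketch (the base image, paths, and names below are illustrative, not taken from the article), "configuration in code" usually means a small Dockerfile that describes the runtime environment and builds into an image that runs the same way everywhere:

    # Dockerfile: the runtime environment and its configuration, expressed as code
    # (base image, paths, and names are illustrative)
    FROM python:3.12-slim
    ENV APP_ENV=production
    WORKDIR /app
    COPY app/ /app/
    CMD ["python", "main.py"]

Built once with docker build -t myapp:1.0 ., the same image can then run on a laptop, a test server, or production hardware without repeating the environment setup.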
2. Code pipeline management
The previous scenario is also a great help in managing the code pipeline. Code passes through many intermediate environments on its way from the developer's machine to final deployment in production, and each of those environments has its own small differences. Docker gives the application a consistent environment all the way from development to release, which makes managing the code pipeline much easier.
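A minimal sketch of such a pipeline, assuming a shared registry at registry.example.com (a placeholder, not something named in the article): the exact image that was built and tested is the one that reaches production.

    # Build the image once, on the build machine
    docker build -t myapp:1.0 .

    # Push it to a shared registry (address is illustrative)
    docker tag myapp:1.0 registry.example.com/myapp:1.0
    docker push registry.example.com/myapp:1.0

    # Staging and production pull and run exactly the same image
    docker pull registry.example.com/myapp:1.0
    docker run -d --name myapp registry.example.com/myapp:1.0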
3. Improve development efficiency
This brings some additional benefits: Docker can improve developer productivity. In a development environment we want to get two things right: first, the development environment should be as close as possible to the production environment; second, we should be able to build that environment quickly.
Ideally, to achieve the first goal, we would run each service in its own virtual machine so that its behavior mirrors what we would see in production. However, we do not want to depend on a network connection every time, and recompiling over a remote connection is particularly tedious. This is where Docker excels: development machines usually have limited memory, and where we previously had to add memory just to run a few virtual machines, Docker now makes it easy to run dozens of services in containers on a single development machine.
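As an illustrative example (the service names and images are placeholders), a set of backing services can be started on a laptop in seconds instead of provisioning a virtual machine for each:

    # Run several backing services side by side on a development machine
    docker run -d --name dev-db -e POSTGRES_PASSWORD=devpass postgres:16-alpine
    docker run -d --name dev-cache redis:7-alpine
    docker run -d --name dev-queue rabbitmq:3-management

    # Tear the whole environment down just as quickly
    docker rm -f dev-db dev-cache dev-queue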
4. Isolate the application
There are a number of reasons why you might choose to run different applications on a single machine, such as the previously mentioned scenarios for improving development efficiency.
Two motivations come up frequently here: consolidating servers to reduce cost, and splitting a monolithic application into individual, loosely coupled services.
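A small sketch of this kind of isolation (image names and ports are illustrative): two unrelated applications share one host while keeping separate filesystems, process trees, and dependency stacks.

    # Two unrelated applications on one host, each in its own container
    docker run -d --name app-a -p 8081:80 nginx:1.25-alpine
    docker run -d --name app-b -p 8082:80 httpd:2.4-alpine

    # Each container sees only its own filesystem and libraries
    docker exec app-a cat /etc/os-release
    docker exec app-b cat /etc/os-release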
5. Consolidating servers
Just as virtual machines are used to consolidate multiple applications, Docker's ability to isolate applications makes it possible to consolidate multiple servers and reduce cost. Because there is no memory overhead from running multiple operating systems, and unused memory can be shared across instances, Docker offers a denser server consolidation solution than virtual machines.
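A sketch of packing workloads more densely under explicit resource limits (the limits and images below are illustrative):

    # Cap each consolidated workload so that many can share one server
    docker run -d --name svc-a --memory 256m --cpus 0.5 nginx:1.25-alpine
    docker run -d --name svc-b --memory 256m --cpus 0.5 redis:7-alpine

    # Check what the containers are actually consuming
    docker stats --no-stream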
6. Ability to debug
Docker provides a number of tools that are not necessarily specific to containers but work well with them. They offer features such as setting checkpoints for a container, assigning versions, and viewing the differences between two containers, all of which can help you track down a bug.
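These are standard docker subcommands; a short sketch of how they can help while chasing a bug (the container and image names are placeholders):

    # List the files a running container has changed relative to its image
    docker diff mycontainer

    # Checkpoint the container's current state as a new, versioned image
    docker commit mycontainer myapp:debug-snapshot

    # Compare how two image versions were built, layer by layer
    docker history myapp:1.0
    docker history myapp:debug-snapshot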
7. Multi-tenant environment
Another interesting use case for Docker is in multi-tenant applications, where it can help avoid rewriting critical application code. A concrete example of this scenario is developing a fast, easy-to-use multi-tenant environment for IoT applications. The code base underlying this kind of multi-tenancy is very complex and hard to work with, and re-architecting such an application costs both time and money.
With Docker, you can cheaply and easily create an isolated environment that runs multiple instances of the application tier for each tenant, thanks to how quickly Docker environments start up and to its efficient diff commands.
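As an illustrative sketch (tenant names, image, and ports are invented for the example), each tenant gets its own instance of the application tier, isolated from all the others:

    # One isolated application-tier instance per tenant
    docker run -d --name tenant-acme -e TENANT=acme -p 9001:8000 myapp:1.0
    docker run -d --name tenant-globex -e TENANT=globex -p 9002:8000 myapp:1.0

    # Adding a tenant is one more container; removing one is instant
    docker rm -f tenant-globex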
8. Rapid Deployment
Before virtual machines, bringing new hardware resources online took days. Virtualization brought that down to minutes, and Docker brings it down to seconds, because creating a container only starts a process rather than booting an entire operating system. This is exactly the kind of speed that companies like Google and Facebook value.
You can create and destroy resources in the data center without worrying about the overhead of bringing them back up. A typical data center runs at only about 30% resource utilization; by using Docker and allocating resources more effectively, that utilization can be raised.
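The difference is easy to observe on any machine with Docker installed; this small sketch simply times how long it takes to start and discard a container process:

    # Starting a container starts a process, not an operating system
    time docker run --rm alpine:3.20 echo "up and running"

    # Create and destroy capacity on demand, with nothing to reboot afterwards
    docker run -d --name burst-worker alpine:3.20 sleep 300
    docker rm -f burst-worker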
This article is from the "Cloud Life" blog; please keep this source: http://ovcer.blog.51cto.com/1145188/1617924
Eight real-world Docker scenarios