Value of Docker and analysis of application scenarios

In recent years Docker has been in the limelight in the IT industry: it attracts plenty of eyeballs on the major technical forums, and there have been quite a few introductions and trials both inside and outside the company. It looks like a glamorous technology, one that will seemingly bring disruptive innovation to cloud computing, service deployment, operations, and other fields.
Having recently reviewed some documentation and dug deeper into Docker's technical details, I found that Docker really is the right tool for certain needs, but it is by no means a panacea.
I. What Is Docker
Official definition:
Develop, Ship and Run Any Application, Anywhere
Docker is a platform for developers and sysadmins to develop, ship, and run applications. Docker lets you quickly assemble applications from components and eliminates the friction that can come when shipping code. Docker lets you get your code tested and deployed into production as fast as possible.
Interpretation:
The official definition positions Docker as a platform for application packaging, deployment, and running, not as a virtualization technology;
Docker itself is not a revolutionary innovation. Its core is the container, an idea put forward many years ago and already implemented on Linux, Unix, and FreeBSD; the mature implementation on Linux is LXC;
Docker is a layer of tooling on top of LXC (Linux Containers) and cgroups: by wrapping LXC, cgroups, and the related system commands, it lets users work with Linux containers very conveniently;
Docker Hub provides version management and distribution capabilities for application packages;
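The kernel primitives that LXC and Docker wrap can be seen directly from the shell. As a rough, heavily simplified sketch (not Docker's actual code path), util-linux's `unshare` creates the same kinds of namespaces; this assumes a reasonably modern kernel with unprivileged user namespaces enabled:

```shell
# Roughly the kernel primitives that LXC/Docker build on: namespaces.
# Create new user, PID, and mount namespaces, remount /proc so that
# $$ and `ps` reflect the new PID namespace, and map our UID to root.
# Works without real root on kernels that allow unprivileged user
# namespaces.
unshare --user --map-root-user --pid --fork --mount-proc \
    sh -c 'echo "pid inside namespace: $$"; id -un'
```

Inside the namespace the shell sees itself as PID 1 and, thanks to the UID mapping, as root, even though nothing on the host has changed; cgroups then add resource accounting and limits on top of this isolation.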
II. The Value of Docker Compared with Similar Technologies
The core value of containers lies in isolation and encapsulation. There are comparable solutions on both fronts; a brief analysis and comparison follows:
1. Isolation
Docker achieves isolation through containers and is based on LXC; the corresponding alternative solution is virtualization. The comparison is as follows:
Advantages:
Very lightweight
Compared with creating or starting a virtual machine, a container can be considered to have essentially no overhead, roughly equivalent to creating a process on Linux;
It is somewhat exaggerated to say so, but a single machine can easily start a thousand containers, while it certainly cannot start a thousand virtual machines;
However, the premise of this comparison is that 1,000 applications need to be isolated on a single machine; if only two applications need isolation and each runs 500 processes, the fair comparison is two containers versus two VMs;
Disadvantages:
Weaker isolation
Docker's isolation sits on top of the Linux kernel and is essentially separation between processes, which determines that:
1) Docker supports only 64-bit Linux; it cannot support Windows, Unix, BSD, and so on, whereas virtualization can host all mainstream operating systems;
2) Because Docker isolates at the kernel level and relies on kernel features, it can only emulate reasonably recent Linux distributions; you will certainly not find an image for something like Ubuntu 9.04;
3) A container uses the same kernel as its host, so the container's kernel cannot be customized;
Hardware resource metering and quotas
Through cgroups, Docker has some ability to control a container's resource usage, but its metering and isolation of network, disk, and CPU usage are not yet as mature as a VM's;
In addition, from the user's point of view a VM can be quite transparent and treated directly as a host, while a container needs special handling for network connectivity and data storage;
Therefore, compared with virtualization on the isolation front, Docker's advantage is low performance overhead and its disadvantage is weaker isolation;
If you need to isolate as many applications as possible on a single machine, Docker is preferred;
If you need isolation that is as thorough as possible, with hardware resources precisely controlled and metered, and the number of isolated instances is small, then virtualization is preferred;
2. Encapsulation
Judging from the Docker logo and the official definition, its main strength is not isolation but encapsulation, that is, packaging, deploying, and running applications;
Docker's advantages in encapsulation are mainly:
Package size
Compared with a VM image, a Docker image really is small; the official CentOS image is only about 220 MB (it has been trimmed down to the most basic CentOS, without even the ifconfig command; once you install the common tools, you will find it is not so small anymore);
Highly customizable application packaging
The application package can be defined through a Dockerfile; what is deployed into the image can be highly customized and encapsulated, the external interface is convergent and clear, and the user does not need to care about the details;
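As a minimal sketch of what such a definition looks like (the base image, package, and config file here are hypothetical, not taken from the original text):

```dockerfile
# Hypothetical Dockerfile: the whole deployment is declared here,
# and the resulting image exposes only a small, clear interface.
FROM centos:7
RUN yum install -y nginx && yum clean all
# Hypothetical config file baked into the image at build time.
COPY nginx.conf /etc/nginx/nginx.conf
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
```

Everything is fixed at build time, so the consumer of the image only needs `docker build` and `docker run` and never sees the installation details.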
Image registries and version management
Building and using public and private registries, as well as version management, is very mature, and images in a registry can be easily shared, distributed, and deployed;
Separation of application and runtime
The image we see is static and read-only; once run, an image becomes a container, and changes made at run time affect only the current container, never the image;
As you can see from the above, Docker's design philosophy is forward-looking: compared with the deployment capabilities of VM images, it has obvious advantages and few drawbacks.
However, from a package-deployment perspective, the right comparison for Docker is not the VM but the rpm+yum and deb+apt combinations.
Compared with these two combinations, you will find the capabilities surprisingly similar, though with some differences:
Packaging capability:
With a Docker image you can easily run Ubuntu on top of CentOS, and very cheaply; rpm and deb cannot do this;
This is very valuable in testing, learning, or small-scale PaaS scenarios;
But for any larger-scale application, I think such a deployment is unlikely;
Packaging granularity and dependency management:
Docker packages everything above the kernel into one piece: the Linux user-space environment, system libraries, and the application. A Docker image can be understood as depending externally only on the Linux kernel, which greatly improves portability;
Rpm and deb are packaged at a finer granularity than Docker: a package contains only the application itself, external dependencies are merely recorded rather than stored in the package, and auxiliary tools such as yum and apt make those external dependencies easy to resolve. The advantage is lightness; in theory the package is certainly smaller than a Docker image. In a large-scale deployment, releasing a 20 MB rpm to thousands of machines versus releasing a 200 MB Docker image should still show a sizable efficiency gap;
In practice, when it comes to external dependencies, a C++ program can solve them simply by static linking, and programs in Java, PHP, or Go basically have no system-library dependencies; their cross-platform ability even surpasses Docker's;
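The "recorded, not stored" difference shows up directly in an rpm spec file. A hypothetical fragment (the package and dependency names are illustrative, not from the original text) looks roughly like this:

```
# Hypothetical .spec fragment: dependencies are declared as metadata,
# not shipped inside the package.
Name:     myapp
Version:  1.0
Release:  1%{?dist}
Summary:  Example service

# yum resolves and installs these on the target machine at deploy
# time; they add nothing to the package's own size.
Requires: openssl-libs >= 1.0.2
Requires: zlib
```

A Docker image, by contrast, carries those libraries (and the rest of user space) inside the image itself, which is where much of the 20 MB versus 200 MB gap comes from.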
Cross-platform:
Across the major Linux distributions you can use Docker in the same way to manage packaging and deployment, whereas rpm and deb each have their own camps of distributions and are not interchangeable between them;
But Docker's own learning and deployment costs are higher than rpm's or deb's;
So, on packaging, Docker is far better than the VM and roughly matches the rpm+yum and deb+apt combinations; from the user's point of view it is simply one more choice;
Setting isolation aside: suppose you need to run a single MySQL instance on one machine. Would you install Docker and then deploy MySQL through it, or just run yum install mysql directly?
In general, Docker's core value lies in isolation and encapsulation, and in the good balance it strikes between the two; it is foreseeable that it will be the best solution in the small-scale PaaS field.
III. Thoughts on Applying Docker in the Company
Since Docker's core value lies in isolation and encapsulation, a discussion of applying Docker to the company's business should also look at these two aspects.
Which scenarios could use the isolation capability:
In research, development, and testing, deploying and running multiple versions in parallel on a single machine;
Internal PaaS-like services, such as TDW computing and Blue Whale;
For massive-scale production services, a single service can basically saturate a whole machine, so there is essentially no isolation requirement at all;
Which scenarios could use the encapsulation capability:
Every business needs packaging and deployment, but each major business already has a very mature package-release system that integrates development, compilation, testing, and release with architecture, configuration, routing, and other capabilities. Tearing down the existing release system and rebuilding it on Docker would mean considerable development and user-learning costs, with no obvious technical benefit.
For a newly built technical framework and operations system, Docker would be a better choice, although personally I would still prefer the CentOS + yum + rpm combination.
IV. Other
IT technology keeps evolving and new technologies and components keep emerging; using something like Docker inevitably requires keeping the underlying runtime environment up to date with the industry.
Just to stay in sync with the industry, the technical team still has to devote enormous effort to work that yields no business output.