[Docker] Managing data in Docker containers


Previously, we introduced the basic concepts of Docker, learned how to work with Docker images, and learned how to link containers together. This section describes how to manage data in and between containers.

We will look at the two main ways to manage data in Docker:

  • Data Volume
  • Data Volume container

Data Volume

A data volume is a specially designated directory that bypasses the Union File System (UFS) to provide persistent or shared data for one or more containers.

  • Data volumes can be shared and reused between containers.
  • You can directly modify the content in the data volume.
  • Updating the image does not change the content of the data volume.
  • Data volumes persist until no containers use them.

Add a data volume

You can add a data volume to a container with the -v flag of the docker run command. The -v flag can be used multiple times to mount multiple data volumes.

$ sudo docker run -d -P --name web -v /webapp training/webapp python app.py

This creates a /webapp volume inside the container.

Tip: You can also use the VOLUME instruction in a Dockerfile to add one or more new data volumes to any container created from that image.
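For illustration, a minimal Dockerfile using the VOLUME instruction might look like this (the base image and paths here simply follow the examples above):

```dockerfile
FROM training/webapp
# Declare /webapp as a data volume; every container created from
# this image will get a volume mounted at /webapp.
VOLUME ["/webapp"]
CMD ["python", "app.py"]
```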

 

Mount a host directory as a data volume

In addition, you can use the -v flag to mount a directory from the host machine into the container.


$ sudo docker run -d -P --name web -v /src/webapp:/opt/webapp training/webapp python app.py

This mounts the local directory /src/webapp into the container at /opt/webapp. This is very useful for testing; for example, we can mount our source code into a container, then modify the source and watch how the application behaves. The host directory must be an absolute path, and Docker creates it automatically if it does not exist.

Tip: Host-directory mounting cannot be used in a Dockerfile, because Dockerfiles are meant to be portable and shareable; a host-specific dependency could prevent the container from working on every host.

By default, Docker mounts data volumes with read-write permissions, but we can also mount them read-only.

$ sudo docker run -d -P --name web -v /src/webapp:/opt/webapp:ro training/webapp python app.py

Here we mount the same /src/webapp directory as above, but add the ro option to specify that the mount should be read-only.

 

Create and mount a data volume container

If you want to share persistent data between containers, or want to use that data from non-persistent containers, the best approach is to create a named data volume container and mount the data from it.

Create a named container with a volume to share data:


$ sudo docker run -d -v /dbdata --name dbdata training/postgres

You can then use the --volumes-from flag in another container to mount the /dbdata volume:


$ sudo docker run -d --volumes-from dbdata --name db1 training/postgres

You can mount the /dbdata volume in yet another container at the same time:


$ sudo docker run -d --volumes-from dbdata --name db2 training/postgres

You can use the --volumes-from flag multiple times to combine data volumes from several containers.

You can also extend the chain by mounting volumes from a container (such as db1 or db2) that itself mounts volumes from the dbdata container:


$ sudo docker run -d --name db3 --volumes-from db1 training/postgres

 

Back up, restore, and migrate data

Another useful function of data volume containers is backing up, restoring, or migrating data. Use the --volumes-from flag to create a container that mounts the volume you want to back up:


$ sudo docker run --volumes-from dbdata -v $(pwd):/backup ubuntu tar cvf /backup/backup.tar /dbdata

Here we launch a new container that mounts the data volume from the dbdata container and mounts a local directory as /backup. Finally, we run a tar command to back up the dbdata volume into /backup/backup.tar. When the command finishes, the container stops, leaving behind a backup of dbdata.

You can then restore the data to the same container, or to another one. First create a new container:


$ sudo docker run -v /dbdata --name dbdata2 ubuntu /bin/bash

Then extract the backup file into the new container's data volume:


$ sudo docker run --volumes-from dbdata2 -v $(pwd):/backup busybox tar xvf /backup/backup.tar

You can use this technique with your preferred tools to automate data backup, migration, and recovery.
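Setting Docker aside for a moment, the tar round-trip that the backup and restore containers perform can be sketched locally (the paths under /tmp are purely illustrative):

```shell
# Simulate a /dbdata volume with a file worth backing up
mkdir -p /tmp/voldemo/dbdata
echo "hello" > /tmp/voldemo/dbdata/data.txt

# Back up: write the same tar archive the backup container
# would place in the mounted /backup directory
cd /tmp/voldemo
tar cf backup.tar dbdata

# Restore: extract the archive into a fresh location, as the
# restore container would extract it into dbdata2's new volume
mkdir -p restore
tar xf backup.tar -C restore

cat restore/dbdata/data.txt   # prints "hello"
```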

 
