Python Operations on Azure Blob Storage

Source: Internet
Author: User
Tags: azure, sdk



The basic concepts of Azure Blob Storage are described in the article "Azure Basics: Blob Storage", which also shows how to perform basic operations from C# code. Recently I needed to do something similar on Linux, so I decided to use the Azure Storage SDK for Python to manipulate Blob Storage. This way, the same Python code can be used on both Windows and Linux in the future. If you are not yet familiar with the concepts behind Azure Blob Storage, please read the preceding article first.


Install the Azure Storage SDK for Python


The simplest way is to execute the following command on a machine that already has Python and pip installed:

pip install azure-storage
After the installation completes, run pip freeze to check the installed version.


Since the Azure Storage SDK for Python is an open-source project, you can also install it from source; please refer to the official documentation.

Create Blob Container
Since every Blob must live in a Blob Container, our first task is to create one.
The SDK provides an object called BlockBlobService, through which we can create and operate on Blob Containers. The following code creates a Container named "nickcon":
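A minimal sketch of that code, using the same placeholder account name and key as the later examples in this article:

from azure.storage.blob import BlockBlobService

mystoragename = "xxxx"
mystoragekey = "yyyy"
blob_service = BlockBlobService(account_name=mystoragename, account_key=mystoragekey)

# Create the Blob Container; returns True if it was newly created
blob_service.create_container('nickcon')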


The code itself is very simple; account_name and account_key are your storage account name and its access key. We can use the GUI tool Microsoft Azure Storage Explorer to view the result of running the code.


The Blob Container named nickcon has been successfully created.

Upload files
Next we upload a local file to the Blob Container we just created. The Azure SDK provides the following four methods:

create_blob_from_path    # Upload the file at the specified path.
create_blob_from_stream  # Upload the contents of a data stream.
create_blob_from_bytes   # Upload a byte array.
create_blob_from_text    # Upload a string using a specific encoding.
Yes, you read that right: none of the methods has "upload" in its name; they all use "create" instead. This reflects the fact that uploading a file essentially creates a Blob object in the cloud.

from azure.storage.blob import BlockBlobService
from azure.storage.blob import ContentSettings

mystoragename = "xxxx"
mystoragekey = "yyyy"
blob_service = BlockBlobService(account_name=mystoragename, account_key=mystoragekey)

blob_service.create_blob_from_path(
    'nickcon',
    'myblobcortana.jpg',
    'cortana-wallpaper.jpg',
    content_settings=ContentSettings(content_type='image/jpeg'))
This time we also import the ContentSettings type, mainly to specify the content type of the file. Note the second parameter of create_blob_from_path: it is the name we give the new Blob object. The first parameter is the target Container, and the third parameter is the path of the local file to upload. Executing the above script uploads the local wallpaper cortana-wallpaper.jpg to the Azure Blob Container.


The Blob object created in the Container no longer carries the name of the source file, but the name we specified: myblobcortana.jpg.
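The other create_blob_from_* methods follow the same pattern; a minimal sketch, reusing the blob_service object created above (the blob names hello.txt and raw.bin are made up for illustration):

# Upload a string with a given encoding
blob_service.create_blob_from_text('nickcon', 'hello.txt', 'Hello, Blob Storage!')

# Upload a byte array
blob_service.create_blob_from_bytes('nickcon', 'raw.bin', b'\x00\x01\x02\x03')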

Control access
Every file stored in a Blob Container has a corresponding URL; this is the default behavior of Azure Blob Storage, so that a file can in principle be reached from anywhere via its URL. For example, the URL of the myblobcortana.jpg file is:

https://nickpsdk.blob.core.windows.net/nickcon/myblobcortana.jpg
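Rather than assembling this URL by hand, the SDK can also build it for you; a minimal sketch using the blob_service object from earlier:

# Build the public URL of a blob from its container and name
blob_url = blob_service.make_blob_url('nickcon', 'myblobcortana.jpg')
print(blob_url)  # https://<account>.blob.core.windows.net/nickcon/myblobcortana.jpg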
Paste this address directly into the browser's address bar and... ouch, we are greeted with an unceremonious error!

Thinking about it, the error is perfectly reasonable. Otherwise anyone could read the files I store, and there would be no privacy at all; who would pay for Azure Blob Storage then? The explanation is this: by default, the Blob Containers and Blob objects we create are private, meaning they can only be accessed with the account name and access key. If you want to make the content a public resource that everyone can access, you can specify public access when creating the Container, or change the setting after creation. Below we set the nickcon Container to PublicAccess:

from azure.storage.blob import BlockBlobService
from azure.storage.blob import PublicAccess

mystoragename = "xxxx"
mystoragekey = "yyyy"
blob_service = BlockBlobService(account_name=mystoragename, account_key=mystoragekey)

blob_service.set_container_acl('nickcon', public_access=PublicAccess.Container)
Here the PublicAccess type is imported and the set_container_acl method is called to modify the Container's access rights. Refresh the page in the browser again and the image now loads.


Just don't put your private photos in a Blob Container that has been made public!
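If you would rather keep the Container private but still share a single file through a URL, the SDK can also generate a shared access signature (SAS); a minimal sketch, where the one-hour expiry is an arbitrary choice for illustration:

from datetime import datetime, timedelta
from azure.storage.blob import BlobPermissions

# Generate a read-only SAS token valid for one hour
sas_token = blob_service.generate_blob_shared_access_signature(
    'nickcon',
    'myblobcortana.jpg',
    permission=BlobPermissions.READ,
    expiry=datetime.utcnow() + timedelta(hours=1))

# Build a URL that carries the SAS token
sas_url = blob_service.make_blob_url('nickcon', 'myblobcortana.jpg', sas_token=sas_token)
print(sas_url)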

List all files in Blob Container
Checking which files are in a Container is a common operation, and it is easy to do:

generator = blob_service.list_blobs('nickcon')
for blob in generator:
    print(blob.name)
The list_blobs method returns all Blob objects in the Container; the code above prints their names.
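list_blobs also accepts an optional prefix argument, and each returned Blob carries metadata in its properties; a small sketch, where the "photos/" prefix is only an example:

# List only blobs whose names start with a given prefix
for blob in blob_service.list_blobs('nickcon', prefix='photos/'):
    # Each blob exposes properties such as its size in bytes
    print(blob.name, blob.properties.content_length)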

Download Blob object
As with creating Blob objects, there are four methods for downloading them. For brevity we only demonstrate get_blob_to_path; the others are used in a similar way:

blob_service.get_blob_to_path('nickcon', 'myblobcortana.jpg', 'newimage.png')
The second parameter is the name of the Blob object in the Container, and the third parameter is the path to the local file.
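The counterparts get_blob_to_text and get_blob_to_bytes return the content directly instead of writing a file; a minimal sketch, reusing the hypothetical hello.txt blob from the earlier upload sketch:

# Download into memory instead of onto disk
text_blob = blob_service.get_blob_to_text('nickcon', 'hello.txt')
print(text_blob.content)        # the decoded string

bytes_blob = blob_service.get_blob_to_bytes('nickcon', 'myblobcortana.jpg')
print(len(bytes_blob.content))  # raw bytes of the image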

Delete Blob object
Where there is create, there is naturally delete. The code is so simple that it needs no further explanation:

blob_service.delete_blob('nickcon', 'myblobcortana.jpg')
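Deleting a Blob that does not exist raises an error, so it can be worth checking first; the Container itself can be removed in the same spirit. A minimal sketch:

# Check whether the blob is still there before deleting it
if blob_service.exists('nickcon', 'myblobcortana.jpg'):
    blob_service.delete_blob('nickcon', 'myblobcortana.jpg')

# Delete the whole Container (careful: this removes every blob in it)
blob_service.delete_container('nickcon')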
Back up files in Blob Container
Yes, you read that right!
We trust the reliability of cloud storage, but it is still wise to back up important data to other storage. The following code backs up the contents of all Blob Containers in an Azure Storage Account to a local disk:

from azure.storage.blob import BlockBlobService
import os

mystoragename = "xxxx"
mystoragekey = "yyyy"
blob_service = BlockBlobService(account_name=mystoragename, account_key=mystoragekey)

# Download all files in a Blob Container
def downloadFilesInContainer(blobContainName):
    generator = blob_service.list_blobs(blobContainName)
    for blob in generator:
        # Get the directory path of the blob file
        blobDirName = os.path.dirname(blob.name)
        # Add the name of the Blob Container as a first-level directory
        newBlobDirName = os.path.join(blobContainName, blobDirName)
        # Check if the directory exists locally; create it if it does not
        if not os.path.exists(newBlobDirName):
            os.makedirs(newBlobDirName)
        localFileName = os.path.join(blobContainName, blob.name)
        blob_service.get_blob_to_path(blobContainName, blob.name, localFileName)

# Get user-owned Blob Container
containerGenerator = blob_service.list_containers()
for con in containerGenerator:
    downloadFilesInContainer(con.name)
One thing to note is that blob.name contains the full path of the file within the Container. For example, if a file's path inside the Blob Container is abc/test.txt, then its blob.name is abc/test.txt. To preserve the name and path of each file, we need to create the corresponding directory structure locally.

To sum up
The final demo implements a simple backup of all Blob files. Because Microsoft wraps the relevant interfaces so cleanly, the code stays very short. The advantage of using Python is that the same code runs on different platforms, which is great when you need to do the same thing on different operating systems!
