Uploading files from a Windows EC2 instance to an S3 bucket using Python Boto3

Source: Internet
Author: User
Tags: python, script, aws, iam

I. Create the VPC endpoint

Why create an endpoint connecting your VPC and S3? Without one, an EC2 instance in the VPC reaches the S3 bucket over the public internet; once the endpoint is associated, the instance reaches the bucket over AWS's internal network. The benefits are twofold: 1. internal traffic incurs no data-transfer costs; 2. the internal network is fast and will not cause our Python script to throw exceptions for network reasons.

VPC → Endpoints: create an endpoint, associate it with the VPC and the S3 service, and associate the subnets.
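The console steps above can also be sketched with Boto3's EC2 client. This is a minimal sketch, not the article's method: the VPC and route-table IDs are placeholders, and note that S3 uses a Gateway-type endpoint (which attaches to route tables). AWS China region service names carry a `cn.` prefix, which the helper below accounts for.

```python
def endpoint_request(vpc_id, route_table_ids, region='cn-north-1'):
    # Build the keyword arguments for ec2.create_vpc_endpoint().
    # AWS China regions use a 'cn.' service-name prefix,
    # e.g. 'cn.com.amazonaws.cn-north-1.s3'.
    prefix = 'cn.' if region.startswith('cn-') else ''
    return {
        'VpcEndpointType': 'Gateway',   # S3 gateway endpoints attach to route tables
        'VpcId': vpc_id,
        'ServiceName': '%scom.amazonaws.%s.s3' % (prefix, region),
        'RouteTableIds': route_table_ids,
    }

# With boto3 and valid credentials (not run here; IDs are placeholders):
# import boto3
# ec2 = boto3.client('ec2')
# ec2.create_vpc_endpoint(**endpoint_request('vpc-xxxxxxxx', ['rtb-xxxxxxxx']))
```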


II. Install the Python 3 interpreter and the Boto3 library on Windows

1. Download the Python 3 installer from https://www.python.org/

2. Double-click to install; the default installation path is "C:\Users\<user>\AppData\Local\Programs\Python\Python36"

3. Configure the environment variables (add the Python and Scripts directories to PATH)

4. Install the Boto3 library (once the environment variables are set, the pip command is available): pip install boto3


III. Generate AWS IAM user keys and configure them

1. IAM → Users: select a user with S3 access permissions, open the Security credentials tab, create an access key, and download the key file locally.

2. Configure AWS key authentication on a Windows instance

a) Create the ~/.aws/credentials file (on Windows this is C:\Users\<user>\.aws\credentials) with the following contents:

[default]
aws_access_key_id = XXXXXX
aws_secret_access_key = XXXXXX

b) Create the ~/.aws/config file with the following contents:

[default]
region=cn-north-1
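Both files use INI syntax, so a quick way to check them for typos is Python's standard configparser, which parses them the same way the SDK's config reader does. A minimal sketch (the key values are placeholders, matching the template above):

```python
import configparser

# The expected contents of ~/.aws/credentials (values are placeholders):
credentials = """\
[default]
aws_access_key_id = XXXXXX
aws_secret_access_key = XXXXXX
"""

# Parse the INI text; a syntax error here would also trip up the AWS SDK.
cfg = configparser.ConfigParser()
cfg.read_string(credentials)
print(cfg['default']['aws_access_key_id'])  # XXXXXX
```

To check the real file, pass its path to cfg.read() instead of using read_string().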


IV. Edit the Python 3 script; the script name is "s3_upload.py"

```python
import os
import datetime
import logging

import boto3
from boto3.s3.transfer import TransferConfig

logging.basicConfig(level=logging.INFO,
                    format='%(asctime)s %(filename)s[line:%(lineno)d] %(levelname)s %(message)s',
                    datefmt='%a, %d %b %Y %H:%M:%S',
                    filename='E:\\xxx\\xxx\\xxx\\aws_upload.log',
                    filemode='a')

delta = datetime.timedelta(days=2)
now = datetime.datetime.now()

s3 = boto3.client('s3')
bucket_name = 'daily-backup'
file_dir = 'e:\\xxx\\xxx\\xxx'

GB = 1024 ** 3
# For single files larger than 10 GB, this value needs to be set
config = TransferConfig(multipart_threshold=5 * GB)

os.chdir(file_dir)
file_list = os.listdir()
for file in file_list:
    # Upload only .zip files
    if file.endswith('.zip'):
        # Upload only files generated two or more days ago
        ctime = datetime.datetime.fromtimestamp(os.path.getctime(file))
        if ctime < (now - delta):
            try:
                s3.upload_file(file, bucket_name, file, Config=config)
            except Exception as e:
                logging.error(e)
                logging.error("%s upload failed." % file)
            else:
                # Upload succeeded; delete the local file
                logging.info("%s upload successful." % file)
                os.remove(file)
```


V. Testing and scheduling the task

1. Manually run the Python script you just edited from the Windows cmd command line: python s3_upload.py

2. If it succeeds, create a Windows scheduled task to upload the files from the local directory to the S3 bucket on a daily schedule.
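Instead of the Task Scheduler UI, the task could also be registered from the command line with schtasks. A sketch only: the task name, run time, and paths below are assumptions, not values from this article.

```shell
:: Hypothetical example: run s3_upload.py every day at 02:00 (paths are placeholders)
schtasks /Create /SC DAILY /ST 02:00 /TN "S3DailyUpload" ^
  /TR "C:\Users\<user>\AppData\Local\Programs\Python\Python36\python.exe E:\xxx\s3_upload.py"
```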


VI. Set the S3 bucket lifecycle

For files uploaded to the S3 bucket, we want files older than 30 days to be deleted on a regular basis; we can set a bucket lifecycle rule to delete expired files automatically.

Add a lifecycle rule
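The same rule can be applied programmatically with put_bucket_lifecycle_configuration. A minimal sketch, assuming the daily-backup bucket from the script above; the rule ID is a made-up name:

```python
def expiration_rule(days=30, rule_id='expire-old-backups', prefix=''):
    # Payload for s3.put_bucket_lifecycle_configuration(): delete every
    # matching object `days` days after it was created.
    return {
        'Rules': [{
            'ID': rule_id,                  # hypothetical rule name
            'Filter': {'Prefix': prefix},   # empty prefix = whole bucket
            'Status': 'Enabled',
            'Expiration': {'Days': days},
        }]
    }

# With boto3 and valid credentials (not run here):
# import boto3
# boto3.client('s3').put_bucket_lifecycle_configuration(
#     Bucket='daily-backup',
#     LifecycleConfiguration=expiration_rule())
```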

