This section describes how to export data from one or more DynamoDB tables to an Amazon S3 bucket. Before you perform the export, you must create the S3 bucket in advance.
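The bucket can be created ahead of time from the S3 console or with an AWS SDK. As a rough sketch using boto3 (the bucket name `mybucket` and region are examples, and `bucket_params` is a hypothetical helper), note the S3 quirk that `CreateBucketConfiguration` must be omitted for `us-east-1`:

```python
# Sketch: build the kwargs for boto3's S3 create_bucket call.
# AWS quirk: us-east-1 must NOT be passed as a LocationConstraint;
# every other region requires a CreateBucketConfiguration.
def bucket_params(name, region):
    params = {"Bucket": name}
    if region != "us-east-1":
        params["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return params

# Usage (requires AWS credentials; "mybucket" is an example name):
# import boto3
# boto3.client("s3", region_name="eu-west-1").create_bucket(
#     **bucket_params("mybucket", "eu-west-1"))
```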
Note: If you haven't used AWS Data Pipeline before, you'll need to create two IAM roles before following the procedure below. For more information, see Creating IAM Roles for AWS Data Pipeline.
Export Data from DynamoDB to S3
- Sign in to the AWS Management Console and open the DynamoDB console at https://console.aws.amazon.com/dynamodb/.
- On the Amazon DynamoDB Tables page, click Export/Import.
- On the Export/Import page, select the table you want to export and click Export from DynamoDB.
- On the Create Export Table Data Pipeline(s) page, do the following:
  - In the S3 Output Folder text box, enter an Amazon S3 URI; the export file will be stored in the corresponding folder in S3. For example:
    s3://mybucket/exports
    The URI should follow the format s3://bucketname/folder, where:
    - bucketname is the name of the S3 bucket.
    - folder is the name of a folder in that bucket. If the folder does not exist, it is created automatically. If you do not specify a folder name, one is assigned automatically in the form s3://bucketname/region/tablename.
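The default naming rule above can be sketched as a small helper (a hypothetical function for illustration, not part of any AWS SDK):

```python
def default_output_uri(bucket, region, table):
    """Default S3 output location when no folder name is given:
    s3://bucketname/region/tablename"""
    return "s3://{}/{}/{}".format(bucket, region, table)
```

For example, `default_output_uri("mybucket", "us-east-1", "Forum")` yields `s3://mybucket/us-east-1/Forum` (the bucket and table names here are examples).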
- In the S3 Log Folder text box, enter an S3 URI; the log of the export process will be stored in the corresponding folder. For example:
  s3://mybucket/logs/
  The format of the S3 Log Folder URI is the same as that of the S3 Output Folder.
- In the Throughput Rate text box, select a percentage. This sets the upper limit on the read throughput consumed during the export. For example, suppose you export a table with a provisioned read throughput of 20 and set the percentage to 40%; the throughput consumed by the export will then not exceed 8.
  If you are exporting multiple tables, this throughput rate applies to each table individually.
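The throughput cap described above is simply the provisioned read throughput multiplied by the chosen percentage. A minimal sketch (the helper name is hypothetical):

```python
def export_read_limit(provisioned_reads, rate_percent):
    """Upper bound on the read throughput an export may consume,
    given the table's provisioned reads and the Throughput Rate percentage."""
    return provisioned_reads * rate_percent / 100.0

# The example from the text: a table with read throughput 20 at 40%
# may consume at most 8 units during the export.
```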
- In the Execution Timeout text box, enter the timeout for the export task. If the export does not complete within this period, the task fails.
- In the Send notifications to text box, enter an email address. After the pipeline is created, you will receive an email invitation to subscribe to an Amazon SNS topic; if you accept, you will receive an email notification each time an export operation runs.
- For the Schedule option, select one of the following:
  - One-time Export: the export task runs immediately after the pipeline is created.
  - Daily Export: the export task runs at the time you specify, and repeats at that time every day.
- For Data Pipeline Role, select DataPipelineDefaultRole.
- For Resource Role, select DataPipelineDefaultResourceRole.
- Confirm the above settings and click Create Export Pipeline.
Your pipeline will now be created; this process may take several minutes to complete. To view the current status, go to Managing Export and Import Pipelines. If you selected One-time Export, the export task runs as soon as the pipeline is created successfully. If you selected Daily Export, the task runs at the specified time and repeats at that time every day. When the export task finishes, you can go to the Amazon S3 console to view the export file. The file will be in a folder named after your table, with a filename in this format:
YYYY-MM-DD_HH.MM. The internal structure of the file is described in Verify Data Export File.
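The YYYY-MM-DD_HH.MM naming pattern can be reproduced with Python's `strftime`; the helper name here is an illustration, not part of the export tooling:

```python
from datetime import datetime

def export_filename(ts):
    """Format a timestamp the way the export task names its output file:
    YYYY-MM-DD_HH.MM"""
    return ts.strftime("%Y-%m-%d_%H.%M")

# e.g. an export finishing at 14:30 on 5 Jan 2016 produces "2016-01-05_14.30"
```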