AWS DynamoDB Data Export to S3

Source: Internet
Author: User
Tags: dynamodb, aws, data pipeline, amazon dynamodb

This section describes how to export data from one or more DynamoDB tables to an Amazon S3 bucket. Before you run the export, you must create the S3 bucket in advance.
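
The procedure below uses the console, but if you prefer to create the bucket from code, the following is a minimal boto3 sketch; the bucket name and region are placeholders rather than values taken from this article.

    import boto3

    # Placeholder values; replace with your own globally unique bucket name and region.
    OUTPUT_BUCKET = "mybucket"   # will hold both the export output and the logs
    REGION = "us-east-1"

    s3 = boto3.client("s3", region_name=REGION)

    # Outside us-east-1, S3 requires an explicit LocationConstraint.
    if REGION == "us-east-1":
        s3.create_bucket(Bucket=OUTPUT_BUCKET)
    else:
        s3.create_bucket(
            Bucket=OUTPUT_BUCKET,
            CreateBucketConfiguration={"LocationConstraint": REGION},
        )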

Note: If you have not used AWS Data Pipeline before, you will need to create two IAM roles before running the following procedure. For more information, see Creating IAM Roles for AWS Data Pipeline.
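
As a rough illustration of what those two roles involve, here is a boto3 sketch that creates them with the default names used later in this procedure (DataPipelineDefaultRole and DataPipelineDefaultResourceRole). The trust relationships and managed policy ARNs shown are assumptions based on the standard Data Pipeline service roles; verify them against Creating IAM Roles for AWS Data Pipeline before relying on them.

    import json
    import boto3

    iam = boto3.client("iam")

    # Trust policy letting AWS Data Pipeline (and the EMR clusters it launches)
    # assume the pipeline role.
    pipeline_trust = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": ["datapipeline.amazonaws.com",
                                      "elasticmapreduce.amazonaws.com"]},
            "Action": "sts:AssumeRole",
        }],
    }

    # Trust policy letting EC2 instances launched by the pipeline assume the resource role.
    resource_trust = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }

    iam.create_role(
        RoleName="DataPipelineDefaultRole",
        AssumeRolePolicyDocument=json.dumps(pipeline_trust),
    )
    iam.create_role(
        RoleName="DataPipelineDefaultResourceRole",
        AssumeRolePolicyDocument=json.dumps(resource_trust),
    )

    # Managed policy ARNs below are assumptions about the AWS-provided service
    # policies for Data Pipeline; confirm them in the IAM console.
    iam.attach_role_policy(
        RoleName="DataPipelineDefaultRole",
        PolicyArn="arn:aws:iam::aws:policy/service-role/AWSDataPipelineRole",
    )
    iam.attach_role_policy(
        RoleName="DataPipelineDefaultResourceRole",
        PolicyArn="arn:aws:iam::aws:policy/service-role/AmazonEC2RoleforDataPipelineRole",
    )

    # The resource role also needs an instance profile of the same name so EC2 can use it.
    iam.create_instance_profile(InstanceProfileName="DataPipelineDefaultResourceRole")
    iam.add_role_to_instance_profile(
        InstanceProfileName="DataPipelineDefaultResourceRole",
        RoleName="DataPipelineDefaultResourceRole",
    )
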
Export data from DynamoDB to S3

  1. Sign in to the AWS Management Console and open the DynamoDB console at:

    https://console.aws.amazon.com/dynamodb/

  2. On the Amazon DynamoDB Tables page, click Export/Import.
  3. On the Export/Import page, select the table you want to export and click Export from DynamoDB.
  4. On the Create Export Table Data Pipeline(s) page, follow these steps:
    1. In the S3 Output Folder text box, enter an Amazon S3 URI; the exported file will be stored in the corresponding folder in S3. For example: s3://mybucket/exports
      The URI should follow the format s3://bucketname/folder, where:
      • bucketname   is the name of the bucket in S3
      • folder   is the name of a folder in that bucket. If the folder does not exist, it is created automatically.

        If you do not specify a folder name, one is assigned automatically in the form s3://bucketname/region/tablename.

    2. In the S3 Log Folder text box, enter an Amazon S3 URI; logs from the export process are stored in the corresponding folder. For example: s3://mybucket/logs/
      The URI format for the S3 Log Folder is the same as for the S3 Output Folder.
    3. In the Throughput Rate text box, you can select a percentage.

      This ratio is the upper limit on the read throughput the export may consume. For example, if the read throughput of the table you are exporting is 20 and you set the percentage to 40%, the throughput consumed by the export will not exceed 8 (see the short sketch after this procedure).
      If you are exporting multiple tables, this throughput rate is applied to each table.

    4. In the Execution Timeout text box, enter the timeout period for the export task. If the export task has not finished within this period, the task fails.
    5. In the Send Notifications To text box, enter an email address.

      After the pipeline is created, you will receive an email invitation to subscribe to Amazon SNS. If you accept the invitation, you will receive an email notification each time the export operation runs.

    6. For the Schedule option, select one of the following:
      • One-time Export - the export task runs immediately after the pipeline is created.

      • Daily Export - the export task runs at the time you specify and repeats at that time every day.
    7. For Data Pipeline Role, select DataPipelineDefaultRole.
    8. For Resource Role, select DataPipelineDefaultResourceRole.
  5. Confirm the above settings and click Create Export Pipeline.
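
As a small aid for the Throughput Rate arithmetic mentioned in the procedure, the sketch below reproduces the worked example (a table with read throughput 20 and a rate of 40% caps the export at 8). The function is purely illustrative and not part of any AWS API.

    def export_read_capacity_cap(provisioned_read_capacity, throughput_rate_percent):
        """Upper bound on the read capacity units the export may consume."""
        return provisioned_read_capacity * (throughput_rate_percent / 100.0)

    # Example from the procedure: read throughput 20, rate 40% -> cap of 8.
    print(export_read_capacity_cap(20, 40))  # 8.0
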
Your pipeline will now be created. This process can take a few minutes to complete. To view the current status, see Managing Export and Import Pipelines. If the schedule you selected is One-time Export, the export task runs immediately after the pipeline is created successfully.

If you chose Daily Export, the export task runs at the specified time and repeats at that time every day.
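
If you would rather poll the pipeline state from code than from the console page mentioned above, a minimal boto3 sketch follows; the '@pipelineState' field key is an assumption about the describe output and may need adjusting.

    import boto3

    dp = boto3.client("datapipeline")

    # List every pipeline in this account/region and print its current state.
    pipeline_ids = [p["id"] for p in dp.list_pipelines()["pipelineIdList"]]
    if pipeline_ids:
        for desc in dp.describe_pipelines(pipelineIds=pipeline_ids)["pipelineDescriptionList"]:
            state = next(
                (f.get("stringValue") for f in desc["fields"] if f["key"] == "@pipelineState"),
                "UNKNOWN",
            )
            print(desc["name"], state)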

When the export task finishes, you can go to the Amazon S3 console to view the export file. The file will be in a folder named after your table, and the file name will be in the format YYYY-MM-DD_HH.MM.
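
To confirm the export landed where you expect without opening the S3 console, a short boto3 listing sketch follows; the bucket name and prefix match the example URI s3://mybucket/exports used earlier and should be replaced with your own output folder.

    import boto3

    s3 = boto3.client("s3")

    # List everything under the output folder, e.g. exports/<table name>/YYYY-MM-DD_HH.MM.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket="mybucket", Prefix="exports/"):
        for obj in page.get("Contents", []):
            print(obj["Key"], obj["Size"])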

The internal structure of the export file is described in Verify Data Export File.
