[Oracle] Data Pump detailed tutorial (1)-Overview


Starting with Oracle 10g, Oracle provides the more efficient Data Pump (expdp/impdp) for data import and export. The old exp/imp utilities can still be used, but they are no longer recommended. Note that expdp/impdp and exp/imp are incompatible: files exported with exp can only be imported with imp, and files exported with expdp can only be imported with impdp.

Data Pump consists of the following three parts:
Client tools: expdp/impdp
Data Pump API (DBMS_DATAPUMP)
Metadata API (DBMS_METADATA)

We usually equate expdp/impdp with Data Pump, but as the list above shows, they are only one part of it. The real work is done by the two APIs, which run in the background and are rarely noticed. However, if inexplicable errors (such as internal errors) occur, it is usually because these two APIs have become invalid, and you can run scripts to recompile them.

By default, a user can export/import data only within his own schema. To export/import data in other schemas, the user must be granted the following two roles:

[sql] SQL> GRANT DATAPUMP_EXP_FULL_DATABASE TO hr;
      SQL> GRANT DATAPUMP_IMP_FULL_DATABASE TO hr;

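If the two background APIs mentioned above are suspected of being invalid, their status can be checked and they can be recompiled directly, without running the full recompilation scripts. A minimal sketch (object names as in a standard Oracle installation):

```sql
-- Check whether the Data Pump APIs are valid
SELECT object_name, object_type, status
  FROM dba_objects
 WHERE object_name IN ('DBMS_DATAPUMP', 'DBMS_METADATA');

-- Recompile an invalid package body, for example:
ALTER PACKAGE SYS.DBMS_DATAPUMP COMPILE BODY;
```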
Of course, the sys and system accounts, and any user with the DBA role, have these two roles by default.

Data Pump Job: when expdp/impdp runs, the export or import is actually executed as a job. A Data Pump job consists of the following three parts:
Master process (master control process): controls and coordinates the entire job.
Master table: records the metadata of the database objects in the dump file. It is written to the dump file when expdp finishes, and read first by impdp so that the contents of the dump file are known.
Worker processes: perform the actual export and import work. Multiple worker processes can be created automatically to run in parallel as needed, but their number cannot exceed the value of the PARALLEL parameter.

Monitoring job status: the status of a job can be seen in the screen output and the logfile. You can also query DBA_DATAPUMP_JOBS, USER_DATAPUMP_JOBS, or DBA_DATAPUMP_SESSIONS in the database. For long-running jobs, the dynamic view V$SESSION_LONGOPS shows the current completion status and an estimate of how long the job will take. The relevant columns are as follows:
[plain] USERNAME - job owner
        OPNAME - job name
        TARGET_DESC - job operation
        SOFAR - megabytes transferred thus far during the job
        TOTALWORK - estimated number of megabytes in the job
        UNITS - megabytes (MB)
        MESSAGE - a formatted status message of the form:
        'job_name: operation_name : nnn out of mmm MB done'
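The monitoring views above can be queried directly; a minimal sketch (the SYS_EXPORT filter and the percentage calculation are illustrative):

```sql
-- List current Data Pump jobs
SELECT owner_name, job_name, operation, job_mode, state
  FROM dba_datapump_jobs;

-- Estimate the progress of a long-running export job
SELECT username, opname, sofar, totalwork,
       ROUND(sofar / totalwork * 100, 2) AS pct_done
  FROM v$session_longops
 WHERE opname LIKE 'SYS_EXPORT%'
   AND totalwork > 0
   AND sofar <> totalwork;
```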


Creating a Directory: unlike exp/imp, which run on the client, Data Pump runs on the server, and all the files it generates are stored on the server. Therefore, you must first create a directory object in Oracle. The following is an example:
[sql] SQL> CREATE DIRECTORY dpump_dir1 AS '/usr/apps/datafiles';  


After creating the directory object, you must grant read and write permissions on it to the user who will run Data Pump, as shown below:
[sql] SQL> GRANT READ, WRITE ON DIRECTORY dpump_dir1 TO hr;  
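Putting the pieces together, a typical schema-level export and import using the directory object created above might look like the following (dump file and log file names are illustrative; expdp prompts for the password):

```shell
# Export the HR schema to the dpump_dir1 directory object
expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp LOGFILE=hr_exp.log SCHEMAS=hr

# Import it back, e.g. on another database that has the same directory object
impdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp LOGFILE=hr_imp.log SCHEMAS=hr
```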

