Oracle Data Pump in Detail (1): Overview


Since 10g, Oracle has provided the more efficient Data Pump (i.e., EXPDP/IMPDP) for export and import. The old exp/imp utilities can still be used, but they are no longer recommended. Note: EXPDP/IMPDP and exp/imp are incompatible; a file exported with exp can only be imported with imp, and a file exported with EXPDP can only be imported with IMPDP.

Components of Data Pump

Data Pump has the following three components:

Client Tools: EXPDP/IMPDP

Data Pump API (i.e., DBMS_DATAPUMP)

Metadata API (i.e., DBMS_METADATA)

Normally we equate EXPDP/IMPDP with Data Pump, but as the list above shows, they are only the client part; the actual work is done by the two APIs. Because the APIs run hidden in the background, they usually receive little attention, but inexplicable failures (such as internal errors) are often caused by corruption in these two packages, and the fix is to run the scripts that recompile them.
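To illustrate that EXPDP is just a front end, here is a minimal PL/SQL sketch that drives the DBMS_DATAPUMP API directly. The schema name HR and the file name hr_api.dmp are illustrative assumptions; DPUMP_DIR1 is a directory object like the one created later in this article.

```sql
-- Sketch: a schema-mode export submitted through DBMS_DATAPUMP directly.
-- The names HR, hr_api.dmp, and DPUMP_DIR1 are assumptions for illustration.
DECLARE
  h NUMBER;
BEGIN
  -- Open a Data Pump export job in SCHEMA mode; h is the job handle.
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- Attach a dump file in the server-side directory object.
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'hr_api.dmp',
                         directory => 'DPUMP_DIR1');
  -- Restrict the job to the HR schema.
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                value => 'IN (''HR'')');
  -- Start the job and detach; the job continues running on the server.
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/
```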

Data Pump-related roles

By default, a user can export/import data within their own schema, but to export/import data in other schemas the user must be granted the following two roles:

DATAPUMP_EXP_FULL_DATABASE

DATAPUMP_IMP_FULL_DATABASE

Of course, the SYS and SYSTEM accounts and the DBA role have these two roles by default.
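Granting the roles is a single statement per role; the user SCOTT below is an assumed example:

```sql
-- Sketch: grant the full export/import roles to an assumed user SCOTT
GRANT datapump_exp_full_database TO scott;
GRANT datapump_imp_full_database TO scott;
```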

Data Pump Data Movement Methods

Data file copy: the fastest method. The dump file contains only metadata; the data files themselves are copied at the operating-system level. The related parameters are TRANSPORT_TABLESPACES and TRANSPORTABLE=ALWAYS.

Direct path load: the fastest method apart from data file copy. It is used unless it is unavailable (for example, when a table contains a BFILE column).

External tables: used when methods 1 and 2 above are not available.

Conventional path load: used only when none of the above methods is available; its performance is the poorest.
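As a sketch of the first (fastest) method, a transportable-tablespace export writes only metadata to the dump file; the tablespace's data files are then copied at the OS level. The tablespace name TBS1 and the file names are assumptions:

```shell
# Sketch: transportable-tablespace export (metadata only).
# TBS1, tts.dmp, and dpump_dir1 are illustrative assumptions;
# after this, copy the data files of TBS1 at the OS level.
expdp system DIRECTORY=dpump_dir1 DUMPFILE=tts.dmp LOGFILE=tts.log \
      TRANSPORT_TABLESPACES=tbs1
```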

Data Pump Job

When EXPDP/IMPDP runs, a job performs the export or import. A Data Pump job consists of the following three parts:

Master process: controls and coordinates the entire job.

Master table: records metadata about the database objects in the dump file. It is written to the dump file at the end of an EXPDP run and read at the start of an IMPDP run, so the job knows what the dump file contains.

Worker processes: perform the actual export/import work. Multiple worker processes are created automatically to run in parallel, up to the number defined by the PARALLEL parameter.
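A parallel export makes these pieces visible. In this sketch (credentials, schema, and file names are assumptions), PARALLEL=4 allows up to four worker processes, and the %U substitution variable in DUMPFILE lets each worker write its own file:

```shell
# Sketch: parallel schema export with an explicit job name.
# hr/hr, dpump_dir1, and the file names are illustrative assumptions.
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr_%U.dmp LOGFILE=hr_exp.log \
      SCHEMAS=hr PARALLEL=4 JOB_NAME=hr_exp_job
```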

Monitoring Job Status

You can follow a Data Pump job's progress in the screen output and in the log file, and you can also query the views DBA_DATAPUMP_JOBS, USER_DATAPUMP_JOBS, or DBA_DATAPUMP_SESSIONS in the database.

For a long-running job, the dynamic view V$SESSION_LONGOPS shows how much of the job has completed and an estimate of how long it will take to finish. The relevant columns are:

USERNAME - job owner  
OPNAME - job name  
TARGET_DESC - job operation  
SOFAR - megabytes transferred thus far during the job  
TOTALWORK - estimated number of megabytes in the job  
UNITS - megabytes (MB)  
MESSAGE - a formatted status message of the form:  
'job_name: operation_name : nnn out of mmm MB done'
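The columns above can be combined into a simple progress query; the OPNAME filter is an assumption that matches Data Pump export job names:

```sql
-- Sketch: progress of running Data Pump jobs via V$SESSION_LONGOPS,
-- using the columns described above. The LIKE filter is an assumption.
SELECT username, opname, sofar, totalwork, units, message
  FROM v$session_longops
 WHERE opname LIKE '%EXPORT%'
   AND sofar <> totalwork;
```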

Create directory

Unlike exp/imp, Data Pump cannot run purely on the client: it executes on the server, and all the files it generates are placed on the server side. You therefore need to create a directory object in Oracle that points to a server-side path. Here is an example:

SQL> CREATE DIRECTORY dpump_dir1 AS '/usr/apps/datafiles';

After creating the directory object, grant read and write permission on it to the user who will execute Data Pump, as follows:

SQL> GRANT READ, WRITE ON DIRECTORY dpump_dir1 TO hr;
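With the directory object and grants in place, a basic round trip looks like this. The HR credentials and file names are assumptions; the second command would typically be run against the target database:

```shell
# Sketch: export the HR schema, then import it elsewhere.
# hr/hr, dpump_dir1, and hr.dmp are illustrative assumptions.
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp LOGFILE=hr_exp.log SCHEMAS=hr

impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp LOGFILE=hr_imp.log SCHEMAS=hr
```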

