Oracle Learning (21): Importing and exporting using Data Pump

Data Pump enables fast data migration between test, development, and production environments, as well as to advanced replication or hot standby databases. It can also be used for full or partial logical backups of a database, and for cross-platform transportable tablespace operations.

The tools that correspond to Data Pump technology are Data Pump Export and Data Pump Import, that is, EXPDP and IMPDP. They are similar in function to the older exp and imp utilities, but Data Pump is considerably faster. In addition, Data Pump supports restarting from a checkpoint: a job that was interrupted can be resumed from the point where it stopped.
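As a rough sketch of what restarting from a checkpoint looks like (the job name MY_EXPORT_JOB is hypothetical, and scott/tiger matches the sample credentials used later in this article), an interrupted export job can be reattached and resumed from the Command Prompt:

expdp scott/tiger attach=my_export_job

Export> START_JOB

Export> CONTINUE_CLIENT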

Both EXPDP and IMPDP rely on a directory object. The following is a brief introduction to creating and deleting a directory; these statements are run in the SQL*Plus window.

(1) Create a directory:

CREATE DIRECTORY directory_name AS 'real_directory';

Sample code:

CREATE DIRECTORY mydir AS 'D:\dir'; -- Create a directory object mydir in Oracle that corresponds to the real directory "D:\dir"
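Note that Oracle does not create the operating-system folder itself, so "D:\dir" must already exist on the database server. In addition, users other than the one who created the directory object need privileges on it before they can use it with EXPDP or IMPDP; a minimal sketch, assuming the export will run as the (hypothetical) user scott:

GRANT READ, WRITE ON DIRECTORY mydir TO scott;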

(2) Delete a directory:

DROP DIRECTORY directory_name;

Sample code:

DROP DIRECTORY mydir;
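To check which directory objects exist and which real directories they point to, you can query the DBA_DIRECTORIES view in SQL*Plus (ALL_DIRECTORIES works for non-DBA accounts):

SELECT directory_name, directory_path FROM dba_directories;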

1. Using EXPDP for export operations

This is similar to working with exp: the backup is performed with the EXPDP command in the Command Prompt window. You can run EXPDP interactively, or you can supply command-line parameters to control the backup. You can view the EXPDP help by entering expdp help=y at the Command Prompt.

Here's what you can see in the Command Prompt window by entering EXPDP help=y:

C:\USERS\ANDY>EXPDP help=y

Export: Release 11.1.0.6.0 - Production on Friday, 01 June, 2012 21:34:35

Copyright (c) 2003, 2007, Oracle. All rights reserved.

The Data Pump Export utility provides a mechanism for transferring data objects between Oracle databases. The utility is invoked with the following command:

Example: EXPDP scott/tiger directory=dmpdir dumpfile=scott.dmp

You can control how the export runs by entering the 'expdp' command followed by various parameters. To specify parameters, you use keywords:

Format: expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)

Example: EXPDP scott/tiger dumpfile=scott.dmp directory=dmpdir Schemas=scott

or TABLES=(T1:P1,T1:P2), if T1 is a partitioned table

USERID must be the first parameter in the command line.

Keyword  Description (Default)

------------------------------------------------------------------------------

ATTACH  Attach to an existing job, e.g. ATTACH [=job name].

COMPRESSION  Reduce the size of the dump file contents, where valid keyword values are: ALL, DATA_ONLY, (METADATA_ONLY) and NONE.

CONTENT  Specifies the data to unload, where valid keyword values are: (ALL), DATA_ONLY and METADATA_ONLY.

DATA_OPTIONS  Data layer flags, where the only valid value is: XML_CLOBS - write XML data type in CLOB format.

DIRECTORY  Directory object to be used for dump files and log files.

DUMPFILE  List of destination dump files (expdat.dmp), e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.

ENCRYPTION  Encrypt part or all of a dump file, where valid keyword values are: ALL, DATA_ONLY, METADATA_ONLY, ENCRYPTED_COLUMNS_ONLY or NONE.

ENCRYPTION_ALGORITHM  Specify how encryption should be done, where valid keyword values are: (AES128), AES192 and AES256.

ENCRYPTION_MODE  Method of generating encryption keys, where valid keyword values are: DUAL, PASSWORD and (TRANSPARENT).

ENCRYPTION_PASSWORD  Password key used for creating encrypted column data.

ESTIMATE  Calculate job estimates, where valid keyword values are: (BLOCKS) and STATISTICS.

ESTIMATE_ONLY  Calculate job estimates without performing the export.

EXCLUDE  Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.

FILESIZE  Specify the size of each dump file in bytes.

FLASHBACK_SCN  SCN used to set the session snapshot back to.

FLASHBACK_TIME  Time used to get the SCN closest to the specified time.

FULL  Export the entire database (N).

HELP  Display help messages (N).

INCLUDE  Include specific object types, e.g. INCLUDE=TABLE_DATA.

JOB_NAME  Name of the export job to create.

LOGFILE  Log file name (export.log).

NETWORK_LINK  Name of the remote database link to the source system.

NOLOGFILE  Do not write a log file (N).

PARALLEL  Change the number of active workers for the current job.

PARFILE  Specify the parameter file.

QUERY  Predicate clause used to export a subset of a table.

REMAP_DATA  Specify a data conversion function,
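To tie the pieces together, a schema-level export using the mydir directory object created above might look like the following; the schema name scott and the file names are illustrative, following the samples in the help output:

expdp scott/tiger directory=mydir dumpfile=scott_schema.dmp logfile=scott_schema.log schemas=scott

The dump file scott_schema.dmp and the log file scott_schema.log are then written to D:\dir, the real directory behind mydir.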
