EXPDP/IMPDP Parameters

Microsoft Windows XP [Version 5.1.2600]
(C) Copyright 1985-2001 Microsoft Corp.

C:\Documents and Settings\nonao>expdp -help

Export: Release 10.2.0.1.0 - Production on Saturday, April 2007 11:06:58

Copyright (c) 2003, 2005, Oracle. All rights reserved.

The Data Pump Export utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how the export runs by entering the 'expdp' command followed
by various parameters. To specify parameters, you use keywords:

Format: expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
or TABLES=(T1:P1, T1:P2), if T1 is a partitioned table.
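Longer keyword lists are usually easier to manage in a parameter file passed via PARFILE. A minimal sketch of the partition-level export above (the directory object dmpdir, table T1, and partitions P1/P2 are assumed to exist in your database):

```shell
# exp_parts.par - hypothetical Data Pump Export parameter file
DIRECTORY=dmpdir
DUMPFILE=t1_parts.dmp
LOGFILE=t1_parts.log
TABLES=(T1:P1, T1:P2)

# invoked from the command line as:
#   expdp scott/tiger PARFILE=exp_parts.par
```

Keeping the keywords in a file avoids shell quoting problems with parentheses and colons on the command line.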

USERID must be the first parameter in the command line.

Keyword description (default)
------------------------------------------------------------------------------
ATTACH                Attach to an existing job, e.g. ATTACH[=job name].
COMPRESSION           Reduce size of dump file contents where valid
                      keyword values are: (METADATA_ONLY) and NONE.
CONTENT               Specifies data to unload where the valid keywords are:
                      (ALL), DATA_ONLY and METADATA_ONLY.
DIRECTORY             Directory object to be used for dump files and log files.
DUMPFILE              List of destination dump files (expdat.dmp),
                      e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
ESTIMATE              Calculate job estimates where the valid keywords are:
                      (BLOCKS) and STATISTICS.
ESTIMATE_ONLY         Calculate job estimates without performing the export.
EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FILESIZE              Specify the size of each dump file in units of bytes.
FLASHBACK_SCN         SCN used to set session snapshot back to.
FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
FULL                  Export entire database (N).
HELP                  Display help messages (N).
INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              Name of export job to create.
LOGFILE               Log file name (export.log).
NETWORK_LINK          Name of remote database link to the source system.
NOLOGFILE             Do not write log file (N).
PARALLEL              Change the number of active workers for current job.
PARFILE               Specify parameter file.
QUERY                 Predicate clause used to export a subset of a table.
SAMPLE                Percentage of data to be exported.
SCHEMAS               List of schemas to export (login schema).
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
TABLES                Identifies a list of tables to export - one schema only.
TABLESPACES           Identifies a list of tablespaces to export.
TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
VERSION               Version of objects to export where valid keywords are:
                      (COMPATIBLE), LATEST or any valid database version.
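Several of these keywords are commonly combined. A hedged sketch of a parallel schema export (the directory object dpump_dir is an assumption; it must be created with CREATE DIRECTORY and granted to the exporting user first):

```shell
# Hypothetical schema-level export: two workers writing 2 GB file pieces.
# The %U substitution generates scott01.dmp, scott02.dmp, ... so each
# parallel worker can write to its own file.
expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=scott%U.dmp LOGFILE=scott.log \
  SCHEMAS=scott PARALLEL=2 FILESIZE=2G EXCLUDE=STATISTICS
```

When PARALLEL is greater than 1, supplying at least as many dump files (or a %U template) as workers avoids workers idling while they wait for a file.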

The following commands are valid in interactive mode.
Note: abbreviations are allowed.

Command description
------------------------------------------------------------------------------
ADD_FILE              Add dump file to dump file set.
CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT           Quit client session and leave job running.
FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands.
HELP                  Summarize interactive commands.
KILL_JOB              Detach and delete job.
PARALLEL              Change the number of active workers for current job.
                      PARALLEL=<number of workers>.
START_JOB             Start/resume current job.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
                      STATUS[=interval]
STOP_JOB              Orderly shutdown of job execution and exits the client.
                      STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                      Data Pump job.
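A sketch of how interactive mode is typically reached: attach to a running job by name, then issue the commands above at the Export> prompt (the job name SYS_EXPORT_SCHEMA_01 is an assumption; Data Pump generates it unless JOB_NAME was given):

```shell
# Re-attach to a running export job from a new terminal:
expdp scott/tiger ATTACH=SYS_EXPORT_SCHEMA_01

# Then, at the Export> prompt:
#   Export> STATUS=30       # report job status every 30 seconds
#   Export> PARALLEL=4      # raise the worker count on the fly
#   Export> STOP_JOB        # orderly shutdown; resume later with START_JOB
```

Because EXIT_CLIENT and STOP_JOB leave the job definition in the database, the same ATTACH command can be used to resume a stopped job later.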

Microsoft Windows XP [version 5.1.2600]
(C) Copyright 1985-2001 Microsoft Corp.

C:\Documents and Settings\nonao>impdp -help

Import: Release 10.2.0.1.0 - Production on Saturday, October 28, 2007 11:12:21

Copyright (c) 2003, 2005, Oracle. All rights reserved.

The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how the import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:

Format: impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

USERID must be the first parameter in the command line.

Keyword description (default)
------------------------------------------------------------------------------
ATTACH                Attach to an existing job, e.g. ATTACH[=job name].
CONTENT               Specifies data to load where the valid keywords are:
                      (ALL), DATA_ONLY and METADATA_ONLY.
DIRECTORY             Directory object to be used for dump, log and SQL files.
DUMPFILE              List of dump files to import from (expdat.dmp),
                      e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD   Password key for accessing encrypted column data.
                      This parameter is not valid for network import jobs.
ESTIMATE              Calculate job estimates where the valid keywords are:
                      (BLOCKS) and STATISTICS.
EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FLASHBACK_SCN         SCN used to set session snapshot back to.
FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
FULL                  Import everything from source (Y).
HELP                  Display help messages (N).
INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              Name of import job to create.
LOGFILE               Log file name (import.log).
NETWORK_LINK          Name of remote database link to the source system.
NOLOGFILE             Do not write log file.
PARALLEL              Change the number of active workers for current job.
PARFILE               Specify parameter file.
QUERY                 Predicate clause used to import a subset of a table.
REMAP_DATAFILE        Redefine datafile references in all DDL statements.
REMAP_SCHEMA          Objects from one schema are loaded into another schema.
REMAP_TABLESPACE      Tablespace objects are remapped to another tablespace.
REUSE_DATAFILES       Tablespace will be initialized if it already exists (N).
SCHEMAS               List of schemas to import.
SKIP_UNUSABLE_INDEXES Skip indexes that were set to the Index Unusable state.
SQLFILE               Write all the SQL DDL to a specified file.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
STREAMS_CONFIGURATION Enable the loading of Streams metadata.
TABLE_EXISTS_ACTION   Action to take if imported object already exists.
                      Valid keywords: (SKIP), APPEND, REPLACE and TRUNCATE.
TABLES                Identifies a list of tables to import.
TABLESPACES           Identifies a list of tablespaces to import.
TRANSFORM             Metadata transform to apply to applicable objects.
                      Valid transform keywords: SEGMENT_ATTRIBUTES, STORAGE,
                      OID and PCTSPACE.
TRANSPORT_DATAFILES   List of datafiles to be imported by transportable mode.
TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be loaded.
                      Only valid in NETWORK_LINK mode import operations.
VERSION               Version of objects to export where valid keywords are:
                      (COMPATIBLE), LATEST or any valid database version.
                      Only valid for NETWORK_LINK and SQLFILE.
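The REMAP_* and TABLE_EXISTS_ACTION keywords are typically combined when loading one user's dump into a different schema. A hedged sketch (the schema blake, tablespace blake_ts, and directory object dpump_dir are assumptions and must already exist):

```shell
# Hypothetical import of scott's objects into the blake schema, moving
# segments from the USERS tablespace and appending rows to any tables
# that already exist in the target.
impdp system/manager DIRECTORY=dpump_dir DUMPFILE=scott.dmp LOGFILE=imp_scott.log \
  REMAP_SCHEMA=scott:blake REMAP_TABLESPACE=users:blake_ts \
  TABLE_EXISTS_ACTION=APPEND
```

Running the same command with SQLFILE=imp_scott.sql instead of performing the load writes the DDL that would be executed, which is a low-risk way to preview the remapping.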

The following commands are valid in interactive mode.
Note: abbreviations are allowed.

Command description (default)
------------------------------------------------------------------------------
CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT           Quit client session and leave job running.
HELP                  Summarize interactive commands.
KILL_JOB              Detach and delete job.
PARALLEL              Change the number of active workers for current job.
                      PARALLEL=<number of workers>.
START_JOB             Start/resume current job.
                      START_JOB=SKIP_CURRENT will start the job after skipping
                      any action which was in progress when the job was stopped.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
                      STATUS[=interval]
STOP_JOB              Orderly shutdown of job execution and exits the client.
                      STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                      Data Pump job.
