Usage of Oracle expdp commands

1. DIRECTORY
Specifies the directory object where the dump and log files are written.
DIRECTORY=directory_object
directory_object is the name of a DIRECTORY object. Note that a DIRECTORY object is a database object created with a CREATE DIRECTORY statement, not an OS directory path.
expdp scott/tiger DIRECTORY=dmp DUMPFILE=a.dmp

Create or replace directory dmp as 'd:/dmp';
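Note that this statement does not create or validate the OS path; d:/dmp must exist on the database server host when the export runs. To check which directory objects exist and where they point (a minimal sketch; dmp and d:/dmp come from the example above):
-- List directory objects visible to the current user
select directory_name, directory_path from all_directories;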

2. CONTENT
This option is used to specify the content to be exported. The default value is ALL.
CONTENT = {ALL | DATA_ONLY | METADATA_ONLY}
When CONTENT is set to ALL, both the definitions of the exported objects and all of their data are exported. If the value is DATA_ONLY, only the object data is exported; if it is METADATA_ONLY, only the object definitions are exported.

expdp zftang/zftang@zftang directory=dmp dumpfile=test.dmp content=metadata_only
---------- Export object definitions only
expdp zftang/zftang@zftang directory=dmp dumpfile=test.dmp content=data_only
---------- Export data only
3. DUMPFILE
Specifies the name of the dump file. The default name is expdat.dmp.
DUMPFILE=[directory_object:]file_name [,…]
directory_object specifies a DIRECTORY object name, and file_name specifies the dump file name. Note that if directory_object is omitted, the export tool automatically uses the DIRECTORY object given by the DIRECTORY parameter.
expdp zftang/zftang@zftang directory=dmp dumpfile=test1.dmp
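The optional directory_object: prefix lets individual dump files go to a different directory object than the one the DIRECTORY parameter (and thus the log file) uses. A minimal sketch, assuming a second directory object named dmp2 has already been created and granted (dmp2 is hypothetical):
-- Dump files go to dmp2; the log file still goes to dmp
expdp zftang/zftang@zftang directory=dmp dumpfile=dmp2:test1.dmp,dmp2:test2.dmp parallel=2
With parallel=2, the two worker processes can write to the two listed files concurrently.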

Steps for Data Pump tool export:
1. Create a DIRECTORY object
Create directory dir_dp as 'd:/oracle/dir_dp';
2. Authorization
Grant read, write on directory dir_dp to zftang;
-- View directories and permissions
SELECT privilege, directory_name, directory_path FROM user_tab_privs t, all_directories d
WHERE t.table_name(+) = d.directory_name ORDER BY 2, 1;
3. Execute Export
expdp zftang/zftang@fgisdb schemas=zftang directory=dir_dp dumpfile=expdp_test1.dmp logfile=expdp_test1.log
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1
With the Partitioning, OLAP and Data Mining options
Starting "ZFTANG"."SYS_EXPORT_SCHEMA_01": zftang/********@fgisdb schemas=zftang directory=dir_dp dumpfile=expdp_test1.dmp logfile=expdp_test1.log
Note:
1. The directory=dir_dp parameter must come before the other parameters; if it is placed last, the command fails with:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39087: directory name DATA_PUMP_DIR is invalid

2. During the export, Data Pump creates and uses an object named SYS_EXPORT_SCHEMA_01. This is the name of the job used by the Data Pump export; if you do not specify a job name when running the command, a default name like this is generated. To choose the job name yourself, add the JOB_NAME parameter:
expdp zftang/zftang@fgisdb schemas=zftang directory=dir_dp dumpfile=expdp_test1.dmp logfile=expdp_test1.log job_name=my_job1
3. Do not put a semicolon at the end of the export command. Otherwise, the job table created by the statement above is named 'my_job1;' instead of my_job1, and the command expdp zftang/zftang attach=zftang.my_job1 keeps reporting that the job table cannot be found.
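Once the job has a known name, it can be monitored from the data dictionary and reattached to; a minimal sketch using the my_job1 name from the example above:
-- See running and stopped Data Pump jobs
select owner_name, job_name, state from dba_datapump_jobs;
-- Reattach to the job; at the Export> prompt, interactive commands
-- such as STATUS, STOP_JOB=IMMEDIATE and START_JOB are available
expdp zftang/zftang attach=zftang.my_job1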

Data Pump export modes:
1. Export in table mode:
expdp zftang/zftang@fgisdb tables=zftang.b$i_exch_info,zftang.b$export dumpfile=expdp_test2.dmp logfile=expdp_test2.log directory=dir_dp job_name=my_job
2. Export based on a query condition (the QUERY value is quoting-sensitive on the command line; see the parameter-file sketch after this list):
expdp zftang/zftang@fgisdb tables=zftang.b$i_exch_info dumpfile=expdp_test3.dmp logfile=expdp_test3.log directory=dir_dp job_name=my_job query='"where rownum<11"'
3. Export by tablespace:
expdp zftang/zftang@fgisdb dumpfile=expdp_tablespace.dmp tablespaces=GCOMM.DBF logfile=expdp_tablespace.log directory=dir_dp job_name=my_job
4. Export by schema:
expdp zftang/zftang DIRECTORY=dir_dp DUMPFILE=schema.dmp SCHEMAS=zftang,gwm
5. Export the entire database:
expdp zftang/zftang@fgisdb dumpfile=full.dmp full=y logfile=full.log directory=dir_dp job_name=my_job
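Because the shell tends to strip or mangle the nested quotes around the QUERY value, it is often easier to put the parameters in a parameter file. A minimal sketch, assuming a client-side file named exp_query.par (the file name is hypothetical):
expdp zftang/zftang@fgisdb parfile=exp_query.par
where exp_query.par contains:
tables=zftang.b$i_exch_info
dumpfile=expdp_test3.dmp
logfile=expdp_test3.log
directory=dir_dp
job_name=my_job
query="where rownum<11"
Inside a parameter file the double quotes survive intact, so no extra shell escaping is needed.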
Impdp import modes:
1. Import by table
Import the table p_street_area from p_street_area.dmp, which was exported by the gwm user with schemas=gwm:
impdp gwm/gwm@fgisdb dumpfile=p_street_area.dmp logfile=imp_p_street_area.log directory=dir_dp tables=p_street_area job_name=my_job
2. Import by user (if the user does not exist, impdp can create it directly)
impdp gwm/gwm@fgisdb schemas=gwm dumpfile=expdp_test.dmp logfile=expdp_test.log directory=dir_dp job_name=my_job
3. Import the data directly over the network, without an intermediate expdp step:
-- Import the table p_street_area from the source database into the target database
impdp gwm/gwm directory=dir_dp NETWORK_LINK=igisdb tables=p_street_area logfile=p_street_area.log job_name=my_job
igisdb is the name of the database link from the target database to the source database, and dir_dp is a directory object on the target database; see the sketch after this list for creating the link.
4. Change the tablespace
Use the remap_tablespace parameter.
-- Export all data under the gwm user
expdp system/orcl directory=data_pump_dir dumpfile=gwm.dmp SCHEMAS=gwm
Note: when the export is run by a privileged user such as system, the dump includes the statements that create the gwm user and its grants; when gwm exports its own data, these are not included.
-- Import everything under the gwm user into the gcomm tablespace (the data originally resided in the gmapdata tablespace).
impdp system/orcl directory=data_pump_dir dumpfile=gwm.dmp remap_tablespace=gmapdata:gcomm
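For the NETWORK_LINK import in mode 3 above, the database link must already exist on the target database. A minimal sketch, run on the target database and assuming the source database is reachable through a TNS alias named srcdb (the alias and the gwm password are illustrative assumptions):
-- igisdb is the link name referenced by NETWORK_LINK above
create database link igisdb connect to gwm identified by gwm using 'srcdb';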
