Where Oracle Database Export Data Pump (EXPDP) Files Are Stored


1. DIRECTORY

Specifies the directory where the dump files and log files are written.
DIRECTORY=directory_object
directory_object is the name of a directory object. Note that a directory object is created with the CREATE DIRECTORY statement; it is not an operating-system directory path.
expdp scott/tiger directory=dmp dumpfile=a.dump

create or replace directory dmp
as 'D:/dmp';

expdp zftang/zftang@zftang directory=dmp dumpfile=test.dmp content=metadata_only

2. CONTENT

This option specifies what to export. The default value is ALL.
CONTENT={ALL | DATA_ONLY | METADATA_ONLY}
With ALL, both the object definitions and all of their data are exported; with DATA_ONLY, only the object data is exported; with METADATA_ONLY, only the object definitions are exported.

expdp zftang/zftang@zftang directory=dmp dumpfile=test.dmp content=metadata_only

---------- Export only the object definitions

expdp zftang/zftang@zftang directory=dmp dumpfile=test.dmp content=data_only

---------- Export only the object data
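For completeness, a sketch of the default behaviour with CONTENT=ALL, which exports both the definitions and the data (the file name all_content.dmp is only illustrative):

expdp zftang/zftang@zftang directory=dmp dumpfile=all_content.dmp content=all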

3. DUMPFILE

Specifies the name of the dump file; the default name is expdat.dmp.
DUMPFILE=[directory_object:]file_name [, ...]
directory_object is the directory object name and file_name is the dump file name. If directory_object is not given for a file, the export tool uses the directory object specified by the DIRECTORY parameter.

expdp zftang/zftang@zftang directory=dmp dumpfile=test1.dmp
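The [directory_object:]file_name form can also be written out explicitly; a minimal sketch using the dmp directory object created above (the file names are only illustrative):

expdp zftang/zftang@zftang dumpfile=dmp:test1.dmp logfile=dmp:test1.log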

 

Data Pump tool export steps:

1. Create the directory object
create directory dir_dp as 'D:/ORACLE/DIR_DP';
2. Grant privileges on it
grant read, write on directory dir_dp to zftang;
-- View directories and privileges
SELECT privilege, directory_name, directory_path
  FROM user_tab_privs t, all_directories d
 WHERE t.table_name(+) = d.directory_name
 ORDER BY 2, 1;
3. Perform the export
expdp zftang/zftang@fgisdb schemas=zftang directory=dir_dp dumpfile=expdp_test1.dmp

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1
With the Partitioning, OLAP and Data Mining options
Starting "ZFTANG"."SYS_EXPORT_SCHEMA_01": zftang/********@fgisdb schemas=zftang directory=dir_dp dumpfile=expdp_test1.dmp logfile=expdp_test1.log
Note:
1. DIRECTORY=DIR_DP must be placed near the front of the command; if it is put last, the export fails with:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39087: directory name DATA_PUMP_DIR is invalid

2. During the export, Data Pump creates and uses a master table named SYS_EXPORT_SCHEMA_01; this is also the name of the export job. If you do not specify a job name on the command line, a default name like this is generated; if you do specify one, the job runs under that name.
To specify the job name, change the command to:
expdp zftang/zftang@fgisdb schemas=zftang directory=dir_dp dumpfile=expdp_test1.dmp job_name=my_job1
3. Do not put a semicolon at the end of the export command; otherwise the job and its master table are created with the name 'MY_JOB1;' rather than MY_JOB1, and a later expdp zftang/zftang attach=zftang.my_job1 keeps reporting that the job table cannot be found.
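To check the master table/job that Data Pump creates, and to re-attach to it later, something like the following can be used (a sketch; the job name my_job1 assumes the export was started with job_name=my_job1):

-- run in SQL*Plus: list Data Pump jobs and their state
SELECT owner_name, job_name, state FROM dba_datapump_jobs;

and to re-attach to the job from the operating-system prompt:

expdp zftang/zftang attach=zftang.my_job1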

Various modes of Data Pump export:

1. Export by table:
expdp zftang/zftang@fgisdb tables=zftang.b$i_exch_info,zftang.b$i_manhole_info dumpfile=expdp_test2.dmp logfile=expdp_test2.log directory=dir_dp job_name=my_job

2. Export by query condition:
expdp zftang/zftang@fgisdb tables=zftang.b$i_exch_info dumpfile=expdp_test3.dmp logfile=expdp_test3.log directory=dir_dp job_name=my_job query='where rownum<11'
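On Linux/Unix the shell tends to strip or mangle the quotes around the QUERY value, so a parameter file is often easier; a minimal sketch of the same export (the file name exp_query.par is only illustrative). The parameter file would contain:

tables=zftang.b$i_exch_info
dumpfile=expdp_test3.dmp
logfile=expdp_test3.log
directory=dir_dp
job_name=my_job
query="where rownum<11"

and the export is then run as:

expdp zftang/zftang@fgisdb parfile=exp_query.par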

3. Export by tablespace:
expdp zftang/zftang@fgisdb dumpfile=expdp_tablespace.dmp tablespaces=GCOMM.DBF logfile=expdp_tablespace.log directory=dir_dp job_name=my_job

4. Export by schema:
expdp zftang/zftang directory=dir_dp dumpfile=schema.dmp schemas=zftang,gwm

5. Export the entire database:
expdp zftang/zftang@fgisdb dumpfile=full.dmp full=y logfile=full.log directory=dir_dp job_name=my_job

IMPDP import modes:

1. Import by table
Import the P_STREET_AREA table from the file p_street_area.dmp, which was exported by the GWM user with schemas=gwm:
impdp gwm/gwm@fgisdb dumpfile=p_street_area.dmp logfile=imp_p_street_area.log directory=dir_dp tables=p_street_area job_name=my_job

2. Import by user (the user information can be imported directly, i.e. the import can recreate the user if it does not yet exist)
impdp gwm/gwm@fgisdb schemas=gwm dumpfile=expdp_test.dmp logfile=expdp_test.log directory=dir_dp job_name=my_job

3. Import directly across a network link, without first generating a dump file with expdp:
-- Import the table P_STREET_AREA from the source database into the destination database
impdp gwm/gwm directory=dir_dp network_link=igisdb tables=p_street_area logfile=p_street_area.log job_name=my_job
igisdb is the name of the database link between the destination database and the source database; dir_dp is a directory object on the destination database.
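NETWORK_LINK relies on a database link created in the destination database that points at the source; a minimal sketch of creating such a link (the credentials and TNS alias are only illustrative):

-- run in the destination database as the importing user
CREATE DATABASE LINK igisdb
  CONNECT TO gwm IDENTIFIED BY gwm
  USING 'source_db_tns_alias';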

4. Remap tablespace
Use the REMAP_TABLESPACE parameter.
-- Export all data under the GWM user
expdp system/orcl directory=data_pump_dir dumpfile=gwm.dmp schemas=gwm
Note: user data exported by the SYS user includes the user-creation and grant statements; data exported with the user's own account does not include them.
-- Import all data under the GWM user into tablespace GCOMM (it was originally in tablespace GMAPDATA)
impdp system/orcl directory=data_pump_dir dumpfile=gwm.dmp remap_tablespace=gmapdata:gcomm
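After the import, a quick check that the segments really landed in GCOMM (a sketch, run as a DBA):

SELECT segment_name, tablespace_name
  FROM dba_segments
 WHERE owner = 'GWM';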







Data Pump is a server-side tool: the exported files are written on the server that hosts the database, and their location is controlled through directory objects. The directory can be determined at four levels, listed here from highest to lowest priority:

1. A directory specified individually for each file (illustrated in the sketch after this list)
2. The DIRECTORY parameter given on the expdp command line
3. The directory named by the user-defined environment variable DATA_PUMP_DIR
4. The default directory object DATA_PUMP_DIR

For Oracle 11g R2 a further option was introduced, giving a fifth level:
5. The DATA_PUMP_DIR_schema_name directory object
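As an illustration of level 1 above, a per-file directory prefix overrides the DIRECTORY parameter for that file; a minimal sketch (the directory objects dir_a and dir_b are only illustrative):

expdp scott/tiger directory=dir_a dumpfile=dir_b:emp.dmp logfile=emp.log tables=emp

Here the dump file is written to the path of dir_b, while the log file falls back to dir_a from the DIRECTORY parameter.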


First, a test of the default directory object DATA_PUMP_DIR

SQL> desc dba_directories
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 OWNER                                     NOT NULL VARCHAR2(30)
 DIRECTORY_NAME                            NOT NULL VARCHAR2(30)
 DIRECTORY_PATH                                     VARCHAR2(4000)

SQL> set linesize pagesize 100
SQL> col owner for a5
SQL> col directory_name for a22
SQL> col directory_path for a80


SQL> select * from dba_directories;

OWNER DIRECTORY_NAME         DIRECTORY_PATH
----- ---------------------- --------------------------------------------------------------------
SYS   SUBDIR                 /u01/app/oracle/product/11.2.0/db/demo/schema/order_entry//2002/sep
SYS   SS_OE_XMLDIR           /u01/app/oracle/product/11.2.0/db/demo/schema/order_entry/
SYS   LOG_FILE_DIR           /u01/app/oracle/product/11.2.0/db/demo/schema/log/
SYS   MEDIA_DIR              /u01/app/oracle/product/11.2.0/db/demo/schema/product_media/
SYS   XMLDIR                 /u01/app/oracle/product/11.2.0/db/rdbms/xml
SYS   DATA_FILE_DIR          /u01/app/oracle/product/11.2.0/db/demo/schema/sales_history/
SYS   DATA_PUMP_DIR          /u01/app/oracle/admin/tj01/dpdump/
SYS   ORACLE_OCM_CONFIG_DIR  /u01/app/oracle/product/11.2.0/db/ccr/state

The query shows that all of these directory objects are owned by SYS, regardless of which user created them, and that the directory object DATA_PUMP_DIR is created in the database ahead of time. If you do not specify a directory parameter when exporting with expdp, Oracle uses this database default directory DATA_PUMP_DIR, but to use it the user needs the EXP_FULL_DATABASE role.
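Before the test, a quick way to see which users already hold that role (a sketch, run as a DBA):

SELECT grantee, granted_role
  FROM dba_role_privs
 WHERE granted_role = 'EXP_FULL_DATABASE';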

SQL> conn scott/tiger
Connected.

SQL> select * from tab;

TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
BONUS                          TABLE
DEPT                           TABLE
EMP                            TABLE
SALGRADE                       TABLE


SQL> create table demo as select empno, ename, sal, deptno from emp;
Table created.


SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options

Because SCOTT does not have the EXP_FULL_DATABASE role, the export fails with an error; you can either grant the role or run the export as a privileged user.

[oracle@asm11g ~]$ expdp scott/tiger dumpfile=emp.dmp tables=emp
Export: Release 11.2.0.3.0 - Production on Fri Nov 16 13:48:19 2012
Copyright (c) 1982, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39145: directory object parameter must be specified and non-null


[oracle@asm11g ~]$ sqlplus / as sysdba
SQL*Plus: Release 11.2.0.3.0 Production on Fri Nov 16 13:58:14 2012
Copyright (c) 1982, Oracle. All rights reserved.
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options

SQL> grant exp_full_database to scott;
Grant succeeded.

SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options

After the grant, the export succeeds, and the dump file and log are written to the directory specified by the DATA_PUMP_DIR object.
[oracle@asm11g ~]$ expdp scott/tiger dumpfile=emp.dmp tables=emp

Export: Release 11.2.0.3.0 - Production on Fri Nov 16 13:58:52 2012
Copyright (c) 1982, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Starting "SCOTT"."SYS_EXPORT_TABLE_01": scott/******** dumpfile=emp.dmp tables=emp
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "SCOTT"."EMP"                               8.562 KB      rows
Master table "SCOTT"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SYS_EXPORT_TABLE_01 is:
  /u01/app/oracle/admin/tj01/dpdump/emp.dmp
Job "SCOTT"."SYS_EXPORT_TABLE_01" successfully completed at 13:59:05


In Oracle 11g we can also define a directory object named DATA_PUMP_DIR_schema_name. If such a directory object exists and the user has been granted permission on it, an export that does not specify a directory parameter will write its files to this directory.
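A minimal sketch of what defining such a per-schema directory could look like for SCOTT (the object name follows the DATA_PUMP_DIR_schema_name convention; the path /home/oracle/Sdir is an assumption based on the Sdir directory created in the test below):

-- run as a DBA; the object name and path here are assumptions, not taken from the original test
CREATE DIRECTORY data_pump_dir_scott AS '/home/oracle/Sdir';
GRANT READ, WRITE ON DIRECTORY data_pump_dir_scott TO scott;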

[oracle@asm11g ~]$ mkdir Sdir
[oracle@asm11g ~]$ sqlplus / as sysdba
SQL*Plus: Release 11.2.0.3.0 Production on Fri Nov 16 14:13:06 2012
Copyright (c) 1982, Oracle. All rights reserved.

Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options

 
