Oracle 10g provides a new import and export tool: Data Pump. Oracle officially describes it as follows: "Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another." The phrase "very high-speed" is the highlight.
First, the main features of Data Pump (including, but not limited to):
1. Parallel processing of import and export tasks.
2. Pausing and restarting import and export tasks.
3. Exporting or importing objects from a remote database through a database link.
4. Automatically changing an object's owner, datafile, or tablespace during import, through the remap_schema, remap_datafile, and remap_tablespace parameters.
5. Very fine-grained object control during import/export. With the include and exclude parameters, you can specify exactly which objects are included or excluded.
Notes:
1. What is a directory object?
The directory object is a new feature of Oracle 10g. It is a pointer to a path in the operating system. Each directory carries read and write privileges, which can be granted to a specified user or role with the GRANT command. Users with the read and write privileges can read and write files under the operating system path that the directory object points to.
2. Unless the network_link parameter is used, the files generated by expdp are written on the server, at the location specified by the directory object.
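A directory object can be created and granted like this; a minimal SQL sketch, assuming a hypothetical path /u01/dumpdir and user scott:

```sql
-- Run as a privileged user (e.g. SYSTEM); path and grantee are placeholders.
CREATE DIRECTORY dump_dir AS '/u01/dumpdir';
GRANT READ, WRITE ON DIRECTORY dump_dir TO scott;
```

Once granted, scott can name dump_dir in the directory parameter of expdp/impdp.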
How to invoke
1. Command-line mode
The simplest way to invoke it, but the number of parameters you can write is limited, so the parameter file method is recommended.
2. Parameter file mode
The most common way. You usually write a parameter file first, specifying the various parameters required for the export, and then invoke it as follows:
expdp user/pwd parfile=xxx.par
Here xxx.par is the parameter file we edited. Note that other parameters can follow on the command line, even ones already specified in the parameter file. If a parameter appended on the command line duplicates one in the parameter file, the parameter that appears last wins. For example, with expdp user/pwd parfile=xxx.par logfile=a.log, if logfile is also specified in the parameter file, the logfile on the command line prevails; with expdp user/pwd logfile=a.log parfile=xxx.par, the value in the parameter file takes precedence, because parfile=xxx.par is written later on the command line.
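As a minimal sketch of parameter-file mode (directory, schema, and file names here are placeholders):

```
DIRECTORY=dump_dir
DUMPFILE=expdp_scott.dmp
LOGFILE=expdp_scott.log
SCHEMAS=scott
```

Saved as expdp_scott.par, it would be invoked as expdp scott/tiger parfile=expdp_scott.par.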
3. Interactive mode
Data Pump import and export tasks support stop, restart, and other state operations. If a user is running an import or export task and interrupts it halfway with Ctrl+C (or it is interrupted for some other reason), the task is not canceled but moved to the background. You can reconnect to the interrupted task by running the expdp/impdp command again with the attach parameter, and then choose a follow-up action. That is how interactive mode works.
Note: what value does the attach parameter take? Every import or export prints a line like the following at the start: Starting "BAM"."SYS_EXPORT_SCHEMA_01": bam/******** parfile=expdp_tbs.par. Here SYS_EXPORT_SCHEMA_01 is the value to pass to attach.
-bash-3.00$ expdp bam/bam parfile=expdp_tbs.par
Export: Release 10.2.0.4.0 - 64bit Production on Friday, August, 2010 16:35:18
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "BAM"."SYS_EXPORT_SCHEMA_01": bam/******** parfile=expdp_tbs.par
To use interactive mode, you can then run, for example, expdp bam/bam attach=sys_export_schema_01 to enter it.
Operating modes
1. Full database mode
Import or export the entire database. Corresponds to the full parameter of the impdp/expdp command. Only users with the DBA role or the exp_full_database and imp_full_database privileges can run it.
2. Schema mode
Export or import the objects owned by a schema. Corresponds to the schemas parameter of the impdp/expdp command; this is the default operating mode. With the DBA role or the exp_full_database and imp_full_database privileges, you can export or import objects from multiple schemas.
3. Table mode
Exports the specified tables or table partitions (if any) together with the objects that depend on those tables (such as their indexes and constraints, provided they are in the same schema, or the executing user has the appropriate privileges). Corresponds to the tables parameter of the impdp/expdp command.
4. Tablespace mode
Exports the contents of the specified tablespaces. Corresponds to the tablespaces parameter of impdp/expdp; it can be seen as a complement to table mode and schema mode.
5. Transportable tablespace mode
Corresponds to the transport_tablespaces parameter of impdp/expdp. The most significant difference between this mode and the previous ones is that the generated dump file contains no actual row data, only the metadata of the related objects (that is, the object definitions, which you can think of as the CREATE statements for the tables); the row data stays in the tablespace datafiles, and both the metadata and the datafiles need to be copied to the destination server.
This kind of export is very efficient; the time overhead goes mainly to the I/O of copying the datafiles. To perform a transportable tablespace export with expdp, the user must have the exp_full_database or DBA role; to import in transportable tablespace mode, the user must have the imp_full_database or DBA role.
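Put end to end, the flow described above might look like the following sketch (tablespace, file, and path names are placeholders, and the tablespace set must be self-contained):

```
SQL> ALTER TABLESPACE users_ts READ ONLY;          -- on the source

$ expdp system/manager directory=dump_dir dumpfile=tts.dmp \
    transport_tablespaces=users_ts

-- copy tts.dmp and the datafile(s) of users_ts to the destination, then:

$ impdp system/manager directory=dump_dir dumpfile=tts.dmp \
    transport_datafiles='/u01/oradata/users01.dbf'

SQL> ALTER TABLESPACE users_ts READ WRITE;         -- on both sides when done
```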
Filtering data
Filtering data relies mainly on the query and sample parameters. The sample parameter applies mainly to the expdp export.
1. query
Similar to the query option of the exp command, but in expdp the parameter is enhanced and the control is finer-grained. query in expdp likewise specifies a WHERE-style clause to restrict the records. The syntax is as follows:
query=[schema.][table_name:]query_clause
By default, if you do not specify schema.table_name, the query_clause applies to all exported tables. You can also specify a different query_clause per table. For example, to export all records with id<5 from table A and all records with name='A' from table B, query would take the following form:
query=A:"where id<5",B:"where name='A'"
If no schema or table name is given before the WHERE condition, it applies by default to all tables currently being exported, for example: query=where id<5
Note: it is recommended to put the query parameter in the parameter file, to avoid the hassle of escaping characters.
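For example, the parameter-file version of the query example above might look like this (directory and file names are placeholders):

```
DIRECTORY=dump_dir
DUMPFILE=expdp_query.dmp
LOGFILE=expdp_query.log
TABLES=A,B
QUERY=A:"where id<5",B:"where name='A'"
```

Inside the parameter file, the quotes need no shell escaping.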
2. sample
This parameter specifies the percentage of data to export; the allowed range is 0.000001 to 99.999999. The syntax is as follows:
sample=[[schema_name.]table_name:]sample_percent
When this parameter is specified, expdp automatically limits the number of records exported. For example, to export about 50% of the records in table A, set sample as follows:
sample=A:50
Note: the value given in sample_percent is only a reference; expdp produces an approximation based on the data.
Filtering objects
Filtering objects relies mainly on the include and exclude parameters. The two parameters work in exactly opposite ways; in both you can specify any object type you know (for example package, procedure, table, and so on) or object names (wildcards are supported).
1. exclude
Specifies the object types or object names not to include. When this parameter is given, none of the objects of the specified types are imported or exported. If an excluded object has dependent objects, those are not imported or exported either. For example, if the exclude parameter specifies that table objects are not to be exported, then not only the specified tables but also the indexes, check constraints, and so on associated with those tables are skipped. Note: it is recommended to put the exclude parameter in the parameter file, to avoid the hassle of escaping characters.
2. include
The opposite of exclude: specifies the object types or object names to include.
Note: because the two parameters work in exactly opposite ways, they cannot be used at the same time in one import or export command; otherwise Oracle cannot tell what you want to do.
exclude/include parameter usage:
exclude=[object_type]:[name_clause],[object_type]:[name_clause]  -- exclude specific objects
include=[object_type]:[name_clause],[object_type]:[name_clause]  -- include specific objects
The object_type clause specifies the type of object, such as TABLE, SEQUENCE, VIEW, PROCEDURE, PACKAGE, and so on.
The name_clause clause is a SQL expression that filters on object names. It selects specific objects within the given object type through SQL operators and object names, and wildcards can be used.
When name_clause is omitted and only object_type is given, all objects of that type are included or excluded. Multiple [object_type]:[name_clause] entries are separated by commas.
Examples:
expdp <other_parameters> schemas=scott exclude=SEQUENCE,TABLE:"IN ('EMP','DEPT')"
impdp <other_parameters> schemas=scott include=PACKAGE,FUNCTION,PROCEDURE,TABLE:"='EMP'"
Commonly used filter SQL expressions:
exclude=SEQUENCE,VIEW  -- exclude all sequences and views
exclude=TABLE:"IN ('EMP','DEPT')"  -- exclude the tables EMP and DEPT
exclude=SEQUENCE,VIEW,TABLE:"IN ('EMP','DEPT')"  -- exclude all sequences, all views, and the tables EMP and DEPT
exclude=INDEX:"='INDX_NAME'"  -- exclude the specified index INDX_NAME
include=PROCEDURE:"LIKE 'PROC_U%'"  -- include all stored procedures whose names begin with PROC_U (the _ symbol matches any single character)
include=TABLE:">'E'"  -- include all tables whose names sort after the character E
Other commonly used operators include NOT IN, NOT LIKE, <, != and so on.
The filter expressions can be placed directly in the parameter file, as in the following example.
Parameter file: exp_scott.par
DIRECTORY=dump_scott
DUMPFILE=exp_scott_%U.dmp
LOGFILE=exp_scott.log
SCHEMAS=scott
PARALLEL=2
EXCLUDE=TABLE:"IN ('EMP','DEPT')"
expdp system/manager parfile=exp_scott.par
Handling command-line escape characters
Windows platform:
D:\> expdp system/manager directory=my_dir dumpfile=exp_tab.dmp logfile=exp_tab.log schemas=scott include=TABLE:\"IN ('EMP','DEPT')\"
On Windows, the double quotes around the object list need to be escaped with the escape character \.
UNIX platforms:
When a parfile is not used, all special symbols need to be escaped, including parentheses, double quotes, single quotes, and so on:
% expdp system/manager directory=my_dir dumpfile=exp_tab.dmp logfile=exp_tab.log schemas=scott include=TABLE:\"IN \(\'EMP\',\'DEPT\'\)\"
Common mistakes with exclude/include
Any character that needs escaping will produce an ORA error if it is not escaped, or is escaped incorrectly. Several common ORA errors are listed below.
ORA-39001: invalid argument value
ORA-39071: Value for INCLUDE is badly formed.
ORA-00936: missing expression

ORA-39001: invalid argument value
ORA-39071: Value for EXCLUDE is badly formed.
ORA-00904: "DEPT": invalid identifier

ORA-39001: invalid argument value
ORA-39041: Filter "INCLUDE" either identifies all object types or no object types.

ORA-39001: invalid argument value
ORA-39041: Filter "EXCLUDE" either identifies all object types or no object types.

ORA-39001: invalid argument value
ORA-39038: Object path "USER" is not supported for TABLE jobs.
Advanced filtering
When exporting or importing, we often want only the table structure, or only the data. Fortunately, Data Pump provides this capability too, through the content parameter. The parameter has three possible values:
1) ALL: export/import both object definitions and data; this is the default.
2) DATA_ONLY: export/import data only.
3) METADATA_ONLY: export/import object definitions only.
Note: it is worth pointing out that if you use such filtering when exporting, for example exporting data only, you need to make sure the object definitions already exist at import time, otherwise the data has nothing to belong to. And if the definitions already exist, it is best to specify data_only on import as well; otherwise an ORA-39151 error is raised because the objects already exist.
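A sketch of such a round trip (user, directory, and file names are placeholders):

```
# export only the rows; the table definitions are assumed to exist on the target
$ expdp scott/tiger directory=dump_dir dumpfile=data.dmp content=data_only

# import only the rows into the already-existing tables
$ impdp scott/tiger directory=dump_dir dumpfile=data.dmp content=data_only
```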
Filtering data that already exists
We know that if an imported table object already exists in the target library, and the target has no data integrity constraints (RI) to validate the data, the import can duplicate data. Data Pump provides a new parameter, table_exists_action, which can reduce the creation of duplicate data to some extent. It controls what to do when a table object being imported already exists. The parameter takes the following values:
1) SKIP: skip the table and continue with the next object. This is the default. Note that if the content parameter is set to data_only, SKIP is not valid and the default becomes APPEND.
2) APPEND: add the data to the existing table.
3) TRUNCATE: truncate the current table, then add the records. Use this value with caution, unless you are sure the data currently in the table is really useless; otherwise you risk data loss.
4) REPLACE: drop and recreate the table object, then add the data. Note that REPLACE is not valid when the content parameter is set to data_only.
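For example, to add rows to tables that may already exist (all names here are placeholders):

```
$ impdp scott/tiger directory=dump_dir dumpfile=data.dmp \
    table_exists_action=append
```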
Redefining an object's schema or tablespace
We may also need to move user A's objects to user B, or change the tablespace in which the data lives. Data Pump implements this through the remap_schema and remap_tablespace parameters.
1) remap_schema: redefines the schema an object belongs to.
This parameter plays the role of fromuser+touser in imp, supports multiple schema conversions, and has the following syntax:
remap_schema=source_schema:target_schema[,source_schema:target_schema]
For example, to convert A's objects to user B, and C's objects to user D: remap_schema=A:B,C:D
Note: you cannot specify remap_schema=A:B,A:C in the same impdp command.
2) remap_tablespace: redefines the tablespace an object lives in.
This parameter remaps the tablespace in which imported objects are stored, and supports converting multiple tablespaces at the same time, separated by commas. The syntax is as follows:
remap_tablespace=source_tablespace:target_tablespace[,source_tablespace:target_tablespace]
Note: when using the remap_tablespace parameter, make sure the importing user has read and write access to the target tablespace.
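The two parameters can be combined in one command; a sketch with placeholder schema and tablespace names:

```
$ impdp system/manager directory=dump_dir dumpfile=exp_a.dmp \
    remap_schema=A:B remap_tablespace=TS_A:TS_B
```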
Optimizing import/export efficiency
For large data volumes we have to consider efficiency. Data Pump was designed with efficiency in mind; even the official description says "Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another." That "very high-speed" relies on the parallel parameter.
All optimization work has three possible outcomes: better, no change, or worse. The parallel parameter is no different: specifying a value greater than 1 does not guarantee a performance improvement.
1) parallel for exports
For an export, a dump file can only be written by one process at a time (including its I/O handling). If only one dump file is produced, then even if you specify more parallelism, only one worker actually does the work, and an ORA-39095 error may be triggered. It is therefore recommended to set this parameter to a value less than or equal to the number of dump files generated. So how do you control the number of dump files?
The expdp command provides a filesize parameter to cap the size of a single dump file; the filesize parameter is necessary to make effective use of the parallel parameter.
Example: a user's objects occupy about 4 GB, and the actual exported dump is about 3 GB. We export the user with a degree of parallelism of 4 and cap each file at 500 MB. The syntax is as follows:
$ expdp user/pwd directory=dump_file dumpfile=expdp_20100820_%U.dmp logfile=expdp_20100820.log filesize=500M parallel=4
2) parallel for imports
For an import, using the parallel parameter is much simpler; in my view the import shows the advantage of parallel better. Whatever the parameter is set to, that many tables' contents are loaded into the library at the same time.
Example: a dmp file contains 200 tables, and we import it with a parallelism of 10. The syntax is as follows:
$ impdp user/pwd directory=dump_file dumpfile=expdp_20100820.dmp logfile=impdp_20100820.log parallel=10
How to enter interactive mode
Suppose I am running an import and want to enter interactive mode to check its status. There are two ways in:
I. In the session running the task, press Ctrl+C to drop out of logging mode into interactive mode.
II. From another command line, run the expdp/impdp command with the attach parameter to connect to the import/export task currently running. For example:
expdp bam/bam attach=sys_import_full_01
Note: if the attach parameter is not given, the currently running task is attached by default. However, if no task is currently running and attach is given no value, an ORA-31626 error is reported.
When the command line enters interactive mode, the following prompt is shown:
Export>
Operations in interactive mode
The following operations are supported in interactive mode.
I. View the running status of the job:
Export> status
II. Return to logging (client) mode:
Export> continue_client
III. Increase the degree of parallelism:
Export> parallel=4
Note: during an export, you cannot simply raise the parallel parameter, or you may hit an ORA-39095 error: to export in parallel you must specify multiple export files, since a parallel export means multiple processes working concurrently, writing multiple dmp files out of the database.
Oracle Database Utilities explains it as follows:
"Because each active worker process or I/O server process writes exclusively to one file at a time, an insufficient number of files can have adverse effects. Some of the worker processes will be idle while waiting for files, thereby degrading the overall performance of the job. More importantly, if any member of a cooperating group of parallel I/O server processes cannot obtain a file for output, then the export operation is stopped with an ORA-39095 error. Both situations can be corrected by attaching to the job using the Data Pump Export utility, adding more files using the ADD_FILE command while in interactive mode, and in the case of a stopped job, restarting the job."
"You can supply multiple file_name specifications as a comma-delimited list or in separate DUMPFILE parameter specifications. If no extension is given for the filename, then Export uses the default file extension of .dmp. The filenames can contain a substitution variable (%U), which implies that multiple files may be generated. The substitution variable is expanded in the resulting filenames into a 2-digit, fixed-width, incrementing integer starting at 01 and ending at 99. If a file specification contains two substitution variables, both are incremented at the same time. For example, exp%Uaa%U.dmp would resolve to exp01aa01.dmp, exp02aa02.dmp, and so forth."
IV. Stop the job:
Export> stop_job
V. Start the job:
Export> start_job
VI. Kill the job:
Export> kill_job
VII. Exit interactive mode:
Export> exit_client
VIII. Specify the file size:
Export> filesize=1G
IX. Help:
Export> help
job_name
Specifies the name of the export job; it defaults to SYS_XXX.
job_name=jobname_string
select * from dba_datapump_jobs;  -- view existing jobs
Exporting files to the local machine with network_link
Remote:
IP: 192.168.10.55, OS: RHEL 4.8 64bit, Oracle: 10.2.0.5 64bit
Exported user: monitor
Local:
IP: 172.*, Windows XP, Oracle 10.2.0.4
Local TNS: cheniwo
Locally configured TNS entry for the remote: lobom55
1) Verify the remote is reachable from the local machine
F:\oracle\product\10.2.0\db_2\network\admin\sqlnet.ora
Used TNSNAMES adapter to resolve the alias
Attempting to contact (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)
(HOST = 192.168.10.55) (PORT = 1521))) (CONNECT_DATA = (SERVICE_NAME = lobomb)))
OK (40 msec)
2) Create a database link from local to remote
F:\Documents and Settings\Administrator> sqlplus / as sysdba
SQL*Plus: Release 10.2.0.4.0 - Production on Thu Sep 6 14:25:44 2012
Copyright (c) 1982, Oracle. All Rights Reserved.
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL> create database link dl_monitor connect to monitor identified by monitor using 'lobom55';
Database link created.
SQL> select sysdate from dual@dl_monitor;
SYSDATE
--------------
06-SEP-12
3) Data Pump export
F:\Documents and Settings\Administrator> expdp network_link=dl_monitor dumpfile=network_monitor.dmp
/*--------------------------------------------------
Notes: network_link is the database link you created to the remote.
The user name entered below authenticates against your local database.
The dumpfile is placed in the local directory defined by the DATA_PUMP_DIR directory object.
-------------------------------*/
Export: Release 10.2.0.4.0 - Production on Thursday, 06 September, 2012 14:40:17
Copyright (c) 2003, Oracle. All rights reserved.
Username: / as sysdba
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_SCHEMA_01": /******** AS SYSDBA network_link=dl_monitor dumpfile=network_monitor.dmp
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 81.68 MB
Processing object type SCHEMA_EXPORT/USER
............................. (a number of lines omitted) .............................
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
F:\ORACLE\PRODUCT\10.2.0\ADMIN\CHENIWO\DPDUMP\NETWORK_MONITOR.DMP
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at 14:51:41
4) Check the results:
SQL> set head off
SQL> select * from dba_directories where directory_name='DATA_PUMP_DIR';
SYS DATA_PUMP_DIR F:\oracle\product\10.2.0\admin\cheniwo\dpdump\
F:\Documents and Settings\Administrator> cd F:\oracle\product\10.2.0\admin\cheniwo\dpdump
F:\oracle\product\10.2.0\admin\cheniwo\dpdump> dir /w
The volume in drive F is New Volume
Volume serial number is 88F0-A421
Directory of F:\oracle\product\10.2.0\admin\cheniwo\dpdump
[.]                  [..]                 cheniwo.sql
CHENIWO_DP.DMP       dp.log               EXPDAT.DMP
export.log           FULL_CHENIWO.DMP     import.log
NETWORK_MONITOR.DMP  TABLES_CHENIWO.DMP   TABLES_CHENIWO2.DMP
TS_CHENIWO.DMP       USER_CHENIWO.DMP
12 file(s)  128,977,860 bytes
2 dir(s)  155,032,698,880 bytes free
References:
http://loveboth.iteye.com/blog/1745801
http://bbs.chinaunix.net/thread-3643706-1-1.html
http://www.cnblogs.com/iImax/p/oracle-11g-expdp-interactive.html
http://blog.csdn.net/liqfyiyi/article/details/7248911
Reposted from: http://czmmiao.iteye.com/blog/2041703 (ORACLE expdp/impdp explained in detail)