[Oracle] Data Pump detailed tutorial (3)-impdp

The previous installment in this series, [Oracle] Data Pump detailed tutorial (2)-expdp, is at http://www.bkjia.com/database/201307/230731.html.

Import modes mirror the export modes, and the following five modes are likewise mutually exclusive.

Full mode: set FULL=yes. A full import requires the DATAPUMP_IMP_FULL_DATABASE role; without it, you can only import data belonging to your own schema.

Schema mode: set the SCHEMAS parameter. The syntax is:

SCHEMAS=schema_name [, ...]

The following example imports the data of the hr schema:

> impdp hr SCHEMAS=hr DIRECTORY=dpump_dir1 LOGFILE=schemas.log DUMPFILE=expdat.dmp

Table mode: set the TABLES parameter. The syntax is:

TABLES=[schema_name.]table_name[:partition_name]

If schema_name is not specified, the current user's schema is assumed. For example:

> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp TABLES=employees,jobs

Specific partitions can also be imported:

> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expdat.dmp TABLES=sh.sales:sales_Q1_2012,sh.sales:sales_Q2_2012

Tablespace mode: set the TABLESPACES parameter. The syntax is:

TABLESPACES=tablespace_name [, ...]

Note that the tablespaces being imported into must already exist; otherwise the import fails. For example:

> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp TABLESPACES=tbs_1,tbs_2,tbs_3,tbs_4

Transportable tablespace mode: set the TRANSPORT_TABLESPACES parameter. The syntax is:

TRANSPORT_TABLESPACES=tablespace_name [, ...]

Only the metadata of the tablespace is imported; the data files themselves must be copied to the target server at the operating system level, and their paths are given with TRANSPORT_DATAFILES:

TRANSPORT_DATAFILES=datafile_name

For example:

> impdp hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_database_link TRANSPORT_TABLESPACES=tbs_6 TRANSPORT_FULL_CHECK=NO TRANSPORT_DATAFILES='/user01/data/tbs6.dbf'

Filtering during import works like filtering during export: QUERY and SAMPLE filter data, while EXCLUDE and INCLUDE filter metadata.
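As an illustration of these filters, here is a minimal sketch. The file name import_filters.par and the filter values are assumptions; dpump_dir1 and expfull.dmp follow the earlier examples. Because the quotation marks in filter clauses usually have to be escaped on the shell command line, putting them in a parameter file is the more convenient form (this also demonstrates the PARFILE parameter described below):

# import_filters.par: import only two tables, and only some employees rows
DIRECTORY=dpump_dir1
DUMPFILE=expfull.dmp
SCHEMAS=hr
INCLUDE=TABLE:"IN ('EMPLOYEES', 'JOBS')"
QUERY=hr.employees:"WHERE department_id > 50"

> impdp hr PARFILE=import_filters.par

Here INCLUDE, a metadata filter, restricts the import to the two named tables, while QUERY, a data filter, restricts which employees rows are loaded.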
Description of the main parameters:

ACCESS_METHOD=[AUTOMATIC | DIRECT_PATH | EXTERNAL_TABLE | CONVENTIONAL] defines the import method. It is strongly recommended to keep the default, AUTOMATIC.

CONTENT=[ALL | DATA_ONLY | METADATA_ONLY] defines whether to import everything, data only, or metadata only.

DIRECTORY=directory_object specifies the directory object from which the dump file is read.

DUMPFILE=[directory_object:]file_name [, ...] specifies the name of the dump file(s) to import; the wildcard %U can be used to match multiple dump files.

HELP=YES displays the help information (impdp help=y).

JOB_NAME=jobname_string specifies the job name; the system-generated default is generally used.

LOGFILE=[directory_object:]file_name specifies the log file name.

MASTER_ONLY=[YES | NO] imports only the master table. Because the master table records what the dump file contains, this lets you inspect a dump file's contents.

PARALLEL=integer specifies the degree of parallelism for the import.

PARFILE=[directory_path]file_name specifies a parameter file.

REMAP_DATA=[schema.]tablename.column_name:[schema.]pkg.function transforms column values through the named function during import, for example to regenerate primary keys so they do not conflict with existing ones.

REMAP_DATAFILE=source_datafile:target_datafile resolves differing file-naming conventions when moving between heterogeneous platforms.

REMAP_SCHEMA=source_schema:target_schema is a commonly used parameter that lets you import data into a different schema. If target_schema does not exist, it is created automatically during the import. For example:

> expdp system SCHEMAS=hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
> impdp system DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp REMAP_SCHEMA=hr:scott

REMAP_TABLE=[schema.]old_tablename[.partition]:new_tablename renames a table or partition during import. For example:

> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp TABLES=hr.employees REMAP_TABLE=hr.employees:emps

REMAP_TABLESPACE=source_tablespace:target_tablespace changes the tablespace name during import. For example:

> impdp hr REMAP_TABLESPACE=tbs_1:tbs_6 DIRECTORY=dpump_dir1 DUMPFILE=employees.dmp

REUSE_DATAFILES=[YES | NO] controls whether existing data files are reused. The default is NO. Be careful: once this is set to YES, existing data files with the same names are overwritten.

SQLFILE=[directory_object:]file_name means the import is not actually performed; instead, the DDL that the import would execute is written to the named SQL file.

TABLE_EXISTS_ACTION=[SKIP | APPEND | TRUNCATE | REPLACE] controls what happens when a table being imported already exists. SKIP leaves the table as is and moves on to the next object; it is not a valid option if CONTENT is set to DATA_ONLY. APPEND loads rows from the source and leaves existing rows unchanged. TRUNCATE deletes existing rows and then loads rows from the source. REPLACE drops the existing table, then creates it and loads it from the source; it is not a valid option if CONTENT is set to DATA_ONLY.
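To show how the last two parameters are typically combined, here is a minimal sketch; the file name import_ddl.sql is an assumption, while dpump_dir1 and expfull.dmp follow the earlier examples. The first command performs no import and only writes the DDL that the import would run into import_ddl.sql in dpump_dir1; the second then loads the dumped employees rows into the existing table without touching the rows already there:

> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp SQLFILE=import_ddl.sql
> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp TABLES=employees TABLE_EXISTS_ACTION=APPEND

Reviewing the generated SQL file first is a cheap way to verify what an import will do before letting it touch an existing schema.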
