Problems and Solutions of impdp ORA-39002, ORA-39166 and ORA-39164


During performance tests comparing imp and impdp, we found that if a table contains a LOB column, loading is very slow, roughly 1,000 rows per second. At that rate the import is practically unusable.

For example, importing 50,000,000 rows at 1,000 rows per second takes 50,000,000 / 1,000 / 60 / 60 ≈ 13.9 hours, which is unacceptable.

So we tried impdp to see whether performance improves.

The exported table contains about 300 million records and is partitioned. In the earlier test, a full table export and import of the million-row sample already took about three hours, which is a long time, and the time will only grow as the data volume increases.

So I tried to approach the problem from the partition angle.

Export the partitions and import them partition by partition.
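A partition-level export can be written with expdp's TABLES=table:partition syntax. Below is a minimal sketch, assuming the same directory object and source schema used later in this article; the dump and log file names are made up for illustration:

# Sketch: export a single partition so it can later be imported on its own
expdp prdappo/password directory=memo_dir \
    tables=MO1_MEMO:P9_A0_E5 \
    dumpfile=par_mo1_memo_p9_a0_e5.dmp \
    logfile=par_mo1_memo_p9_a0_e5_exp.log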

The impdp command used is shown below; remap_schema was already in place. Yet no matter what I tried, the following errors were thrown, even though the partition really does exist (see the dictionary check after the error messages).

impdp mig_test/mig_test directory=memo_dir dumpfile=parw.mow.memo.dmp logfile=parw.mow.memo_imp.log tables=MO1_MEMO:P9_A0_E5 TABLE_EXISTS_ACTION=append REMAP_SCHEMA=prdappo:MIG_TEST DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS

With the Partitioning, OLAP, Data Mining and Real Application Testing options

ORA-39002: invalid operation

ORA-39166: Object MIG_TEST.MO1_MEMO was not found.

ORA-39164: Partition MIG_TEST.MO1_MEMO:P9_A0_E5 was not found.
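Before digging further, the existence of the partition on the source side can be confirmed with a dictionary query such as the following (the table name MO1_MEMO is taken from the import log shown further down):

-- Sketch: confirm the partition really exists and who owns the table
SQL> SELECT table_owner, table_name, partition_name
       FROM dba_tab_partitions
      WHERE table_name = 'MO1_MEMO'
        AND partition_name = 'P9_A0_E5';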

I tried a variety of methods, but none worked. Finally, I found some pointers on MOS/Metalink (Doc ID 550200.1):


CAUSE
Unlike the fromuser/touser and tables functionality of traditional imp, Data Pump assumes that if the TABLES parameter does not include a schema name, the table is owned by the user performing the import. It will therefore not find the correct table to import unless the importing user is the same user that owns the tables in the export dump, and that user also needs the IMP_FULL_DATABASE role in order to import into other schemas.
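If it is unclear which schema owns the table inside the dump file, one way to check (a sketch using the same placeholder file and user names as the solution below) is to let impdp write the DDL into a SQL file instead of performing the import; the generated file shows the owning schema in the CREATE TABLE statement:

# Sketch: extract DDL from the dump to confirm the owning schema
impdp system/passwd DIRECTORY=data_pump_dir DUMPFILE=exp01.dmp \
    SQLFILE=check_ddl.sql TABLES=old_user.TABLEA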
 
SOLUTION
1. Either grant IMP_FULL_DATABASE to the user that owns the objects in the export dump, so that this user can import into the other schema referenced by REMAP_SCHEMA, and run the Data Pump import as that user, i.e.:
 

SQL> grant IMP_FULL_DATABASE to old_user;
 
impdp old_user/passwd TABLES=TABLEA:TABLEA_PARTITION1 \
    REMAP_SCHEMA=old_user:new_user DUMPFILE=exp01.dmp,exp02.dmp,exp03.dmp \
    DIRECTORY=data_pump_dir
 

Or:
 
2. Be sure to include the schema name in the TABLES parameter so that the correct table can be found and imported from/to the users referenced in REMAP_SCHEMA, i.e.:
 

impdp system/passwd TABLES=old_user.TABLEA:TABLEA_PARTITION1 \
    REMAP_SCHEMA=old_user:new_user DUMPFILE=exp01.dmp,exp02.dmp,exp03.dmp \
    DIRECTORY=data_pump_dir

In the end I used the following command and it finally worked; this particular partition just turned out to be empty. :)
 
impdp mig_test/mig_test directory=memo_dir dumpfile=parw.mow.memo.dmp logfile=parw.mow.memo_imp.log tables=prdappo.MO1_MEMO:P9_A0_E5 remap_schema=prdappo:mig_test TABLE_EXISTS_ACTION=append DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS

Master table "MIG_TEST"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "MIG_TEST"."SYS_IMPORT_TABLE_01": mig_test/******** directory=memo_dir dumpfile=parw.mow.memo.dmp logfile=parw.mow.memo_imp.log tables=prdappo.MO1_MEMO:P9_A0_E5 remap_schema=prdappo:mig_test TABLE_EXISTS_ACTION=append DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
Processing object type TABLE_EXPORT/TABLE
Table "MIG_TEST"."MO1_MEMO" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "MIG_TEST"."MO1_MEMO":"P9_A0_E5"                 0 KB       0 rows
Job "MIG_TEST"."SYS_IMPORT_TABLE_01" successfully completed at 17:23:04
