Errors Caused by Importing Oracle 11g Data into 10g
Environment Introduction

|                    | Old Environment        | New Environment        |
| Operating System   | RedHat 5.8 64-bit      | RedHat 6.4 64-bit      |
| Database Version   | Oracle 10.2.0.4 64-bit | Oracle 11.2.0.4 64-bit |
Background: The old RAC database was built on Oracle 10g and has been running for several years. The front-end application is now due to be upgraded. Because the database used by the upgraded front-end application differs from the database in the old environment, we planned to import the new environment's database into the old environment, which introduces a cross-version import problem. The alternative was to upgrade the old database to the same version as the new environment; weighing the potential problems an upgrade could cause and other factors, and after discussing with the project owner, we decided to adopt the first approach. The test results are as follows:
1. From the Oracle 10g environment, use the 10g exp client to export the 11g database over the network, with the goal of importing it into the local 10g database. The export log reports the following errors:
EXP-00008: Oracle error # encountered.
ORA-01455: converting column overflows integer datatype
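A minimal sketch of this step, assuming a hypothetical TNS alias orcl11g pointing at the 11g database and a hypothetical schema test; the lower-version exp client connects to the higher-version database remotely:

# Run from the 10g server with the 10g exp client; orcl11g and test are hypothetical
exp test/password@orcl11g owner=test file=/backup/test.dmp log=/backup/test_exp.log

The ORA-01455 here is typically triggered by the version mismatch between the exp client and the database it queries.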
2. Use exp to export the data in the 11g environment, then import it into 10g remotely with imp user/password@10g. The error is as follows:
IMP-00015: following statement failed because the object already exists
Note: before the import, we confirmed that the 10g database contained no objects belonging to the users involved in this task. Even so, some tables were not imported, which we suspect is caused by their CLOB columns.
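A minimal sketch of this step, again with hypothetical names (schema test, TNS alias orcl10g pointing at the 10g database):

# On the 11g server: export the schema with the 11g exp client
exp test/password owner=test file=/backup/test.dmp log=/backup/test_exp.log

# Still on the 11g server: import into 10g through the hypothetical alias orcl10g
imp test/password@orcl10g fromuser=test touser=test file=/backup/test.dmp log=/backup/test_imp.log

Note that a dump file written by a higher-version exp is not guaranteed to be readable by a lower-version target; the usual recommendation is to export with the exp client of the lower (target) version, as in step 1.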
3. In the 11g environment, use expdp with the version parameter to export the data, transfer the dump file to the 10g server, and then import it with impdp, again specifying the version. The error is as follows:
expdp user/password directory=impdp_dir dumpfile=file.dmp version=10.2.0.4 (the version of the target 10g database)
impdp user/password directory=impdp_dir dumpfile=file.dmp version=10.2.0.4
ORA-39139: Data Pump does not support XMLType objects in version 10.2.0.4. TABLE_DATA:"test"."table_name" will be skipped.
Note: with this method the offending table is simply skipped during import. Because the table contains no data, we just re-executed its CREATE TABLE statement on the 10g side.
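To recover the CREATE TABLE statement to re-execute on 10g, DBMS_METADATA can generate the DDL. A minimal sketch in SQL*Plus, assuming the hypothetical owner TEST and table TABLE_NAME from the error message above:

-- Run on the 11g database; the output is the DDL to re-run on 10g
SET LONG 100000 PAGESIZE 0 LINESIZE 200
SELECT DBMS_METADATA.GET_DDL('TABLE', 'TABLE_NAME', 'TEST') FROM dual;

Any 11g-only clauses in the generated DDL would still need to be edited out by hand before running it on 10g.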