Today, a bug was found in a Data Pump import operation.
The previous article recorded the symptoms of the problem; this one continues the investigation in depth.
As described there, the problem is difficult to reproduce: even when simulating the actual situation, it could not be triggered.
In a RAC database environment, the problem could not be reproduced by creating a partition…
Platform and database version of the environment
---------------------------------------------------Dividing line------------------------------------------------------------
Below is an export and import of data using Oracle's EXPDP/IMPDP Data Pump on the 10g and 11g versions; the Oracle commands that follow are executed in the oracle user's environment.
(PS: EXPDP/IMPDP are server-side tools that run only on the database server, so they are unaffected by the client.)
TRANSPORT_TABLESPACES specifies that a transportable-tablespace-mode export is performed. VERSION specifies the database version of the exported objects; the default value is COMPATIBLE. The syntax is VERSION={COMPATIBLE | LATEST | version_string}. With COMPATIBLE, object metadata is generated according to the initialization parameter compatible; with LATEST, object metadata is generated based on the actual version of the database; version_string is used to specify a particular database version.
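As a minimal sketch of the VERSION parameter described above (credentials, directory object, and file names are illustrative placeholders, not from the original), a schema-mode export pinned to a downlevel version might look like:

```shell
# Export the SCOTT schema so that a 10.2 database can import the dump.
# DUMP_DIR must be an existing DIRECTORY object the user can write to;
# all names and the password here are illustrative placeholders.
expdp system/<password> schemas=SCOTT directory=DUMP_DIR \
      dumpfile=scott_v102.dmp logfile=scott_v102.log version=10.2
```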
For the test environment in the previous article:
SQL> CREATE TABLE T1
  2  (ID NUMBER, NAME VARCHAR2(30));

Table created.

SQL> INSERT INTO T1
  2  SELECT rownum, tname
  3  FROM TAB;

rows created.

SQL> CREATE TABLE T2
  2  (ID NUMBER, NAME VARCHAR2(30));

Table created.

SQL> INSERT INTO T2
  2  VALUES (1, 'A');

1 row created.

SQL> INSERT INTO T2
  2  VALUES (2, 'B');

1 row created.

SQL> COMMIT;

Commit complete.
The method of creating partitions deserves a note. The partitions must be built first, so that data is automatically routed into the corresponding partition. Partitions are generally divided by time; once divided, June's data, for example, is looked up in the June partition rather than searched for across the whole table, which improves efficiency considerably. Partition statements are simple.
2. Table analysis
3. Oracle Data Pump fil…
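A minimal sketch of the time-based partitioning described above (table, column, and partition names are illustrative placeholders, not from the original):

```sql
-- Orders table range-partitioned by month; each row is routed
-- automatically into its partition by ORDER_DATE.
-- All names here are illustrative placeholders.
CREATE TABLE ORDERS (
  ID         NUMBER,
  ORDER_DATE DATE
)
PARTITION BY RANGE (ORDER_DATE) (
  PARTITION P_201706 VALUES LESS THAN (TO_DATE('2017-07-01','YYYY-MM-DD')),
  PARTITION P_201707 VALUES LESS THAN (TO_DATE('2017-08-01','YYYY-MM-DD')),
  PARTITION P_MAX    VALUES LESS THAN (MAXVALUE)
);
```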
…unlock;
GRANT DBA TO fmisdb;
6. Import the data. (Note: if you do not have a client installed, the import statements differ because the Oracle versions differ.) aaa.dmp is the import file under D:\ORACLE\EXPDP. In REMAP_SCHEMA, the first name is the user name used at export time and the second is the user name to import into; REMAP_TABLESPACE does the same for tablespaces.
impdp fmisdb/<password>@<host>/ORCL directory=expdp_dir dumpfile=d:\oracle\expdp\aaa.dmp logfile=fedata1124.log remap_schema=fmi…
1. First, a few words of explanation. DIRECTORY: a directory object is generally created in order to import/export data with the Data Pump; in fact directory objects have many other uses, which this article does not elaborate. SCHEMAS: it is easiest to understand a schema as a user; each schema can hold its own set of non-interfering objects. If you want to access objects in another schema, you need to specify the schema name, which is actually specifying…
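A hedged sketch of the DIRECTORY setup this explanation implies (the OS path and grantee are assumptions taken as placeholders, not confirmed by the original):

```sql
-- Create a directory object pointing at an OS path that the database
-- server itself can reach, then let a user read/write dump files
-- through it. Path and grantee are illustrative placeholders.
CREATE OR REPLACE DIRECTORY expdp_dir AS 'D:\ORACLE\EXPDP';
GRANT READ, WRITE ON DIRECTORY expdp_dir TO fmisdb;
```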
Oracle Data Pump (expdp) backup reports ORA-39006: internal error on Linux
Problem description:
Oracle backups had been done with the Data Pump all along; today the backup at the terminal suddenly reported ORA-39006: internal error.
The data pump export log file contains the following information:
ORA-39097: Data Pump job encountered an unexpected error
Oracle Data Pump steps, 2004/08/27
These two days I have been testing the ora10g Data Pump; I am posting the steps for everyone to see.
Because my test database does not hold much data, the speed advantage of the Data Pump is not obvious, but the backup file is much larger than what exp produces.
-----Lisalan 20040825 Oracle Data Pump
----Crea
1. The Data Pump workflow is as follows:
(1) A command is executed at the command line.
(2) The EXPDP/IMPDP command invokes the DBMS_DATAPUMP PL/SQL package. This API provides high-speed export and import functionality.
(3) When table data is moved, the Data Pump automatically chooses the direct path mechanism, the external table mechanism, or a combination of the two. When metadata (object definitions) is moved, the Data Pump uses the DBMS_METADATA PL/SQL package.
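The call path in step (2) can be sketched directly in PL/SQL; job mode, schema, file, and directory names below are illustrative assumptions, not taken from the original:

```sql
-- Minimal schema-mode export driven through DBMS_DATAPUMP, the same
-- API that the expdp client calls. All names are placeholders.
DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.ADD_FILE(h, 'scott_api.dmp', 'EXPDP_DIR');
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);  -- blocks until the job finishes
END;
/
```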
The Data Pump is a feature introduced in 10g, and personally I like moving data with it. The REMAP parameters make it easy to convert tablespaces and schemas on import, and server/client character-set problems can be ignored (exp/imp requires troubleshooting the character set). The Data Pump also has an inconvenient side: for a remote export or import, you must install th…
When GLOBAL_NAMES=TRUE, the dblink name must be the same as the remote database's global database name (GLOBAL_NAME); otherwise it can be named arbitrarily. (3) To view the parameter:

SQL> show parameter global_names

NAME          TYPE     VALUE
------------  -------  -----
global_names  boolean  FALSE

6. The REMAP_SCHEMA parameter. It is well known that the FROMUSER and TOUSER parameters of the imp tool enable the migration of one user's…
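A hedged sketch of a dblink that satisfies the naming rule above (the global name, credentials, and TNS alias are illustrative assumptions, not from the original):

```sql
-- With GLOBAL_NAMES=TRUE the link name must match the remote
-- database's GLOBAL_NAME; with FALSE any name works.
-- All identifiers here are illustrative placeholders.
CREATE DATABASE LINK remote_db.example.com
  CONNECT TO daochub IDENTIFIED BY daochub
  USING 'remote_tns';
```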
Using a dblink to implement a remote Data Pump data export. A simple setup: data in database A is exported by running a Data Pump job on database B's server. The prerequisite is that the two database versions are consistent.
1. Create an export user on database B (as an administrator):
SQL> create user daochub identified by daochub;
2.1 Grant connectio…
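A hedged sketch of the remote export this describes, using the NETWORK_LINK parameter (the directory object, dblink name, schema, and file names are illustrative placeholders):

```shell
# Run on database B's server; the job pulls data from database A over
# the dblink named in network_link. All names are placeholders.
expdp daochub/daochub directory=DUMP_DIR network_link=remote_db \
      schemas=SCOTT dumpfile=remote_scott.dmp logfile=remote_scott.log
```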
Backup and recovery with the Oracle Data Pump

Directory
I. Preparation before database backup and recovery
   1. Authorize the database user and create a DIRECTORY object
II. Data Pump backup and recovery
   1. Database backup
   2. Database recovery
III. Entering interactive mode

I. Preparation before database backup and recovery
1. Authorizing the d…
…being skipped due to error:
ORA-31617: unable to open dump file '/data/orabackup/exp/exp_unspaydb_20160611.dmp' for write
ORA-19505: failed to identify file "/data/orabackup/exp/exp_unspaydb_20160611.dmp"
ORA-27037: unable to obtain file status
Linux-x86_64 Error: 2: No such file or directory
Additional information: 3
ORA-31693: Table data object "VNV"."ID_CARD_INFO" failed to load/unload and is being skipped due to error:
ORA-31617: unable to open dump file '/data/orabackup/exp/exp_unspaydb_20160611.dmp…
Recently, while working on a program that starts a transaction and then runs business operations for a period of time, this exception occurred after it had been running for a while: the CLR cannot transition from COM context 0x1b1c38 to COM context 0x1b1da8, and this has lasted 60 seconds. The thread that owns the target context/apartment is most likely performing a non-pumping wait or handling a very long-running operation without pumping Windows messages. This situation often affects performan…
1 Linux normal export/import
1.1 Export at the terminal
exp naricom/<password>@<host>/sgtms owner='(mw_app,mw_sys,statdba,dictdba)' file=/orabackup/SGTMS_201408131200.dmp log=/orabackup/SGTMS_201408131200.log buffer=80000000
1.2 Import at the terminal
imp naricom/<password> fromuser=(MW_SYS,MW_APP,STATDBA,DICTDBA) touser=(MW_SYS,MW_APP,STATDBA,DICTDBA) file=D:\SGTMSDB\init\SGTMS_201408131105.dmp log=D:\sgtmsdb\init\sgtms_201408131105.log
2 Linux Data Pump
According to Oracle's documentation, the different Data Pump export/import methods can perform significantly differently. This was an opportunity to test the performance differences among transportable tablespace, direct path, external table, and database link export and import.
This test covers the direct path export and import method.
First, drop the user and tablespace imported in the previous article and re-create the test user and tablespace.
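For the direct path test, the access method can be forced explicitly in recent Oracle versions via the ACCESS_METHOD parameter; this is a hedged sketch (parameter availability depends on the version, and the table, directory, and file names are illustrative placeholders):

```shell
# Force the direct path access method for a table-mode export.
# ACCESS_METHOD accepts DIRECT_PATH or EXTERNAL_TABLE in recent
# releases; all names and the password are placeholders.
expdp system/<password> tables=SCOTT.T1 directory=DUMP_DIR \
      dumpfile=t1_direct.dmp logfile=t1_direct.log access_method=direct_path
```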
Deferred segment creation, a new feature since Oracle 11g, omits allocating a segment for an empty table by default in order to reduce tablespace resource usage. As a consequence, empty tables are skipped when exporting user data with Oracle's exp tool, which results in incomplete data, and there is of course no way to export those tables with exp. This has been mentioned before and is not repeated here: http://jim123.blog.51cto.com/4763600/1934205. Using this method solves the empty-table problem when exporting use…
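A hedged sketch of the usual 11g workarounds for the empty-table/exp problem (the table name is a placeholder; which option fits depends on whether the tables already exist):

```sql
-- Option 1: disable deferred segment creation for tables created
-- from now on, so exp will see them even when empty.
ALTER SYSTEM SET deferred_segment_creation = FALSE;

-- Option 2: force a segment for an existing empty table so that
-- exp includes it. Table name is an illustrative placeholder.
ALTER TABLE t_empty ALLOCATE EXTENT;
```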