The Data Pump workflow is as follows:
(1) Commands are executed at the command line.
(2) The EXPDP/IMPDP command invokes the DBMS_DATAPUMP PL/SQL package. This API provides the high-speed export/import functionality.
(3) When data is moved, Data Pump automatically chooses the direct-path mechanism, the external-table mechanism, or a combination of the two. When metadata (object definitions) is moved, Data Pump uses the DBMS_METADATA PL/SQL package.
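The API call in step (2) can be sketched directly in PL/SQL. This is a minimal illustration of what an `expdp schemas=...` run does under the hood; the job name, dump file name, SCOTT schema, and the DATA_PUMP_DIR directory object are example values:

```sql
-- Minimal DBMS_DATAPUMP sketch of a schema-mode export (example names throughout).
DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.open(operation => 'EXPORT',
                          job_mode  => 'SCHEMA',
                          job_name  => 'DEMO_EXP');
  DBMS_DATAPUMP.add_file(h, 'demo.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.metadata_filter(h, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.detach(h);
END;
/
```

The expdp client is essentially a front end that assembles calls like these and then monitors the server-side job.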
Data Pump is a feature introduced in 10g, and I personally like moving data with it. It is easy to remap tablespaces and schemas with the REMAP parameters when importing, and it side-steps the server/client character-set problems that exp/imp has to troubleshoot. Data Pump also has inconvenient aspects: for a remote export/import, you must install th...
When GLOBAL_NAMES=TRUE, the dblink name must be the same as the remote database's global database name (GLOBAL_NAME); otherwise, it can be named arbitrarily. To view the parameter:
SQL> show parameters global_names
NAME           TYPE     VALUE
global_names   boolean  FALSE
6. The REMAP_SCHEMA parameter. It is well known that the FROMUSER and TOUSER parameters of the imp tool enable the migration of one user's ...
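For comparison, the impdp equivalent of imp's FROMUSER/TOUSER is REMAP_SCHEMA. A hedged sketch; the user names, directory object, and dump file name are examples, not from the source:

```shell
# Classic imp style: fromuser/touser
imp system/password file=scott.dmp fromuser=scott touser=new_scott
# Data Pump style: remap_schema (directory object and file name are example values)
impdp system/password directory=DATA_PUMP_DIR dumpfile=scott.dmp remap_schema=scott:new_scott
```

REMAP_TABLESPACE works the same way (`remap_tablespace=old_ts:new_ts`) and can be combined with REMAP_SCHEMA in one import.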
Using a dblink for a remote Data Pump export: a simple scenario in which data in database A is exported using a Data Pump job run on database B's server. The prerequisite is that the two database versions are consistent. 1. Create the export user on database B (as an administrator): SQL> create user daochub identified by daochub; 2. Grant connectio...
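The remaining setup steps for that scenario can be sketched as follows. All object names, paths, and credentials here are example values I am assuming, and the link points from B back to A:

```sql
-- On database B: grant the export user what it needs (example names).
GRANT CONNECT, RESOURCE TO daochub;
GRANT CREATE DATABASE LINK TO daochub;
-- A DIRECTORY object for the dump files, plus read/write on it.
CREATE OR REPLACE DIRECTORY dump_dir AS '/home/oracle/dump';
GRANT READ, WRITE ON DIRECTORY dump_dir TO daochub;
-- As daochub: a database link back to database A (connection details are placeholders).
CREATE DATABASE LINK lnk_a CONNECT TO usera IDENTIFIED BY passa USING 'tns_alias_a';
```

The export itself then runs on B with the NETWORK_LINK parameter, e.g. `expdp daochub/daochub directory=dump_dir network_link=lnk_a dumpfile=a.dmp schemas=usera`.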
Backup and recovery with the Oracle Data Pump
Directory:
I. Preparation before database backup and recovery
  1. Authorize the database user and create the DIRECTORY object
II. Data Pump backup and recovery
  1. Database backup
  2. Database recovery
III. Interactive mode
I. Preparation before database backup and recovery: 1. Authorizing the d...
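Step 1 of the preparation ("authorize the user and create the DIRECTORY object") typically looks like this; the user scott, the directory name, and the OS path are example values:

```sql
-- As a DBA: create the directory object and authorize the user (example names).
CREATE OR REPLACE DIRECTORY backup_dir AS '/u01/backup';
GRANT READ, WRITE ON DIRECTORY backup_dir TO scott;
-- For full-database export/import jobs the user also needs these roles:
GRANT EXP_FULL_DATABASE, IMP_FULL_DATABASE TO scott;
```

The OS path must already exist on the database server and be writable by the oracle software owner; Oracle does not create it for you.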
Oracle Data Pump in detail
Oracle Database 10g introduced Data Pump technology, which enables DBAs and developers to quickly move database metadata (object definitions) and data to another Oracle database. The roles of Data Pump export/import (EXPDP and IMPDP): 1. Implement logical backup and logical recovery. 2. Move objects between database users. 3. Move o...
Content
Data Pump provides different export modes to extract different parts of the database. Specify the mode by entering the appropriate parameter on the command line. The available export modes are:
Full Export Mode
Schema mode
Table mode
Tablespace mode
Transportable Tablespace mode
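Each mode corresponds to a command-line parameter. One-line sketches follow; the accounts, schema, table, tablespace, and file names are assumed example values:

```shell
expdp system/password directory=DATA_PUMP_DIR dumpfile=full.dmp  full=y                       # full
expdp system/password directory=DATA_PUMP_DIR dumpfile=scott.dmp schemas=scott                # schema
expdp scott/tiger     directory=DATA_PUMP_DIR dumpfile=emp.dmp   tables=emp,dept              # table
expdp system/password directory=DATA_PUMP_DIR dumpfile=users.dmp tablespaces=users            # tablespace
expdp system/password directory=DATA_PUMP_DIR dumpfile=tts.dmp   transport_tablespaces=users  # transportable
```

If no mode parameter is given, expdp defaults to schema mode for the connected user.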
Oracle 11g Release 1 (11.1) Data Pump import Mode
Note:
Many system schemas cannot be exported because they ...
ORA-31693: Table data object "VNV"."ID_CARD_INFO" failed to load/unload and is being skipped due to error:
ORA-31617: unable to open dump file '/data/orabackup/exp/exp_unspaydb_20160611.dmp' for write
ORA-19505: failed to identify file "/data/orabackup/exp/exp_unspaydb_20160611.dmp"
ORA-27037: unable to obtain file status
Linux-x86_64 Error: 2: No such file or directory
Additional information: 3
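ORA-19505/ORA-27037 in this stack usually mean the OS path behind the DIRECTORY object does not exist or is not writable by the oracle user. A quick check, using the path from the error above (override DUMP_DIR for your own system):

```shell
# Check that the dump directory exists and is writable (path taken from the error above).
DUMP_DIR=${DUMP_DIR:-/data/orabackup/exp}
if [ -d "$DUMP_DIR" ] && [ -w "$DUMP_DIR" ]; then
  echo "ok: $DUMP_DIR is writable"
else
  echo "problem: $DUMP_DIR is missing or not writable; create it and chown it to the oracle user"
fi
```

In RAC environments, remember the path must exist on the node where the Data Pump worker actually runs.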
Recently I worked on a program that, after starting a transaction and running the business for a while, threw this exception: "The CLR cannot convert from COM context 0x1b1c38 to COM context 0x1b1da8, which has lasted 60 seconds. The thread that owns the target context/apartment is most likely performing a non-pumping wait or handling a very long-running operation without pumping Windows messages." This situation often affects performan...
1. Linux normal export/import
1.1 Export from the terminal:
exp naricom/[email protected]/sgtms owner='(mw_app,mw_sys,statdba,dictdba)' file=/orabackup/SGTMS_201408131200.DMP log=/orabackup/SGTMS_201408131200.log buffer=80000000
1.2 Import:
imp naricom/[email protected] fromuser=(MW_SYS,MW_APP,STATDBA,DICTDBA) touser=(MW_SYS,MW_APP,STATDBA,DICTDBA) file=D:\SGTMSDB\init\SGTMS_201408131105.dmp log=D:\sgtmsdb\init\SGTMS_201408131105.log
2. Linux Data Pump
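The Data Pump equivalents of the exp/imp commands above would look roughly like this. The DPUMP_DIR directory object and the dump/log file names are assumptions; the schema list comes from the commands above:

```shell
# Export (run on the database server; DPUMP_DIR is an assumed DIRECTORY object):
expdp naricom directory=DPUMP_DIR schemas=MW_APP,MW_SYS,STATDBA,DICTDBA \
      dumpfile=SGTMS_%U.dmp logfile=SGTMS_exp.log
# Import on the target:
impdp naricom directory=DPUMP_DIR schemas=MW_APP,MW_SYS,STATDBA,DICTDBA \
      dumpfile=SGTMS_%U.dmp logfile=SGTMS_imp.log
```

Unlike exp/imp, the files land in the server-side DIRECTORY path rather than on the client, which is why the tool must run on (or against) the database server.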
According to Oracle's documentation, Data Pump performance can differ significantly depending on the export/import method. This time I had the opportunity to test the performance differences among transportable tablespace, direct path, external table, and database link export/import.
This test covers the direct-path export/import method.
First, clear the user and tablespace imported in the previous article, and re-create the test user and tablespace.
A new feature since Oracle 11g (deferred segment creation) omits segment allocation for empty tables by default, to reduce tablespace resource usage. As a result, empty tables are skipped when exporting user data with the classic exp tool, which leads to incomplete dumps; exp simply cannot export those tables. This has been mentioned before, so I will not repeat it here: http://jim123.blog.51cto.com/4763600/1934205. Using this method solves the empty-table problem when exporting use...
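Two common workarounds for the deferred-segment-creation problem can be sketched in standard SQL; the generated-script approach targets tables that are already empty:

```sql
-- Option 1: disable deferred segment creation for tables created from now on.
ALTER SYSTEM SET deferred_segment_creation = FALSE;
-- Option 2: generate statements that force segment allocation for existing empty
-- tables, so the classic exp tool will see them.
SELECT 'alter table "' || table_name || '" allocate extent;'
  FROM user_tables
 WHERE segment_created = 'NO';
-- Run the generated statements, then re-run the exp export.
```

Data Pump (expdp) does not have this problem, which is another reason to prefer it on 11g and later.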
Platform and database version of the environment
------------------------------------------- dividing line -------------------------------------------
The following describes data import and export using Oracle's EXPDP/IMPDP Data Pump, introduced in 10g and 11g. All of the Oracle-related commands below are executed in the oracle user's environment. (PS: EXPDP/IMPDP is a server-side tool that runs only on the database server, so it is not affected by the cli...
With multiple systems gradually going live, database backup becomes especially important for operations personnel. Considering that system resources are currently limited and the database runs in noarchivelog mode, Data Pump is used to complete the daily online backup. There are many backup scripts available; I also referred to some scripts shared on the Internet and modified them according to the ex...
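A minimal daily backup script in that spirit, shown as a dry run that only prints the command it would execute. The backup account, DIRECTORY object, script path, and schedule are all assumptions:

```shell
#!/bin/sh
# Daily Data Pump backup sketch: builds a date-stamped expdp command and prints it.
# Replace the echo with a real invocation (and supply the password securely) to run it.
ORA_USER=system            # assumed backup account
DIR_OBJ=DPUMP_DIR          # assumed DIRECTORY object
STAMP=$(date +%Y%m%d)
CMD="expdp ${ORA_USER} directory=${DIR_OBJ} full=y dumpfile=full_${STAMP}.dmp logfile=full_${STAMP}.log"
echo "$CMD"
# Example crontab entry (02:00 daily), assuming the script lives at /home/oracle/bin/dp_backup.sh:
# 0 2 * * * /home/oracle/bin/dp_backup.sh >> /home/oracle/bin/dp_backup.cron.log 2>&1
```

A production version would also purge dump files older than the retention window and check the expdp exit status.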
Directory:
I. Cold backup migration considerations
II. Data Pump migration considerations
I. Cold backup migration considerations
1. Cold recovery steps:
A. Shut down the source database.
B. Transfer the data files, control files, pfile, and log files from the source to the target, and grant ownership to the oracle user.
C. On the target, open the transferred pfile, modify the instance name and control file p...
Exception causes: 1. An infinite loop was written; this is the most likely cause. 2. An operation on a large volume of data left the application in a hung state. Solution: in Debug > Exceptions > Managed Debug Assistants, uncheck ContextSwitchDeadlock. Exception information: The CLR cannot convert from COM context 0x645e18 to COM context 0x645f88, which has lasted 60 seconds. The thread that owns the target context/apartment is most likely performing a non-p...
Oracle Data Pump tool series: how to reinstall the Data Pump EXPDP/IMPDP tools. This article applies to Oracle Database Enterprise Edition versions 10.1.0.2 to 11.2.0.3. Data Pump may need to be reloaded in many different scenarios, for example when the database has problems related to the Data Pump initialization phase, such as hangs, internal errors, or data dictionary incompatibility. In some cases, ...
Oracle Data Pump import/export syntax. 1. First create a directory: create directory <alias> as '<a directory on the database server>', for example: create directory alias as 'd:\<server directory name>'; place the files to be imported or exported in this directory. 2. Export and import files with SID=orcl; the account exporting the dmp is test, and the account importing it is test. To export data from sfz: expdp test/test@orcl directory=alias dumpf...
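Continuing that syntax sketch, a complete export/import pair for the same setup might look like this; the dump and log file names are example values, the rest follows the snippet:

```shell
# Export schema test (alias = the DIRECTORY object created above):
expdp test/test@orcl directory=alias dumpfile=sfz.dmp logfile=sfz_exp.log schemas=test
# Import it into the target database:
impdp test/test@orcl directory=alias dumpfile=sfz.dmp logfile=sfz_imp.log schemas=test
```

Both commands write their files into the server-side directory behind the alias, not into the client's current directory.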
I. Parameters
1. directory
2. dumpfile
3. logfile
4. tables=table1,table2,...
5. estimate_only=y
6. exclude=statistics (or statistics=none), then gather statistics afterwards with dbms_stats.gather_schema_stats('SCHEMANAME') or dbms_stats.gather_table_stats('TABLENAME')
7. remap_schema=old:new
8. remap_tablespace=old:new
9. cluster=n
II. Viewing progress
[References] 1. http://blog.itpub.net/9240380/viewspace-754878/ 2. http://www.cnblogs.com/eastsea/p/4614479.html
1. View the related views: dba_datapump_jobs
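The progress of a running job can be checked from the views the list names; two standard queries:

```sql
-- Running and recent Data Pump jobs:
SELECT owner_name, job_name, operation, job_mode, state
  FROM dba_datapump_jobs;
-- Rough percentage done for long-running operations:
SELECT sid, opname, ROUND(sofar / totalwork * 100, 1) AS pct_done
  FROM v$session_longops
 WHERE totalwork > 0 AND sofar <> totalwork;
```

The job name from dba_datapump_jobs can also be used with `expdp attach=JOB_NAME` to re-enter interactive mode.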