Oracle 10g Data Pump: Backing Up and Recovering Data with EXPDP and IMPDP

Source: Internet
Author: User


I. Preparation before backing up the database

Create a backup DIRECTORY object and grant the user read/write access to it.

Steps:

1. Log in to SQL*Plus

Account: ptemp   Password: 000000   Host string: empdb

SQL*Plus statement: CONN ptemp/000000@empdb AS SYSDBA

Result: Connected

2. Create the backup DIRECTORY object

SQL*Plus statement: CREATE DIRECTORY bakupkf_dir AS 'D:\bakupKF';

Result: Directory created

3. Grant access on the directory to the user

SQL*Plus statement: GRANT READ, WRITE ON DIRECTORY bakupkf_dir TO ptemp;

Result: Grant succeeded
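To confirm that the directory object was created and points at the intended path, you can query the data dictionary. This check is a suggestion, not part of the original procedure:

```sql
-- Lists the directory objects visible to the current user and their OS paths.
-- Directory names are stored in upper case, hence the 'BAKUPKF_DIR' filter.
SELECT directory_name, directory_path
FROM   all_directories
WHERE  directory_name = 'BAKUPKF_DIR';
```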

II. Backing up data with EXPDP (including table structure)

Open a command prompt: Start > Run > cmd

Execute the following statement:

1. Back up the tables and table definitions of the entire database

EXPDP ptemp/000000@empdb directory=bakupkf_dir dumpfile=ptemp813.dmp

Note: This backs up the table definitions and table data of the entire database. CONTENT defaults to ALL, so both the definitions and the data are exported.

2. Back up only the data of the entire database (no table definitions)

EXPDP ptemp/000000@empdb Directory=bakupkf_dir dumpfile=ptemp813data.dmp content=data_only

Note: With CONTENT=ALL (the default), object definitions and all data are exported. CONTENT=DATA_ONLY exports only the object data, and CONTENT=METADATA_ONLY exports only the object definitions.
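For completeness, a definitions-only export would use the third CONTENT value; the dump-file name below is illustrative, not taken from the original:

```shell
EXPDP ptemp/000000@empdb directory=bakupkf_dir dumpfile=ptemp813meta.dmp content=metadata_only
```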

3. Back up the data and table definitions of specified tables (multiple tables may be listed)

EXPDP ptemp/000000@empdb directory=bakupkf_dir dumpfile=ptemp813tab.dmp tables=test_lr,test_lr2,test_lr3 ...

Note: In TABLES=test_lr,test_lr2,test_lr3, multiple table names are separated by commas.
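If you need everything that belongs to one user rather than a hand-picked table list, Data Pump also accepts a SCHEMAS parameter. The dump-file name here is illustrative:

```shell
EXPDP ptemp/000000@empdb directory=bakupkf_dir dumpfile=ptemp_schema.dmp schemas=ptemp
```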

III. Restoring data with IMPDP

1. Restore the tables and table definitions of the entire database

IMPDP ptemp/000000@empdb Directory=bakupkf_dir dumpfile=ptemp813.dmp table_exists_action=replace

Note: TABLE_EXISTS_ACTION={SKIP (default) | APPEND | TRUNCATE | REPLACE}

SKIP: the import job skips an existing table and moves on to the next object. APPEND: new rows are appended to the existing data. TRUNCATE: the import job truncates the table and then appends the new data. REPLACE: the import job drops the existing table, recreates it, and loads the data. Note that the TRUNCATE option does not apply to cluster tables or to imports that use the NETWORK_LINK option.

2. Restore data only

IMPDP ptemp/000000@empdb directory=bakupkf_dir dumpfile=ptemp813data.dmp content=data_only TABLE_EXISTS_ACTION= REPLACE

Note: With CONTENT=DATA_ONLY, the import succeeds only if the target tables already exist and are empty; otherwise unique-constraint or primary-key violations cause the restore to fail. One way around this is to rebuild the existing tables by restoring both the table data and the table definitions, as in method 1.

3. Restore data with TABLE_EXISTS_ACTION=TRUNCATE

IMPDP ptemp/000000@empdb Directory=bakupkf_dir dumpfile=ptemp813data.dmp table_exists_action=truncate

Note: With TABLE_EXISTS_ACTION=TRUNCATE, the table definitions are left in place and only the data is restored. However, when master-detail tables are involved, the parent table's data may fail to restore because of foreign-key relationships, while the detail table's data can load successfully.

To restore the data of the whole database reliably, use method 1 above, "Restore the tables and table definitions of the entire database".
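A common workaround for constraint failures during a data-only import, not described in the original, is to disable the foreign-key constraint on the detail table before the import and re-enable it afterward. The table and constraint names below are hypothetical:

```sql
-- Disable the detail table's foreign key before running IMPDP
-- (constraint name fk_test_lr2_parent is illustrative)
ALTER TABLE test_lr2 DISABLE CONSTRAINT fk_test_lr2_parent;

-- ... run the IMPDP command here ...

-- Re-enable and validate the constraint once the import finishes
ALTER TABLE test_lr2 ENABLE CONSTRAINT fk_test_lr2_parent;
```

If re-enabling fails, the imported data violates the constraint and must be reconciled before the constraint can be restored.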

4. Restore the specified tables

IMPDP ptemp/000000@empdb directory=bakupkf_dir dumpfile=ptemp813tab.dmp tables=test_lr,test_lr2,test_lr3 table_exists_action=replace

More references: http://blog.csdn.net/jojo52013145/article/details/7966047

IV. Practice: synchronizing a whole database's data and table definitions

1. Business Scenario Description

A company's development department maintains two Oracle databases: the development database empdb and the test database empdbtest. The data in the test database now needs to be synchronized to the development database, which is done here with Oracle Data Pump.

2. Database instance Information

2.1 empdb instance login information

2.2 empdbtest instance login information

3. Implementation steps

3.1 Create the backup directories

Creating a DIRECTORY object tells the database instance which operating-system directory it may use, and read/write permission on that directory is then granted to the database user. In effect, creating the directory registers it with the instance and makes it legitimate to use. Creating the backup directory is therefore a crucial step.

The process of creating the directory is as follows:

① Create a directory for the database instance empdbtest, as shown in the following illustration:

The execution statements are as follows (in order):

Ⅰ CONN ptemp/powersmart@empdbtest AS SYSDBA (log in as administrator)

Result: Connected

Ⅱ CREATE DIRECTORY bakupcs_dir AS 'D:\ptemp_bakup_CS' (create the directory)

Result: Directory created

Note: The folder D:\ptemp_bakup_CS must already exist on the D: drive; create it first if it does not.

Ⅲ GRANT READ, WRITE ON DIRECTORY bakupcs_dir TO ptemp

Result: Grant succeeded

② Create a directory for the database instance empdb in the same way.

The execution statements are as follows (in order):

Ⅰ CONN ptemp/000000@empdb AS SYSDBA

Ⅱ CREATE DIRECTORY bakupkf_dir AS 'D:\ptemp_bakup_KF'

Ⅲ GRANT READ, WRITE ON DIRECTORY bakupkf_dir TO ptemp

After the above operations are complete, the D: drive should contain the following two folders:

3.2 Back up the test database data

Back up the entire empdbtest database (data and table definitions) into the Ptemp_bakup_CS folder on the D: drive, using the backup file name ptemp830.dmp.

Procedure: Start > Run > cmd

Then execute the following statement in CMD:

EXPDP ptemp/powersmart@empdbtest directory=bakupcs_dir dumpfile=ptemp830.dmp

The execution process is as follows:

When the export succeeds, the ptemp830.dmp file is generated in the Ptemp_bakup_CS folder.

3.3 Synchronize the data to the development database

Copy the ptemp830.dmp file from the Ptemp_bakup_CS folder to the Ptemp_bakup_KF folder, then import it to synchronize the data.

Execute the statement as follows:

IMPDP ptemp/000000@empdb directory=bakupkf_dir dumpfile=ptemp830.dmp table_exists_action=replace

The execution process is as follows:

The import can take a while, roughly 5 to 10 minutes. If the screen appears to freeze during this time, that is normal; please be patient.
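After the import finishes, one quick sanity check, which is a suggestion rather than part of the original procedure, is to compare object counts between the two instances. Run the same query against empdbtest and empdb while connected as ptemp; the counts should match:

```sql
-- Counts the current user's objects by type; compare output across instances
SELECT object_type, COUNT(*) AS object_count
FROM   user_objects
GROUP  BY object_type
ORDER  BY object_type;
```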
