Use Data Pump to export remote data to a local database


The remote database user test has a table t:

    SQL> show user
    USER is "TEST"
    SQL> select * from tab;
    TNAME                          TABTYPE  CLUSTERID
    ------------------------------ ------- ----------
    T                              TABLE
    SQL> !cat /etc/hosts
    # Do not remove the following line, or various programs
    # that require network functionality will fail.
    127.0.0.1   linux localhost.localdomain localhost
    ::1         localhost6.localdomain6 localhost6
    10.10.10.7  linux localhost.localdomain localhost
The local database user ing does not have a table t:
    SQL> show user
    USER is "ING"
    SQL> select * from tab where tname = 'T';
    no rows selected
    SQL> !cat /etc/hosts
    # Do not remove the following line, or various programs
    # that require network functionality will fail.
    127.0.0.1   linux localhost.localdomain localhost
    ::1         localhost6.localdomain6 localhost6
    10.10.10.8  linux localhost.localdomain localhost
    10.10.10.8  linux
    10.10.10.8  localhost

Now we need to import all objects owned by the remote user test into the local user ing.

1. Create a database link that points to the database server at 10.10.10.7.

    SQL> conn system/oracle
    Connected.
    SQL> show user
    USER is "SYSTEM"
    SQL> create database link link7 connect to test identified by test using '10.10.10.7/orcl';
    Database link created.
    SQL> select count(*) from test.t@link7;
      COUNT(*)
    ----------
         50027

2. The user test on the remote database 10.10.10.7 must have the exp_full_database role.

    SQL> conn / as sysdba
    Connected.
    SQL> show user
    USER is "SYS"
    SQL> grant exp_full_database to test;
    Grant succeeded.
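
Optionally, you can confirm that the grant took effect by querying dba_role_privs on the remote database (exp_full_database is a role, not a system privilege); a quick check from a privileged session:

    SQL> -- roles granted to TEST; EXP_FULL_DATABASE should appear in the list
    SQL> select granted_role from dba_role_privs where grantee = 'TEST';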

3. Run the export on the 10.10.10.8 server.

    [oracle@linux exp]$ expdp system/oracle@orcl network_link=link7 directory=exp schemas=test
    Export: Release 10.2.0.4.0 - Production on Wednesday, October, 2011 19:48:57
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01":  system/********@orcl network_link=link7 directory=exp schemas=test
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 6 MB
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    . . exported "TEST"."T"                               4.667 MB   50027 rows
    Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    ******************************************************************************
    Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
      /u01/exp/expdat.dmp
    Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at 19:50:10
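
The export assumes that a directory object named exp already exists on the 10.10.10.8 database and points to /u01/exp (the path shown in the dump file listing above). If it does not exist yet, it can be created roughly like this from a privileged session; the explicit grant is usually unnecessary for SYSTEM but does no harm:

    SQL> create directory exp as '/u01/exp';             -- the OS path must exist and be writable by the oracle user
    SQL> grant read, write on directory exp to system;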

4. Run the import command on the 10.10.10.8 server.

    [oracle@linux exp]$ impdp system/oracle@orcl directory=exp dumpfile=expdat.dmp remap_schema=test:ing
    Import: Release 10.2.0.4.0 - Production on Wednesday, October, 2011 19:51:24
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/********@orcl directory=exp dumpfile=expdat.dmp remap_schema=test:ing
    Processing object type SCHEMA_EXPORT/USER
    ORA-31684: Object type USER:"ING" already exists    -- the existing user ing is not recreated here
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "ING"."T"                                4.667 MB   50027 rows
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 1 error(s) at 19:51:28
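
The only error is the ORA-31684 noted above, which is harmless here because the user ing already existed. Since no logfile parameter was given, Data Pump writes its log to the default import.log in the exp directory; scanning it is a quick way to make sure nothing else went wrong:

    [oracle@linux exp]$ grep ORA- /u01/exp/import.log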

5. Check the user ing on 10.10.10.8.

    SQL> conn ing/ing
    Connected.
    SQL> show user
    USER is "ING"
    SQL> select * from tab where tname = 'T';
    TNAME                          TABTYPE  CLUSTERID
    ------------------------------ ------- ----------
    T                              TABLE
    SQL> select count(*) from t;
      COUNT(*)
    ----------
         50027
Note: the user receiving the import needs quota on the tablespace where the data lands, for example: alter user ing quota unlimited on users; (the SYSTEM account running the import already has the necessary privileges). Exporting over a database link like this is fine for a small database, since all the data travels across the network. For a large database, it is better to export to a dump file on the source server, copy the file to the target server, and import it there, as sketched below.
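
For the file-based approach, the commands would look roughly like this; the directory object and file names are only placeholders, so adjust them to your environment:

    # On the source server (10.10.10.7): export the schema to a dump file
    [oracle@linux ~]$ expdp test/test directory=exp dumpfile=test_schema.dmp logfile=test_exp.log schemas=test

    # Copy the dump file to the target server (10.10.10.8)
    [oracle@linux ~]$ scp /u01/exp/test_schema.dmp oracle@10.10.10.8:/u01/exp/

    # On the target server: import, remapping the schema from test to ing
    [oracle@linux ~]$ impdp system/oracle directory=exp dumpfile=test_schema.dmp logfile=test_imp.log remap_schema=test:ing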
