Import and export of Oracle databases


Examples of common commands:

Export:

expdp acl/password@orcl directory=dir schemas=acl dumpfile=acl.dmp logfile=acl.log

Import:

impdp acl/password@orcl directory=dir schemas=acl dumpfile=acl.dmp logfile=acl.log

Note: do not put a semicolon at the end of an expdp/impdp command.


Oracle data import and export: imp/exp and expdp/impdp

When we need to import or export Oracle database data, we can simply wrap the Data Pump commands in a batch script to make the operation easy: a .bat file on Windows, or a .sh file on Linux, containing the desired commands. Note, however, that expdp/impdp can only be run on the server where the database resides.

exp/imp has long been useful, but its main drawback is that it is slow. Starting with Oracle 10g, a new tool called Data Pump (expdp/impdp) provides high-speed, parallel import and export for large data migrations.

imp/exp can be invoked from a client machine, but expdp/impdp runs only on the server, because a directory object must be created in the database before expdp/impdp can be used.
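The batch-script idea above can be sketched as a small shell wrapper that assembles a date-stamped expdp command line. Everything here (SCHEMA, DIR_OBJECT, the connect string) is a placeholder, not a value from any real environment, and the command is only echoed rather than executed:

```shell
#!/bin/sh
# Hypothetical wrapper around expdp. SCHEMA, DIR_OBJECT and CONNECT are
# placeholders for your own environment; the assembled command is only
# echoed here -- replace the final echo with an actual invocation to run it.
SCHEMA=scott
DIR_OBJECT=dir
CONNECT="scott/password@orcl"
STAMP=$(date +%Y%m%d)   # date-stamp the dump and log file names

CMD="expdp $CONNECT directory=$DIR_OBJECT schemas=$SCHEMA dumpfile=${SCHEMA}_${STAMP}.dmp logfile=${SCHEMA}_${STAMP}.log"

echo "$CMD"
```

Scheduling this script with cron (or Windows Task Scheduler for the .bat variant) gives a simple nightly export, with each run writing to a distinct dump file.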

expdp/impdp

First, directory objects:

1. View the existing directory objects (also check that the path exists on the operating system: Oracle does not verify the path when the directory object is created, and an error occurs later if it does not exist):
SELECT * FROM dba_directories;

2. Create the database directory (if you do not want to use an existing directory, or the directory you want does not exist yet). Note: creating directory objects generally requires the SYS user or a similarly privileged account.

CREATE DIRECTORY dir AS 'D:\dump';

The directory name (dir) can be anything; the operating-system path itself must be created manually. In general: CREATE DIRECTORY alias AS 'd:\<server directory name>'; the files to be imported or exported are placed in that server directory.

Note: create the directory before importing, and place the files you want to import in it. The log file generated on export is also written there; otherwise an error occurs.

3. Grant the user permission to operate on the specified directory, preferably from an administrator account such as SYSTEM.

SELECT privilege, table_name FROM dba_tab_privs WHERE grantee = 'SCOTT'; -- query existing privileges

GRANT CONNECT, RESOURCE TO scott;

GRANT READ, WRITE ON DIRECTORY dir TO scott; -- at a minimum, grant this

GRANT exp_full_database, imp_full_database TO scott;

4. Drop a custom directory

DROP DIRECTORY dir;

Second, export:

1. Export parameters:

expdp: the export command.

userid: e.g. ktrade/password@orcl -- user and database instance authentication.

directory: directory=dump specifies the directory object where the dump and log files are written. The directory object is one created with the CREATE DIRECTORY statement, not an OS path. Note: this parameter should be placed near the front of the command line; placing it later may raise ORA-39002: invalid operation.

schemas: schemas=ktrade -- export all objects belonging to a schema.

dumpfile: specifies the name of the dump file; the default is expdat.dmp.
dumpfile=[directory_object:]file_name [, ...]
directory_object names a directory object and file_name names the dump file. If directory_object is omitted, export uses the directory object given by the directory option.

logfile: specifies the name of the export log file; the default is export.log.
logfile=[directory_object:]file_name
directory_object names a directory object and file_name names the log file. If directory_object is omitted, export uses the value of the directory option.

content: content=metadata_only exports table structure only; content=data_only exports table data only; content=all exports structure and data. If content is not specified, all is the default.

tables: tables=tablename1,tablename2 -- if not given, the export mode defaults to everything in scope.

full: full=y -- export the entire database; requires a highly privileged user such as SYS or SYSTEM.

parallel: parallel=n -- degree of parallelism, set according to the number of CPUs.

tablespaces: tablespaces=... -- export all data in the named tablespaces.

transport_tablespaces: transport_tablespaces=... -- export tablespace definitions for tablespace transport.

parfile: parfile=... -- load parameters from a file.

query: query="WHERE id=20" -- export rows matching a condition; used when exporting by table.
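The parfile option mentioned above lets these parameters live in a file instead of on the command line, which avoids shell-quoting problems. A minimal sketch of such a file (the schema, directory, and file names are placeholders):

```
# exp_scott.par -- hypothetical Data Pump parameter file
directory=dir
schemas=scott
dumpfile=expdp.dmp
logfile=expdp.log
content=all
```

It would then be referenced as: expdp scott/password@orcl parfile=exp_scott.par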

filesize: specifies the maximum size of each exported file; the default is 0 (no limit on file size).

job_name: specifies the name of the export job; the default is of the form sys_export_schema_01.
job_name=jobname_string
SELECT * FROM dba_datapump_jobs; -- view existing jobs

2. Examples:

The examples below assume SID=ORCL and that the account used for export and import is test.

To export data from SFZ:

expdp test/password@orcl directory=<alias> dumpfile=<export file name>
1) Export by user

expdp scott/password@orcl directory=dir dumpfile=expdp.dmp schemas=scott logfile=expdp.log

2) Parallel export (parallel)

expdp scott/password@orcl directory=dir dumpfile=scott3.dmp parallel=40 job_name=scott3 logfile=expdp.log

3) Export by table name

expdp scott/password@orcl directory=dir dumpfile=expdp.dmp tables=emp,dept logfile=expdp.log

4) Export by query condition

expdp scott/password@orcl directory=dir dumpfile=expdp.dmp tables=emp query='WHERE deptno=20'

5) Export by tablespace

expdp system/manager directory=dir dumpfile=expdp.dmp tablespaces=temp,example

6) Export the entire database

expdp system/manager directory=dir dumpfile=full.dmp full=y

Third, import:

1. Import parameters:

impdp: the import command.

remap_schema:

Loads all objects of the source schema into the target schema (when the two schema names differ).
remap_schema=source_schema:target_schema

remap_tablespace:

Imports all objects from the source tablespace into the target tablespace (when the two tablespace names differ).
remap_tablespace=source_tablespace:target_tablespace

remap_datafile:

Converts source data file names to target data file names; may be needed when moving tablespaces between platforms.
remap_datafile=source_datafile:target_datafile

2. Examples:
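The remap options described above are often combined in a single import. A sketch as a parameter file, reusing the mapping names from this article (all schema/tablespace/file names are placeholders for your own environment):

```
# imp_remap.par -- hypothetical Data Pump parameter file
directory=dir
dumpfile=expdp.dmp
logfile=impdp.log
remap_schema=test:test1
remap_tablespace=oauth2:koauth2
```

It would then be referenced as: impdp system/manager parfile=imp_remap.par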

Note: if the target database is a new database, you first need to create the tablespace and the user, for example:

DROP TABLESPACE koauth24 INCLUDING CONTENTS AND DATAFILES;
CREATE TABLESPACE koauth24 DATAFILE 'F:\app\Administrator\product\4.0data\koauth24' SIZE 100M AUTOEXTEND ON NEXT 100M;
DROP USER koauth24 CASCADE;
CREATE USER koauth24 IDENTIFIED BY kingddom88 DEFAULT TABLESPACE koauth24;
GRANT DBA, RESOURCE, CONNECT TO koauth24;
GRANT SELECT ANY TABLE TO koauth24;
GRANT UPDATE ANY TABLE TO koauth24;
GRANT INSERT ANY TABLE TO koauth24;

1) Import to the specified user

impdp scott/password@orcl directory=dir dumpfile=expdp.dmp schemas=scott logfile=impdp.log

2) Change the owner of the table

impdp system/manager directory=dir dumpfile=expdp.dmp tables=scott.dept remap_schema=scott:system

3) Import tablespaces

impdp system/manager directory=dir dumpfile=tablespace.dmp tablespaces=example

4) Import the entire database

impdp system/manager directory=dump_dir dumpfile=full.dmp full=y

5) Append data

impdp system/manager directory=dir dumpfile=expdp.dmp schemas=system table_exists_action=append

6) When the user name, tablespace name, etc. differ between source and target, use the remap options:

remap_schema=test:test1 (export name : import name)

remap_tablespace=oauth2:koauth2

Finally, check whether all the tables are present in the database; anything missing can be imported again.

imp/exp

First, data export:

1. Full export of database TEST (user name system, password manager) to D:\daochu.dmp

exp system/manager@TEST file=d:\daochu.dmp full=y

2. Export the tables owned by the SYSTEM and SYS users

exp system/manager@TEST file=d:\daochu.dmp owner=(system,sys)

3. Export the tables table1 and table2

exp system/manager@TEST file=d:\daochu.dmp tables=(table1,table2)

4. Export the rows of table table1 whose field filed1 begins with "00"

exp system/manager@TEST file=d:\daochu.dmp tables=(table1) query=\"where filed1 like '00%'\"

The above covers the common exports. As for compression, I don't bother: WinZip compresses DMP files very well. Alternatively, add compress=y to the commands above.

Second, data import:

1. Import the data in D:\daochu.dmp into the TEST database.

imp system/manager@TEST file=d:\daochu.dmp

A problem may occur here: some tables already exist, an error is reported, and those tables are not imported. Add ignore=y at the end to skip the creation errors and continue.

2. Import the table table1 from D:\daochu.dmp

imp system/manager@TEST file=d:\daochu.dmp tables=(table1)

Note: the import and export commands above are basically sufficient. In many cases I drop the table completely and then import it.

Data import between users with the same name:

imp hkb/password@TEST file=c:\orabackup\hkbfull.dmp log=c:\orabackup\hkbimp.log full=y

Data import between users with different names:

imp system/manager@TEST fromuser=hkb touser=hkb_new file=c:\orabackup\hkbfull.dmp log=c:\orabackup\hkbimp.log
