I. Getting help
exp help=y
imp help=y
II. Data export
1. Export the entire database: set the full option
exp system/manager@orcl file=d:\db.dmp full=y
2. Export the database structure only, without data: set the rows option
exp system/manager@orcl file=d:\db.dmp rows=n full=y
3. When the export volume is large, split it across multiple files: set the filesize option
exp system/manager@orcl file=d:\db1.dmp,d:\db2.dmp filesize=50m full=y
4. Export the tables owned by the SYSTEM and SYS users: set the owner option
exp system/manager@orcl file=d:\test_bak.dmp owner=(system,sys)
5. Export the tables t_result and t_khtime: set the tables option
exp system/manager@orcl file=d:\test_bak.dmp tables=(t_result,t_khtime)
6. Export only the rows of t_score_result whose updatedate column is greater than a given value: set the query option
exp kpuser/password@orcl file=d:\test_bak.dmp tables=(t_score_result) query=\"where updatedate > to_date('2016-9-1 18:32:00','yyyy-mm-dd hh24:mi:ss')\"
III. Data import
1. Import a dmp file into a schema where some tables already exist: set the ignore option (without it, imp reports an error and skips each existing table)
imp system/manager@orcl file=d:\test_bak.dmp ignore=y
2. Import only specified tables from the dmp file: set the tables option
imp kpuser/password@orcl file=d:\kpuser.dmp tables=(t_score_result)
3. Import all tables, indexes, and other objects belonging to specified users: set the fromuser option
imp system/manager@orcl file=d:\kpuser.dmp fromuser=kpuser  // kpuser must exist
imp system/manager@orcl file=d:\users.dmp fromuser=(kpuser,kpuser1,test)  // the kpuser, kpuser1, and test users must exist
4. Import data into one or more specified users: set the fromuser and touser options
imp system/manager file=d:\kpuser.dmp fromuser=kpuser touser=kpuser1  // kpuser1 must exist
imp system/manager file=d:\users.dmp fromuser=(kpuser,kpuser1) touser=(kpuser2,kpuser3)  // kpuser2 and kpuser3 must exist
IV. Optimizing exp/imp
When the volume of data to import or export is large, the process can take a long time. The following techniques speed up exp/imp.
1. exp
1) Use the direct path (direct=y): Oracle bypasses the SQL statement processing engine, reads data directly from the database files, and writes it to the export file.
2) If the direct path is not used, make sure the buffer parameter is set large enough. Example commands for both follow.
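A minimal sketch of both approaches (the recordlength and buffer values are assumptions; note that direct=y cannot be combined with the query parameter):
exp system/manager@orcl file=d:\db.dmp full=y direct=y recordlength=65535
exp system/manager@orcl file=d:\db.dmp full=y buffer=10240000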
2. imp
1) Avoid disk sorts
Set sort_area_size to a larger value, such as 100M.
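For example, a sketch in SQL*Plus (the 104857600-byte value is 100M, following the text; sort_area_size is only session-modifiable, so the system-wide form needs the DEFERRED clause):
alter session set sort_area_size=104857600;   -- current session only
alter system set sort_area_size=104857600 deferred;   -- all new sessions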
2) Avoid waits on log switches
Add more redo log groups and increase the log file size.
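For example, a sketch (the group number, file path, and 200M size are assumptions):
alter database add logfile group 4 ('d:\oracle\oradata\orcl\redo04.log') size 200m;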
3) Tune the log buffer
For example, increase log_buffer to 10 times its current size (but no more than 5M).
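A sketch (5242880 bytes = 5M; log_buffer is a static parameter, so this assumes an SPFILE and requires an instance restart):
alter system set log_buffer=5242880 scope=spfile;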
4) Use array inserts and commits
commit=y
Note: array inserts cannot handle tables that contain LOB or LONG columns; for such tables, commit=y causes a commit after every inserted row.
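For example, a sketch (the buffer value is an assumption; buffer sets the size of the insert array used with commit=y):
imp system/manager@orcl file=d:\db.dmp full=y commit=y buffer=10240000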
5) Use nologging to reduce the redo log volume
Specify indexes=n on import so that only the data is loaded and index creation is skipped; once the data is in, create the indexes from a script, specifying the nologging option (see the sketch below).
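A sketch using imp's indexfile parameter, which writes the index DDL to a script instead of importing anything (the file names are assumptions):
imp system/manager@orcl file=d:\db.dmp full=y indexes=n
imp system/manager@orcl file=d:\db.dmp full=y indexfile=d:\create_indexes.sql
Then edit d:\create_indexes.sql to append nologging to each CREATE INDEX statement, and run the script in SQL*Plus once the data load finishes.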
V. Notes
1. The source user (fromuser) is the user recorded in the dmp file, not a user in the current database. The target user (touser) is the user into which the data is imported; it must exist in the current database, so create it first if it does not. If only fromuser is specified and touser is omitted, the data is imported by default into the database user with the same name as fromuser, which therefore must already exist in the database.
2. When importing an entire dmp file (full=y), it is best to drop the corresponding tables in the target database before importing.
3. The ignore option
When Oracle restores a table that already exists in the database, the ignore setting determines what happens:
1) With ignore=y, Oracle skips the CREATE TABLE statement and inserts the data directly into the existing table. Records that violate a constraint, such as a primary key constraint, are not inserted, but all valid records are added to the table.
2) With ignore=n, Oracle also skips the CREATE TABLE statement but inserts no data; it reports the "table already exists" error and moves on to the next table.
4. The indexes option
With indexes=n, the indexes on a table are not restored during import; however, the unique index backing the primary key is restored unconditionally, to guarantee data integrity.
5. Character sets during export/import
When importing and exporting data, pay attention to character set issues. For a single-byte character set (for example, US7ASCII), the data is automatically converted to the session character set (the NLS_LANG parameter) during restore. For multibyte character sets (for example, ZHS16CGB231280), try to keep the character sets identical so no conversion occurs; if conversion is unavoidable, the target database's character set should be a superset of the source database's character set.
Four character set settings matter during exp/imp: the client character set on the export side, the export database character set, the client character set on the import side, and the import database character set. When exporting, it is usually best to set the client character set the same as the database character set. When importing, there are two main cases:
1) The source and target databases have the same character set
Simply set the client NLS_LANG on both the export and import sides equal to the database character set.
2) The source and target databases have different character sets
Set the export client's NLS_LANG to the export database's character set and export the data; then set the import client's NLS_LANG to the same value as the export side and import the data. This way the conversion happens only on the target database side, and only once.
If the imp and exp character sets differ, the import fails; you can change the NLS_LANG setting in the UNIX environment variables or the NT registry and change it back after the import completes. For example:
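A sketch of setting NLS_LANG for the session running imp (the value shown is only an example and must match the export side):
Windows: set NLS_LANG=SIMPLIFIED CHINESE_CHINA.ZHS16CGB231280
UNIX (sh): export NLS_LANG="SIMPLIFIED CHINESE_CHINA.ZHS16CGB231280"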
6. Database objects with primary/foreign key constraints
When a foreign key constraint is not satisfied, the data import fails. Workarounds:
1) Import the primary table (parent table) first, then the dependent table (child table).
2) Disable the foreign key constraints on the target objects, import the data, and then re-enable them, as in the sketch below.
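A sketch with assumed table and constraint names (t_child, fk_child_parent):
alter table t_child disable constraint fk_child_parent;
-- run imp here
alter table t_child enable constraint fk_child_parent;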
7. exp/imp can be used across versions, but the exp and imp versions must be chosen correctly:
1) Always use the imp version that matches the target database; for example, to import into 8.1.7, use the 8.1.7 imp tool.
2) Always use the exp version that matches the lower of the two database versions; for example, to import from 9.2.0.1 into 8.1.7, use the 8.1.7 exp tool.
8. Export and import privileges
Full exports and imports require the EXP_FULL_DATABASE and IMP_FULL_DATABASE roles.
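For example (run as a DBA user; granting to kpuser is an assumption):
grant exp_full_database, imp_full_database to kpuser;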
9. Storage allocation failures when importing large tables (over 80M)
By default exp uses compress=y, which consolidates all of a table's data into a single initial extent.
On import, if no contiguous extent of that size is available, the import fails. When exporting tables larger than 80M, remember that compress=n avoids this error. For example:
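A sketch (the table name t_big is an assumption):
exp system/manager@orcl file=d:\big_bak.dmp tables=(t_big) compress=n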