Oracle's exp Export and imp Import Commands

Source: Internet
Author: User
Tags: create index

Oracle import and export commands in detail (with a daily scheduled database backup script)

One. Export tool exp

1. exp is an executable file under the operating system directory /oracle_home/bin. The exp export tool backs up the data in the database into a compressed binary file, which can be migrated between different operating systems.
   It has three modes:
   a. User mode: exports all of a user's objects and the data in those objects;
   b. Table mode: exports all of a user's tables, or specified tables;
   c. Entire database: exports all objects in the database.

2. Export tool exp, interactive command-line example:

$ exp test/password@dbname
Enter array fetch buffer size: 4096 >                       (press Return)
Export file: expdat.dmp > m.dmp                             (name of the generated export file)
(1)E(ntire database), (2)U(sers), or (3)T(ables): (2)U > 3
Export table data (yes/no): yes >                           (press Return)
Compress extents (yes/no): yes >                            (press Return)
Export done in ZHS16GBK character set and ZHS16GBK NCHAR character set
About to export specified tables via Conventional Path ...
Table(T) or Partition(T:P) to be exported: (RETURN to quit) > cmamenu    (table name to export)
. . exporting table                      CMAMENU       4336 rows exported
Table(T) or Partition(T:P) to be exported: (RETURN to quit) >            (next table name, if any)
Table(T) or Partition(T:P) to be exported: (RETURN to quit) >            (press Return)
Export terminated successfully without warnings.

3. Export tool exp, non-interactive command-line examples:

$ exp scott/tiger tables=emp,dept file=/directory/scott.dmp grants=y
Description: exports the two tables emp and dept of user scott to the file /directory/scott.dmp.

$ exp scott/tiger tables=emp query=\"where job=\'salesman\' and sal\<1600\" file=/directory/scott2.dmp
Description: exports emp with the query condition job='salesman' and sal<1600.
(I personally seldom use this; it is often handier to put the matching records into a temporary table first and then exp that table.)

$ exp parfile=username.par file=/directory1/username_1.dmp,/directory1/username_2.dmp filesize=2000m log=/directory2/username_exp.log

Parameter file username.par contents:
userid=username/userpassword
buffer=8192000
compress=n
grants=y

Description: username.par is a parameter file for the export tool exp; the parameters inside can be modified as needed. FILESIZE specifies the maximum number of bytes of each generated binary file (useful for getting around the 2 GB physical file limit on some operating systems, and it makes compressing and burning historical data to disc easier).
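As the note above suggests, an alternative to escaping a QUERY clause on the command line is to stage the matching rows in a temporary table and export that table instead. A minimal sketch of this approach (the table name emp_salesman and the file path are only illustrative):

create table emp_salesman as
  select * from emp where job = 'salesman' and sal < 1600;

$ exp scott/tiger tables=emp_salesman file=/directory/scott2.dmp

drop table emp_salesman;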
4. Command parameter description

Keyword        Description (default)
---------------------------------------------------
USERID         username/password
FULL           export the entire database (N)
BUFFER         size of the data buffer
OWNER          list of owner usernames
FILE           output file (EXPDAT.DMP)
TABLES         list of table names
COMPRESS       import into one extent (Y)
RECORDLENGTH   length of IO record
GRANTS         export grants (Y)
INCTYPE        incremental export type
INDEXES        export indexes (Y)
RECORD         track incremental export (Y)
ROWS           export data rows (Y)
PARFILE        parameter file name
CONSTRAINTS    export constraints (Y)
CONSISTENT     cross-table consistency
LOG            log file of screen output
STATISTICS     analyze objects (ESTIMATE)
DIRECT         direct path (N)
TRIGGERS       export triggers (Y)
FEEDBACK       display progress every x rows (0)
FILESIZE       maximum size of each dump file
QUERY          select clause used to export a subset of a table

The following keywords apply only to transportable tablespaces:
TRANSPORT_TABLESPACE   export transportable tablespace metadata (N)
TABLESPACES            list of tablespaces to transport

Two. Import tool imp

1. imp is an executable file under the operating system directory /oracle_home/bin. The imp import tool imports the binary files produced by exp back into the database.
   It has three modes:
   a. User mode: imports all of a user's objects and the data in those objects;
   b. Table mode: imports all of a user's tables, or specified tables;
   c. Entire database: imports all objects in the database. Only users with the imp_full_database role or DBA privileges can import an entire database.
   imp steps: (1) create tables (2) insert data (3) create indexes (4) create triggers and constraints

2. Import tool imp, interactive command-line example:

$ imp
Import: Release 8.1.6.0.0 - Production on Friday December 7 17:01:08 2001
(c) Copyright 1999 Oracle Corporation. All rights reserved.
Username: test
Password: ****
Connected to: Oracle8i Enterprise Edition Release 8.1.6.0.0 - 64bit Production
With the Partitioning option
JServer Release 8.1.6.0.0 - Production
Import file: expdat.dmp > /tmp/m.dmp
Enter insert buffer size (minimum is 8192) 30720 >
Export file created by EXPORT:V08.01.06 via conventional path
Warning: the objects were exported by TEST, not by the current user
Import done in ZHS16GBK character set and ZHS16GBK NCHAR character set
List contents of import file only (yes/no): no >
Ignore create errors because the objects already exist (yes/no): no > yes
Import grants (yes/no): yes >
Import table data (yes/no): yes >
Import entire export file (yes/no): no > yes
. importing TEST's objects into SCOTT
. . importing table                      "CMAMENU"       4336 rows imported
Import terminated successfully with warnings.

3. Import tool imp, non-interactive command-line examples:

$ imp system/manager fromuser=jones tables=(accts)
$ imp system/manager fromuser=scott tables=(emp,dept)
$ imp system/manager fromuser=scott touser=joe tables=emp
$ imp scott/tiger file=expdat.dmp full=y
$ imp scott/tiger file=/mnt1/t1.dmp show=n buffer=2048000 ignore=n commit=y grants=y full=y log=/oracle_backup/log/imp_scott.log
$ imp system/manager parfile=params.dat

params.dat contents:
file=dba.dmp
show=n
ignore=n
grants=y
fromuser=scott
tables=(dept,emp)
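Before running a real import, the SHOW keyword listed in the parameter table below can be used to preview what a dump file contains without writing anything to the database. A hedged example (the file and log names are only illustrative):

$ imp system/manager file=expdat.dmp full=y show=y log=show_contents.log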
4. Problems that may occur with the import tool imp

(1) Database objects already exist.
    In general, the target data tables, sequences, functions/procedures, triggers, etc. should be completely deleted before the data is imported.
    If a database object already exists, the import fails with the default imp parameters. If the parameter ignore=y is used, the data in the exp file is still imported: when the table has a unique key constraint, rows that violate it are not imported; when the table has no unique key constraint, records end up duplicated.

(2) The database objects have primary/foreign key constraints.
    Data that does not satisfy the primary/foreign key constraints fails to import.
    Solution: import the parent tables first and the dependent tables afterwards, or disable the primary/foreign key constraints of the target objects and enable them again after the data has been loaded (see the SQL sketch after the parameter list below).

(3) Permissions.
    To import user A's data into user B, user A needs the imp_full_database privilege.

(4) Importing a large table (greater than 80 MB) fails with a storage allocation error.
    The exp default is compress=y, which tries to compress all of the data into one initial extent. The import fails if there is no contiguous block of space that large. When exporting large tables above 80 MB, remember to use compress=n and this error will not occur.

(5) imp and exp use different character sets.
    If the character sets differ, the import fails. You can change the UNIX environment variable or the NT registry entry for NLS_LANG accordingly, and change it back after the import completes.

(6) The imp and exp versions are not compatible.
    imp can import files generated by a lower version of exp, but it cannot import files generated by a higher version of exp.

Depending on the situation we can use:
$ imp username/password@connect_string
Description: connect_string is the name of a local or remote database defined in /oracle_home/network/admin/tnsnames.ora.

5. Command parameter description

Keyword        Description (default)
----------------------------------------------
USERID         username/password
FULL           import the entire file (N)
BUFFER         size of the data buffer
FROMUSER       list of owner usernames
FILE           input file (EXPDAT.DMP)
TOUSER         list of usernames
SHOW           just list the file contents (N)
TABLES         list of table names
IGNORE         ignore create errors (N)
RECORDLENGTH   length of IO record
GRANTS         import grants (Y)
INCTYPE        incremental import type
INDEXES        import indexes (Y)
COMMIT         commit array insert (N)
ROWS           import data rows (Y)
PARFILE        parameter file name
LOG            log file of screen output
CONSTRAINTS    import constraints (Y)
DESTROY        overwrite tablespace data files (N)
INDEXFILE      write table/index info to the specified file
SKIP_UNUSABLE_INDEXES    skip maintenance of unusable indexes (N)
ANALYZE        execute ANALYZE statements in the dump file (Y)
FEEDBACK       display progress every x rows (0)
TOID_NOVALIDATE          skip validation of specified type ids
FILESIZE       maximum size of each dump file
RECALCULATE_STATISTICS   recalculate statistics (N)

The following keywords apply only to transportable tablespaces:
TRANSPORT_TABLESPACE   import transportable tablespace metadata (N)
TABLESPACES            tablespaces to be transported into the database
DATAFILES              datafiles to be transported into the database
TTS_OWNERS             users that own data in the transportable tablespace set
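For problem (2) above, one way to disable and later re-enable the foreign key constraints of the target schema is to generate the ALTER TABLE statements from the data dictionary. A rough sketch, assuming the generated statements are reviewed before being run as the importing user:

-- generate DISABLE statements for all foreign key (referential) constraints of the current user
select 'alter table ' || table_name || ' disable constraint ' || constraint_name || ';'
  from user_constraints
 where constraint_type = 'R';

-- after the import completes, generate the matching ENABLE statements
select 'alter table ' || table_name || ' enable constraint ' || constraint_name || ';'
  from user_constraints
 where constraint_type = 'R';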
Three. A scheduled Oracle database backup script under UNIX (per-user backup)

The following script can be run from crontab at 2 o'clock every night (a sample crontab entry is shown after the Windows examples below). It exports the database into at most 30 files of up to 1 GB each; if the database is large, more than 30 GB, the export will fail, in which case simply adjust the value of the variable num to match the data volume. The script exports the database, compresses the result with gzip and saves it under /data/expfiles; before exporting, the previous day's backup is moved to /data/expfiles_bak, so this method keeps the last two days of backups.

outfile=`date +%Y%m%d_%H%M`
num=30
i=1
files=
if [ ! -d /data/expfiles ]; then
  mkdir /data/expfiles
fi
if [ ! -d /data/expfiles_bak ]; then
  mkdir /data/expfiles_bak
fi
# drop the backup from two days ago
dfile=`ls -1 /data/expfiles_bak/ | awk '{if (NR < 2) {print $0}}'`
prefix=`echo | awk '{print substr("'"${dfile}"'",1,13)}'`
rm -f /data/expfiles_bak/${prefix}*
# move yesterday's backup aside
dfile=`ls -1 /data/expfiles/ | awk '{if (NR < 2) {print $0}}'`
prefix=`echo | awk '{print substr("'"${dfile}"'",1,13)}'`
mv /data/expfiles/${prefix}* /data/expfiles_bak
# build the comma-separated list of dump file names
while [ $i -lt $num ]
do
  files=$files/data/expfiles/${outfile}_$i.dmp,
  i=`expr $i + 1`
done
files=$files/data/expfiles/${outfile}_$i.dmp
#echo $files
# replace user/password@dbname with the real credentials and TNS name
exp userid=user/password@dbname file=$files filesize=1024m grants=n 2>>exp_rpt.log
gzip /data/expfiles/${outfile}*

1. For the Oracle database export command, refer to the following:

exp system/manager@TEST file=d:\mb.dmp owner=(scott)

-------------------------------------------
Data export:

1. Fully export the database TEST (username system, password manager) to D:\daochu.dmp:
   exp system/manager@TEST file=d:\daochu.dmp full=y
2. Export the tables of the system user and the sys user in the database:
   exp system/manager@TEST file=d:\daochu.dmp owner=(system,sys)
3. Export the tables table1 and table2 in the database:
   exp system/manager@TEST file=d:\daochu.dmp tables=(table1,table2)
4. Export the rows of table table1 whose field filed1 starts with "00":
   exp system/manager@TEST file=d:\daochu.dmp tables=(table1) query=\"where filed1 like '00%'\"

The above covers the common exports. As for compression I do not bother: WinZip compresses the dmp file very well. But you can add compress=y after the commands above.

Data import:

1. Import the data from D:\daochu.dmp into the TEST database:
   imp system/manager@TEST file=d:\daochu.dmp ignore=y
   The above may report errors because some tables already exist, and those tables are then not imported; adding ignore=y to the end takes care of that.
   imp aichannel/password@dbname full=y file=d:\datanewsmgnt.dmp ignore=y
   imp aichannel/password@dbname full=y file=e:\20150714-1.dmp ignore=y
2. Import the table table1 from d:\daochu.dmp:
   imp aichannel/password@dbname file=d:\daochu.dmp tables=(table1)

Basically the import and export commands above are enough. In many cases I simply drop the table completely and then import it again.

--------- TFX Case One: import and export of the 9i database aichannel on 9.3
1. Run >> cmd
2. D:
3. cd D:\oracle\ora92\bin
4. exp aichannel/password@dbname file=e:\20150714-1.dmp full=y
5. imp aichannel/password@dbname full=y file=e:\20150714-1.dmp ignore=y
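Referring back to the backup script in section three: if the script is saved as, say, /home/oracle/exp_backup.sh (an assumed path), a crontab entry to run it every night at 2:00 might look like this:

# run the export backup script at 02:00 every day (script and log paths are assumptions)
0 2 * * * /home/oracle/exp_backup.sh >> /home/oracle/exp_backup.out 2>&1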

-- Single table import

imp system/manager@dbname file=e:\0825.dmp tables=(acd_code,acd_codetype) ignore=y buffer=5400000
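After a single-table import like the one above, a quick sanity check is to count the rows in the imported tables and compare the numbers with those reported in the imp log (the table names are taken from the command above):

select count(*) from acd_code;
select count(*) from acd_codetype;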

--------- TFX Note: the operator must have sufficient permissions, otherwise an "insufficient privileges" message is shown. The database must also be reachable; you can use tnsping TEST to check whether the database TEST can be connected.

You can start the import by entering the imp command followed by your username/password:
Example: imp scott/tiger
Alternatively, you can control the import with various parameters after the imp command. To specify parameters, use keywords:
Format: imp keyword=value or keyword=(value1,value2,...,valuen)
Example: imp scott/tiger ignore=y tables=(emp,dept) full=n
or tables=(T1:P1,T1:P2) if T1 is a partitioned table.

Appendix One: granting a user the permissions needed to import data

1. Start SQL*Plus.
2. Log in as system/manager.
3. create user username identified by password; (this step can be omitted if the user already exists)
4. grant create user, drop user, alter user, create any view, drop any view, exp_full_database, imp_full_database, dba, connect, resource, create session to username;
5. Run cmd, change to the directory where the dmp file is located, and run:
   imp userid=system/manager full=y file=*.dmp
   or imp userid=system/manager full=y file=filename.dmp

Execution example:
F:\Work\oracle_databackup> imp userid=test/test full=y file=inner_notify.dmp

Screen output:
Import: Release 8.1.7.0.0 - Production on Thu February 16:50:05 2006
(c) Copyright Oracle Corporation. All rights reserved.
Connected to: Oracle8i Enterprise Edition Release 8.1.7.0.0 - Production
With the Partitioning option
JServer Release 8.1.7.0.0 - Production
Export file created by EXPORT:V08.01.07 via conventional path
Import done in ZHS16GBK character set and ZHS16GBK NCHAR character set
Export server uses UTF8 NCHAR character set (possible ncharset conversion)
. importing AICHANNEL's objects into AICHANNEL
. . importing table                   "INNER_NOTIFY"          4 rows imported
About to enable constraints ...
Import terminated successfully with warnings.

Appendix Two: Oracle does not allow the owner of a table to be changed directly; export/import can be used to achieve this.

First create import9.par, then use the command:
imp parfile=/filepath/import9.par

An example import9.par looks like this:
fromuser=tgpms
touser=tgpms2        (note: this changes the owner of the tables from fromuser to touser; fromuser and touser can be different users)
rows=y
indexes=y
grants=y
constraints=y
buffer=409600
file=/backup/ctgpc_20030623.dmp
log=/backup/import_20030623.log
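Appendix Two only shows the import side of the owner change. The matching export, which would have produced /backup/ctgpc_20030623.dmp from the TGPMS schema, might look roughly like the following (the password and log path are assumptions):

$ exp tgpms/password owner=tgpms file=/backup/ctgpc_20030623.dmp log=/backup/export_20030623.log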
