Keyword description (default)
------------------------------------------------------------------------------
Attach connects to an existing job, for example, attach [=job name].
Content specifies the data to be loaded. The valid keywords are:
(all), data_only, and metadata_only.
Directory is the directory object to be used for dump files, log files, and SQL files.
Dumpfile is the list of dump files to import from (expdat.dmp),
for example, dumpfile=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
Encryption_password is the password key used to access encrypted column data.
This parameter is not valid for network import jobs.
Estimate calculates job estimates. The valid keywords are:
(blocks) and statistics.
Exclude is used to exclude specific object types, such as exclude=table:EMP.
Flashback_scn is the SCN used to set the session snapshot back to.
Flashback_time is the time used to find the SCN closest to the specified time.
Full imports all objects from the source (y).
Help displays the help message (n).
Include includes specific object types, such as include=table_data.
Job_name is the name of the import job to be created.
Logfile is the log file name (import.log).
Network_link is the name of the remote database link to the source system.
Nologfile does not write log files.
Parallel changes the number of active workers of the current job.
Parfile specifies the parameter file.
Query is the predicate clause used to import a subset of a table.
Remap_datafile redefines data file references in all DDL statements.
Remap_schema loads objects from one schema into another.
Remap_tablespace remaps a tablespace object to another tablespace.
Reuse_datafiles initializes the tablespace if it already exists (n).
Schemas is the list of schemas to import.
Skip_unusable_indexes skips indexes that were set to the Index Unusable state.
Sqlfile writes all SQL DDL statements to the specified file.
Status is the frequency (in seconds) at which job status is monitored;
the default value (0) shows new status as soon as it is available.
Streams_configuration enables the loading of Streams metadata.
Table_exists_action is the action taken when an imported object already exists.
Valid keywords: (skip), append, replace, and truncate.
Tables identifies the list of tables to be imported.
Tablespaces identifies the list of tablespaces to be imported.
Transform is the metadata transform to apply to applicable objects.
Valid transform keywords: segment_attributes, storage,
oid, and pctspace.
Transport_datafiles is the list of data files to be imported in transportable mode.
Transport_full_check verifies the storage segments of all tables (n).
Transport_tablespaces is the list of tablespaces from which metadata will be loaded.
Valid only for network_link mode import operations.
Version is the version of the objects to be exported. The valid keywords are:
(compatible), latest, or any valid database version.
Valid only for network_link and sqlfile.
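
A typical invocation combines several of these keywords on one command line.
The following is a minimal sketch: the directory object dpump_dir, the dump
file expdat.dmp, and the schemas scott and scott_dev are placeholders, so
substitute your own names (shell line continuations shown with \):

    impdp system/password directory=dpump_dir dumpfile=expdat.dmp \
        logfile=import.log schemas=scott \
        remap_schema=scott:scott_dev table_exists_action=replace

Here remap_schema loads scott's objects into scott_dev, and
table_exists_action=replace re-creates any table that already exists
in the target.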
The following commands are valid in interactive mode.
Note: abbreviations are allowed.
Command description (default)
------------------------------------------------------------------------------
Continue_client returns to logging mode. If the job is idle, it is restarted.
Exit_client exits the client session and leaves the job running.
Help summarizes interactive commands.
Kill_job detaches from and deletes the job.
Parallel changes the number of active workers of the current job.
Parallel=<number of workers>.
Start_job starts/resumes the current job.
Start_job=skip_current starts the job after skipping
any action that was in progress when the job was stopped.
Status is the frequency (in seconds) at which job status is monitored;
the default value (0) shows new status as soon as it is available.
Status[=interval].
Stop_job performs an orderly shutdown of job execution and exits the client.
Stop_job=immediate performs an immediate shutdown of the
Data Pump job.
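
To see these commands in context, here is a minimal sketch of an interactive
session. It assumes a running import job named SYS_IMPORT_FULL_01 (Data Pump
generates names of this form by default; yours may differ). Pressing Ctrl+C in
a running client, or attaching from another session, drops you to the Import>
prompt:

    impdp system/password attach=SYS_IMPORT_FULL_01

    Import> status
    Import> parallel=4
    Import> stop_job=immediate

Status reports the job's progress, parallel=4 raises the worker count, and
stop_job=immediate shuts the job down at once; a stopped job can be resumed
later with start_job after re-attaching.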