SQL*Loader can easily import data stored in text format into an Oracle database, and it is a convenient, commonly used tool for migrating data between different databases.
Its drawbacks are that it is relatively slow, and that BLOB and similar data types are somewhat troublesome to handle.
Usage: sqlldr keyword=value [, keyword=value, ...]
Valid keywords:
userid -- Oracle username/password
control -- control file
log -- log file
bad -- bad data file
data -- data file
discard -- discard file
discardmax -- maximum number of discards allowed (default: all)
skip -- number of logical records to skip (default: 0)
load -- number of logical records to load (default: all)
errors -- number of error records allowed (default: 50)
rows -- number of rows in a conventional-path bind array, or between direct-path data saves
(i.e., the number of records committed per batch; default: 64 for conventional path, all for direct path)
bindsize -- size of the conventional-path bind array, in bytes
(i.e., the buffer size for each batch of committed records; default: 256000)
silent -- suppress messages (header, feedback, errors, discards, partitions)
direct -- use direct path load (default: FALSE)
parfile -- parameter file: name of a file containing parameter specifications
parallel -- perform a parallel load (default: FALSE)
file -- file to allocate extents from
rows and bindsize work together. sqlldr first computes the length of a single record and multiplies it by rows: if the result is smaller than bindsize, the bind array is not expanded beyond what rows requires; if it exceeds bindsize, bindsize prevails and the number of rows per commit is reduced to fit.
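For example, a conventional-path load that commits in larger batches might be invoked like this (the user, control file, and log file names are illustrative):

```
sqlldr userid=scott/tiger control=emp.ctl log=emp.log rows=1000 bindsize=1024000
```

With a 100-byte record, 1000 rows need only about 100,000 bytes, well within this bindsize; with a 2000-byte record, 1000 rows would need 2,000,000 bytes, so bindsize would prevail and only about 512 rows would be buffered per commit.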
external_table -- use external table for load; NOT_USED, GENERATE_ONLY, EXECUTE (default: NOT_USED)
columnarrayrows -- number of rows in the direct-path column array (default: 5000)
streamsize -- size of the direct-path stream buffer, in bytes (default: 256000)
multithreading -- use multithreading in direct path
resumable -- enable or disable resumable for the current session (default: FALSE)
resumable_name -- text string to help identify a resumable statement
resumable_timeout -- wait time (in seconds) for RESUMABLE (default: 7200)
date_cache -- size (in entries) of the date conversion cache (default: 1000)
Note: Command-line parameters may be specified either by position or by keyword. An example of the former: 'sqlldr scott/tiger foo';
an example of the latter: 'sqlldr control=foo userid=scott/tiger'.
When mixing the two styles, positional parameters may appear before, but not after, parameters specified by keyword.
For example, 'sqlldr scott/tiger control=foo logfile=log' is allowed,
but 'sqlldr scott/tiger control=foo log' is not, even though the position of the parameter 'log' is correct.
For clarity, it is best to specify all command-line parameters by keyword.
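Since long command lines are error-prone, the parfile keyword listed above lets you collect the parameters in a file instead. A minimal sketch (the file name emp.par and all values in it are illustrative):

```
userid=scott/tiger
control=emp.ctl
log=emp.log
errors=100
rows=1000
```

The load is then started with just 'sqlldr parfile=emp.par'.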
Control file:
A script file of control commands, usually with the extension .ctl. Its content looks like this:
LOAD DATA
INFILE 't.dat' -- the file to import
-- INFILE 'tt.dat' -- multiple data files may be listed
-- INFILE * -- means the data to import follows BEGINDATA in the control file itself
INTO TABLE table_name -- the table to load
BADFILE 'c:\bad.txt' -- (optional) location of the bad file; by default a .bad file with the same base name as the data file is created in the current directory
* ************ The four ways to load a table:
APPEND -- append the data to the end of the existing table.
INSERT -- load into an empty table; if the table already contains data, SQL*Loader stops. This is the default.
REPLACE -- replace the table's contents; existing rows are deleted first.
TRUNCATE -- same effect as REPLACE, except that a TRUNCATE statement is used to remove the existing data.
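Putting the pieces so far together, a minimal control file that appends comma-separated records to a table might look like this (the table emp, its columns, and the file names are illustrative):

```
LOAD DATA
INFILE 'emp.dat'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ','
(empno, ename, sal)
```

Saved as emp.ctl, it would be run with 'sqlldr userid=scott/tiger control=emp.ctl'.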
* ************ Specifying the delimiter:
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
-- TERMINATED BY WHITESPACE -- fields separated by blanks
TRAILING NULLCOLS -- when a record has no value for a trailing column, that column is loaded as NULL.
* ************ The table's field list:
(
col_1, col_2, col_filler FILLER -- a column marked FILLER is read but not loaded
-- e.g., for the record 'lg,lg,not', only 'lg' and 'lg' are loaded; 'not' is discarded
)
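A self-contained sketch of the FILLER behavior, with the data embedded in the control file (the table name t is illustrative):

```
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE t
FIELDS TERMINATED BY ','
(col_1, col_2, col_filler FILLER)
BEGINDATA
lg,lg,not
```

Only col_1 and col_2 receive values ('lg' and 'lg'); the third field is parsed but never loaded.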
If FIELDS TERMINATED BY ',' is not declared, the same effect can be achieved in the following two ways:
1. Specify a terminator for each column:
(
col_1 [INTEGER EXTERNAL] TERMINATED BY ',',
col_2 [DATE "dd-mon-yyyy"] TERMINATED BY ',',
col_3 [CHAR] TERMINATED BY ',' OPTIONALLY ENCLOSED BY 'lg'
)
2. Use positions to tell SQL*Loader where each field's data is:
(
col_1 POSITION(1:2),
col_2 POSITION(3:10),
col_3 POSITION(*:16), -- this field starts where the previous one ended, and ends at byte 16
col_4 POSITION(1:16),
col_5 POSITION(*) CHAR(8) -- the field type can also be specified
)
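As an illustration of position-based loading, the field list below would parse the fixed-width 16-byte record '10SALES   LONDON' (the column names and widths are made up):

```
(
deptno POSITION(1:2),    -- bytes 1-2   -> '10'
dname  POSITION(3:10),   -- bytes 3-10  -> 'SALES   '
loc    POSITION(*:16)    -- bytes 11-16 -> 'LONDON'
)
```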
BEGINDATA -- used together with INFILE *; the data to import follows here in the control file:
10,SQL,what
20,lg,show
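A complete control file for the sample data above, combining the clauses described in this section (the table t and its columns are illustrative):

```
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE t
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(id, name, memo)
BEGINDATA
10,SQL,what
20,lg,show
```

Saved as t.ctl, it would be run with 'sqlldr userid=scott/tiger control=t.ctl'.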