I. Features of SQL*Loader
Oracle has many tools for data migration, backup, and recovery. However, each tool has its own characteristics.
For example, exp and imp can be used to export and import data in the database, and they are good tools for database backup and recovery; they are fast and easy to use. They also have drawbacks: when exporting and importing between databases of different versions, one problem or another always crops up, most likely due to compatibility issues between Oracle's own product versions.
The SQL*Loader tool does not have this problem: it can smoothly load data stored as plain text into an Oracle database, which makes it a convenient and common tool for migrating data between different databases. Its drawbacks are that it is relatively slow and that BLOBs and other LOB types are somewhat awkward to handle.
II. sqlldr help
C:/Documents and Settings/David> sqlldr
SQL*Loader: Release 10.2.0.1.0 - Production on Thu Jul 2 08:54:06 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Usage: sqlldr keyword=value [,keyword=value,...]
Valid keywords:
Userid -- Oracle username/password
Control -- control file name
Log -- log file name
Bad -- bad file name
Data -- data file name
Discard -- discard file name
Discardmax -- number of discards to allow (default: all)
Skip -- number of logical records to skip (default: 0)
Load -- number of logical records to load (default: all)
Errors -- number of errors to allow (default: 50)
Rows -- number of rows in the conventional path bind array, or between direct path data saves (default: 64 for conventional path, all for direct path)
Bindsize -- size of the conventional path bind array, in bytes (default: 256000)
Silent -- suppress messages during the run (header, feedback, errors, discards, partitions)
Direct -- use direct path (default: false)
Parfile -- parameter file: name of a file containing parameter specifications
Parallel -- perform a parallel load (default: false)
File -- file from which to allocate extents
Skip_unusable_indexes -- disallow/allow unusable indexes or index partitions (default: false)
Skip_index_maintenance -- do not maintain indexes, mark affected indexes as unusable (default: false)
Commit_discontinued -- commit rows already loaded when the load is discontinued (default: false)
Readsize -- size of the read buffer (default: 1048576)
External_table -- use external tables for the load; NOT_USED, GENERATE_ONLY, EXECUTE (default: NOT_USED)
Columnarrayrows -- number of rows in the direct path column array (default: 5000)
Streamsize -- size of the direct path stream buffer, in bytes (default: 256000)
Multithreading -- use multithreading in direct path
Resumable -- enable or disable resumable operation for the current session (default: false)
Resumable_name -- text string to help identify a resumable statement
Resumable_timeout -- wait time (in seconds) for RESUMABLE (default: 7200)
Date_cache -- size of the date conversion cache, in entries (default: 1000)
Please note: command-line parameters may be specified either by position or by keyword. An example of the former is 'sqlldr scott/tiger foo'; an example of the latter is 'sqlldr control=foo userid=scott/tiger'. Positionally specified parameters must appear earlier than, and may not appear later than, parameters specified by keyword. For example, 'sqlldr scott/tiger control=foo logfile=log' is allowed, but 'sqlldr scott/tiger control=foo log' is not, even though the position of the 'log' parameter is correct.
III. SQL*Loader example
Create a table, test_load:
SQL> create table test_load (id number(10), uname varchar2(20), phone varchar2(20));
Control file: d:/loader.ctl
load data
infile 'd:/datafile.txt'
append into table test_load
fields terminated by "," optionally enclosed by '"'
(id, uname, phone)
Data file: d:/datafile.txt
001,"David",0551-1234567
002,"Dave",0551-7654311
sqlldr command (run at the operating system prompt):
C:/> sqlldr 'sys/admin as sysdba' control=d:/loader.ctl log=d:/a.log
SQL*Loader: Release 10.2.0.1.0 - Production on Thu Jul 2 09:40:42 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 1
Commit point reached - logical record count 2
SQL> select * from test_load;
ID         UNAME                PHONE
---------- -------------------- --------------------
1          David                0551-1234567
2          Dave                 0551-7654311
IV. Control files in detail
Loading records of variable length (delimited) or fixed position. INFILE * means the data is held in the control file itself; the BEGINDATA keyword marks where that inline data begins.
For example:
load data
infile *
into table load_delimited_data
fields terminated by "," optionally enclosed by '"'
trailing nullcols
(data1,
data2
)
begindata
11111,aaaaaaaaaa
22222,"a,b,c,d,"
The following is an example of loading fixed-position (fixed-length) data:
load data
infile *
into table load_positional_data
(data1 position(1:5),
data2 position(6:15)
)
begindata
11111aaaaaaaaaa
22222bbbbbbbbbb
Skipping data rows:
Unwanted header rows can be skipped with the SKIP parameter, either as skip=n on the command line or as OPTIONS (SKIP=n) at the top of the control file. For example:
options (skip=5)
load data
infile *
into table load_positional_data
(data1 position(1:5),
data2 position(6:15)
)
begindata
11111aaaaaaaaaa
22222bbbbbbbbbb
Modifying data during the load:
Data can be transformed as it is loaded into the database. Note that this only applies to conventional path loads, not direct path loads. For example:
load data
infile *
into table modified_data
(rec_no "my_db_sequence.nextval",
region constant '31',
time_loaded "to_char(sysdate, 'HH24:MI')",
data1 position(1:5) ":data1/100",
data2 position(6:15) "upper(:data2)",
data3 position(16:22) "to_date(:data3, 'YYMMDD')"
)
begindata
11111aaaaaaaaaa991201
22222bbbbbbbbbb990112
Another example, loading from an external data file and using DECODE to fall back to the ordinary address columns when no mailing address is supplied:
load data
infile 'mail_orders.txt'
badfile 'bad_orders.txt'
append
into table mailing_list
fields terminated by ","
(addr,
city,
state,
zipcode,
mailing_addr "decode(:mailing_addr, null, :addr, :mailing_addr)",
mailing_city "decode(:mailing_city, null, :city, :mailing_city)",
mailing_state
)
Import data to multiple tables:
For example:
load data
infile *
replace
into table emp
when empno != ' '
(empno position(1:4) integer external,
ename position(6:15) char,
deptno position(17:18) char,
mgr position(20:23) integer external
)
into table proj
when projno != ' '
(projno position(25:27) integer external,
empno position(1:4) integer external
)
Loading selected records only:
For example, (01) refers to the first character of the record and (30:37) to characters 30 through 37:
load data
infile 'mydata.dat' badfile 'mydata.bad' discardfile 'mydata.dis'
append
into table my_selective_table
when (01) <> 'H' and (01) <> 'T' and (30:37) = '123'
(
region constant '31',
service_key position(01:11) integer external,
call_b_no position(12:29) char
)
Skipping fields during the load:
Fields can be picked out with POSITION(x:y). From Oracle 8i onwards you can also declare FILLER fields; a FILLER field is used to skip, or ignore, a field in the input data file. For example:
load data
truncate into table t1
fields terminated by ','
(field1,
field2 filler,
field3
)
Loading multi-line records:
One of the following two clauses can be used to combine multiple physical rows into one logical record (see the sketch after this list):
Concatenate -- use when SQL*Loader should always combine a fixed number of physical records to form one logical record.
Continueif -- use when a condition indicates that multiple records should be treated as one, e.g. a '#' character in column 1.
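As a minimal sketch of CONTINUEIF (the table load_concat_data and the data layout below are illustrative assumptions, not part of the original example), a '#' in column 1 marks a record that continues on the next physical line; without PRESERVE, the character at that position is stripped from every record before assembly:
load data
infile *
-- a '#' in column 1 means: this physical record continues on the next line
continueif this (1:1) = '#'
into table load_concat_data
fields terminated by ","
(data1,
data2
)
begindata
#11111,aaaaa
 aaaaa
 22222,bbbbbbbbbb
Here the first two physical lines assemble into the single logical record 11111,aaaaaaaaaa, while the last line loads as a record of its own.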
V. Additional knowledge
Oracle does not provide an "unloader" tool to export data to a file. However, SQL*Plus can select and format the data and spool it to a file:
set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on
spool oradata.txt
select col1 || ',' || col2 || ',' || col3
from tab1
where col2 = 'XYZ';
spool off
You can also use the utl_file PL/SQL package:
rem Remember to set the utl_file_dir = 'c:/oradata' parameter in initSID.ora
declare
fp utl_file.file_type;
begin
fp := utl_file.fopen('c:/oradata', 'tab1.txt', 'w');
utl_file.putf(fp, '%s, %s\n', 'textfield', 55);
utl_file.fclose(fp);
end;
/
You can also use third-party tools such as SQLWays or Quest's TOAD.
SQL*Loader commit behavior:
By default, data is committed when the load of the data file completes.
You can also have SQL*Loader commit every n records by specifying the ROWS= parameter, as in the example below.
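For instance (the file names here are just placeholders), a conventional path load that commits after every 1000 rows might be invoked like this:
C:/> sqlldr userid=scott/tiger control=d:/loader.ctl data=d:/datafile.txt log=d:/loader.log rows=1000
With a conventional path load, ROWS= sets the number of rows in the bind array, and a commit is issued each time the bind array is flushed; the effective number of rows may be smaller if BINDSIZE is too small to hold them all.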
Improving SQL*Loader performance:
1) A simple, easily overlooked point: make sure the table being loaded carries no indexes and/or constraints (primary key). If they are present, import performance drops noticeably even when the ROWS= parameter is used.
2) Adding DIRECT=TRUE can speed up the load considerably, although there are many cases where this parameter cannot be used.
3) Redo logging can be suppressed by specifying the UNRECOVERABLE option; it can only be used together with DIRECT (see the sketch after this list).
4) Several load jobs can be run at the same time.
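As a hedged sketch of tips 2) and 3) (the file names below and the test_load table from the earlier example are placeholders, not required names), UNRECOVERABLE goes at the top of the control file and DIRECT=TRUE is passed on the command line:
-- d:/loader_direct.ctl: suppress redo generation; valid only for direct path loads
unrecoverable
load data
infile 'd:/datafile.txt'
append into table test_load
fields terminated by ","
(id, uname, phone)
C:/> sqlldr userid=scott/tiger control=d:/loader_direct.ctl direct=true log=d:/direct.log
Remember to rebuild indexes and re-enable constraints after the load; and because UNRECOVERABLE data is not written to the redo log, it cannot be recovered by media recovery, so take a backup afterwards if the data matters.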
Differences between conventional path and direct path loads:
A conventional path load inserts the data with ordinary INSERT statements. A direct path load (DIRECT=TRUE) bypasses much of the database processing and writes the data blocks directly into the data files.