SQL*Loader in Detail


In Oracle, we typically use one of the following methods to replicate or migrate data between tables in different databases:

1. Export table A's records as semicolon-terminated INSERT statements, then execute them against table B.
2. Create a dblink between the databases, then use CREATE TABLE B AS SELECT * FROM A@dblink WHERE ..., or INSERT INTO B SELECT * FROM A@dblink WHERE ....
3. Export A with exp, then import with imp to create B; exp can take a query condition.
4. Do it in a program: SELECT FROM A ..., then INSERT INTO B ..., again committing in batches.
5. Finally, SQL*Loader (sqlldr), the subject of this article, imports data far more effectively than plain INSERTs.

Method 1 is a nightmare when there are many records: you have to commit in batches of a few hundred statements, otherwise the client tool dies, and the import process is slow. To improve INSERT performance by not generating redo, you can do the following:

ALTER TABLE B NOLOGGING;

INSERT /*+ append */ INTO B (c1, c2) VALUES (x, xx);

INSERT /*+ append */ INTO B SELECT * FROM A@dblink WHERE ...;

That is a quick overview of the data import/export methods in Oracle; there are certainly more sophisticated ones. The rest of this article focuses on how to use Oracle's SQL*Loader (sqlldr).

Run Oracle's sqlldr command at the command line to see its detailed parameter descriptions. The key parameters are:

userid -- Oracle username/password[@servicename]
control -- control file, which may itself contain the table data
log -- log file for the load; defaults to the control file name with a .log extension
bad -- bad file; defaults to the control file name with a .bad extension
data -- data file, usually specified in the control file; specifying it as a parameter instead of in the control file is better suited to automated jobs
errors -- number of error records allowed; use it to control whether a single bad record fails the load
rows -- number of records committed at a time; defaults to 64
skip -- number of lines to skip, e.g. when the first few lines of the exported data file are headers or other descriptions
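As an illustrative sketch, a full command line combining these parameters might look like the following (the user, password, service, and file names here are hypothetical):

```shell
# Hypothetical invocation: load scott's users table with explicit log and
# bad files, tolerating up to 100 bad records, committing every 1000 rows,
# and skipping a one-line header in the data file.
sqlldr userid=scott/tiger@orcl control=users.ctl data=users_data.csv \
       log=users.log bad=users.bad errors=100 rows=1000 skip=1
```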

For more sqlldr parameter descriptions, refer to: SQL Loader Usage.

This article demonstrates the use of sqlldr with an example, in two ways:

1. Use only one control file, with the data included in that control file
2. Use a control file (as a template) and a separate data file

In general, to keep template and data separate (and to suit different divisions of labor), the second way is used, so we look at that usage first. The data file can be a CSV file or use any other delimiter. It can be exported with PL/SQL Developer or Toad, or generated with SQL*Plus spool formatted output or the utl_file package. Toad can also directly generate a control file that contains the data.
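As a sketch of the spool approach (assuming the users table used in this article's example; the SET options shown are one common combination, not the only one), a SQL*Plus script that writes a comma-separated data file could look like:

```sql
-- Sketch: generate a CSV-style data file from SQL*Plus with spool.
SET ECHO OFF
SET FEEDBACK OFF
SET HEADING OFF
SET PAGESIZE 0
SET LINESIZE 200
SET TRIMSPOOL ON
SPOOL users_data.csv
SELECT user_id || ',' || user_name || ',' || login_times || ',' ||
       TO_CHAR(last_login, 'yyyy-mm-dd hh24:mi:ss')
  FROM users;
SPOOL OFF
```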

First, assume that you have a table of users and insert five records:

CREATE TABLE users (
  user_id     NUMBER,        -- user ID
  user_name   VARCHAR2(50),  -- user name
  login_times NUMBER,        -- number of logins
  last_login  DATE           -- last login date
);

INSERT INTO users VALUES (1, 'Unmi', 3, SYSDATE);
INSERT INTO users VALUES (2, NULL, 5, TO_DATE('2008-10-15', 'yyyy-mm-dd'));
INSERT INTO users VALUES (3, 'Spacer Warbler', 8, TO_DATE('2009-01-02', 'yyyy-mm-dd'));
INSERT INTO users VALUES (4, 'Kypfos', NULL, NULL);
INSERT INTO users VALUES (5, 'Unknown Autumn', 1, TO_DATE('2008-12-23', 'yyyy-mm-dd'));


The second way: Use a control file (as a template) and a data file

1) Create the data file. Here we use PL/SQL Developer to export the users table as Users_data.csv, with the following content:

"", "user_id", "user_name", "Login_times", "Last_login"

"1", "1", "Unmi", "3", "2009-1-5 20:34:44"

"2", "2", "" "," 5 "," 2008-10-15 "

"3", "3", "Spacer Warbler", "8", "2009-1-2"

"4", "4", "Kypfos", "", ""

"5", "5", "Unknown Autumn", "1", "2008-12-23"
2) Create the control file Users.ctl, with the following content:

OPTIONS (skip=1, rows=128) -- sqlldr command-line options can also be written here; skip=1 skips the first line of the data file
LOAD DATA
INFILE "users_data.csv" -- the external data file; multiple INFILE "another_data_file.csv" lines specify multiple data files
-- you can also use BADFILE and DISCARDFILE to specify the bad-record and discarded-record files
TRUNCATE -- operation type; TRUNCATE TABLE is used to clear existing records
INTO TABLE users -- the table to insert records into
FIELDS TERMINATED BY "," -- the fields in each record are delimited by ","
OPTIONALLY ENCLOSED BY '"' -- fields may be enclosed in double quotes, e.g. when a field contains the "," delimiter
TRAILING NULLCOLS -- table columns with no corresponding value in the data are set to NULL
(
  virtual_column FILLER, -- a virtual (FILLER) field, used to skip the row-number column generated by PL/SQL Developer
  user_id NUMBER, -- a field may be given a type, otherwise it is treated as CHARACTER; the log file shows this
  user_name,
  login_times,
  last_login DATE "yyyy-mm-dd hh24:mi:ss" -- the format of the incoming date, comparable to the TO_DATE() function
)

Note: in the position of the operation type TRUNCATE, one of the following values can be used:
1) INSERT -- the default; requires the table to be empty when the load starts
2) APPEND -- appends new records to the table
3) REPLACE -- deletes the old records (with a DELETE FROM table statement) and replaces them with the newly loaded ones
4) TRUNCATE -- deletes the old records (with a TRUNCATE TABLE statement) and replaces them with the newly loaded ones

3) Execute the command:

sqlldr dbuser/dbpass@dbservice control=users.ctl

After this, the users table in the database pointed to by dbservice contains the same records as the data file.

After running sqlldr, pay attention to the files it generates, such as the users.log log file and the users.bad bad-data file. Look at the log file in particular; it helps you understand SQL*Loader better, containing the parsed control file, the type assigned to each field, load statistics, the causes of any errors, and more.

The first way: use only one control file, with the data contained in that control file.

1) Move the contents of Users_data.csv into Users.ctl, joined on with BEGINDATA, and change INFILE "users_data.csv" to INFILE *. To illustrate more, the data is also massaged a little. The full Users.ctl now reads:

OPTIONS (skip=1, rows=128) -- sqlldr command-line options can also be written here; skip=1 skips the first line of the data
LOAD DATA
INFILE * -- the data is together with the control file, so * is used
APPEND -- APPEND is used this time: records are appended to the users table
INTO TABLE users
WHEN login_times <> '8' -- a WHEN clause can also restrict the load to qualifying records
FIELDS TERMINATED BY ","
TRAILING NULLCOLS
(
  virtual_column FILLER, -- skips the row-number column generated by PL/SQL Developer
  user_id "user_seq.nextval", -- this column takes the next value of a sequence directly, instead of the value provided in the data
  user_name "'Hi ' || upper(:user_name)", -- data can also be processed with SQL functions or expressions
  login_times TERMINATED BY "," NULLIF (login_times = 'NULL'), -- a delimiter can also be specified for a column individually
  last_login DATE "yyyy-mm-dd hh24:mi:ss" NULLIF (last_login = "NULL") -- the column is NULL when the field reads "NULL"
)
BEGINDATA -- the data starts here
,user_id,user_name,login_times,last_login
1,1,Unmi,3,2009-1-5 20:34
2,2,Fantasia,5,2008-10-15
3,3,Spacer Warbler,8,2009-1-2
4,4,Kypfos,NULL,NULL
5,5,Unknown Autumn,1,2008-12-23

2) Execute the same command:

sqlldr dbuser/dbpass@dbservice control=users.ctl

For example, this information is displayed in the console:

C:\> sqlldr dbuser/dbpass@dbservice control=users.ctl

SQL*Loader: Release 9.2.0.1.0 - Production on Wed Jan 7 22:26:25 2009

Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.

Commit point reached - logical record count 4
Commit point reached - logical record count 5

The control file above is deliberately more complex for demonstration purposes; use the comments to understand the meaning of each parameter, and you will discover further usages from there.

Finally, some notes on SQL*Loader performance and parallel operation:

1) ROWS defaults to 64; set the ROWS parameter to control how many records are committed at a time. (Have you ever experienced running a large pile of INSERT statements in one go in PL/SQL Developer?)

2) An ordinary (conventional-path) load imports data using INSERT statements. A direct-path load (direct=true) skips much of the database's processing and writes the data straight into the data files, which improves load performance considerably. Of course, in many cases this parameter cannot be used (for example, duplicate primary keys leave the index in an UNUSABLE state!).

3) By specifying the UNRECOVERABLE option, you can switch off database logging for the load (reminiscent of ALTER TABLE table1 NOLOGGING). This option can only be used together with direct.

4) For importing oversized data files, it is necessary to use parallel operation, i.e. to run multiple import tasks at the same time:

sqlldr userid=/ control=result1.ctl direct=true parallel=true
sqlldr userid=/ control=result2.ctl direct=true parallel=true
sqlldr userid=/ control=result3.ctl direct=true parallel=true

When loading large amounts of data (around 10 GB), it is best to suppress log generation:

SQL> ALTER TABLE resultxt NOLOGGING;

This avoids producing redo log and improves efficiency. Then add the line UNRECOVERABLE to the LOAD DATA control file; this option must be used together with direct.
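A minimal sketch of such a control file (the data file, table, and column names are hypothetical) might read:

```
UNRECOVERABLE
LOAD DATA
INFILE 'resultxt.dat'
APPEND
INTO TABLE resultxt
FIELDS TERMINATED BY ','
(c1, c2)
```

It would then be loaded with direct=true on the command line, since UNRECOVERABLE is only valid for direct-path loads.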

With parallel operation, Oracle claims to be able to load 100 GB of data per hour! In practice, 1-10 GB is probably already good. It is best to start with a file of the same structure containing only a small amount of data, and load the full volume only after that succeeds, to avoid wasting time.

(Note: the data file should generally be in ASCII format; remember to convert the encoding, otherwise the imported data comes out empty. When uploading the CSV file via FTP, choose the ASCII transfer type.)

References:
1. Oracle SQL*Loader -- English; the official introduction to SQL*Loader, including demos of several kinds
2. SQL Loader Usage -- lists Chinese descriptions of SQL*Loader's optional parameters
3. Use SQL*Loader to import large amounts of data and avoid frequent SQL writes -- a simple example of how to use SQL*Loader
4. Detailed syntax for Oracle SQL*Loader
5. Oracle SQL*Loader -- it's all about it; also how to generate a data file with SQL*Plus spool or the utl_file package
6. SQL*Loader Control File Reference -- English; usage reference for the control file
7. Learn the use of Oracle SQL*Loader
8. Load LOB data with SQL*Loader (sqlldr) -- importing external files into LOB columns in the database
9. Tests of several parameters of sqlldr direct load
10. Maximizing SQL*Loader Performance
