Oracle SQL*Loader Usage


1. What is *.csv and how to get it?

A *.csv file holds tabular data: each line corresponds to one row of a database table, and within a line the values are separated by commas (",").

A *.csv file can be obtained by saving an Excel worksheet as *.csv.

2. How do I import data from *.csv format into an Oracle database?

For example:

The following data is stored in the Test.csv file.

A1,b1

A2,b2

The storage path is:

D:\test.csv

We want to load it into an Oracle database.

A. First, create a new table whose column count and column types match the data to be imported:

CREATE TABLE test (a CHAR(10), b CHAR(10));

B. Create a new test.ctl file.

Edit it with Notepad and write:

load data
infile 'D:\test.csv'
into table test
fields terminated by ','
(a, b)

C. Open a cmd command-line window and enter:

sqlldr userid=user/password@dbname control=d:\test.ctl log=d:\test.log

Here user/password@dbname stands for the account, password, and database (service) name.

The introduction above is fairly basic. The following article explains things in more detail, and I reproduce it here:

http://www.blogjava.net/Unmi/archive/2009/01/05/249956.html

About Oracle data import and export and the use of SQL*Loader (sqlldr)

When copying or migrating data between tables in different Oracle databases, we typically use one of the following methods:

1. Export table A's records as semicolon-separated INSERT statements and then execute them to populate table B
2. Create a dblink between the two databases and then use CREATE TABLE B AS SELECT * FROM A@dblink WHERE ..., or INSERT INTO B SELECT * FROM A@dblink WHERE ... (a dblink sketch follows this list)
3. exp table A, then imp it to form table B; exp can take a query condition
4. Implement it in a program: SELECT FROM A ..., then INSERT INTO B ..., again committing in batches
5. SQL*Loader (sqlldr), the topic of this article, whose effect is far more noticeable than plain INSERTs
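
As a rough illustration of method 2, a database link could be created and used along these lines (the link name remote_db, the credentials, and the TNS alias 'remotedb' are placeholders):

-- create a database link to the remote database
CREATE DATABASE LINK remote_db
  CONNECT TO remote_user IDENTIFIED BY remote_pass
  USING 'remotedb';

-- copy rows across the link into a new table
CREATE TABLE B AS SELECT * FROM A@remote_db WHERE ...;

-- or append into an existing table
INSERT INTO B SELECT * FROM A@remote_db WHERE ...;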

The first method is a nightmare when there are many records: you have to commit in batches of a few hundred statements, otherwise the client dies, and the import is slow. If you want to avoid generating REDO and improve the performance of INSERT INTO, do the following:

ALTER TABLE B nologging;

INSERT /*+ append */ INTO B (c1, c2) VALUES (x, xx);
INSERT /*+ append */ INTO B SELECT * FROM A@dblink WHERE ...;
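
Once such a nologging load has finished, it is common to switch the table back to logging and take a fresh backup, since changes made under nologging cannot be recovered from the redo. A minimal follow-up for the table B above:

ALTER TABLE B logging;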

That sums up the common ways of importing and exporting data in Oracle; there are surely more sophisticated ones. The following focuses on the use of Oracle's SQL*Loader (sqlldr).

Running Oracle's sqlldr command at the command line with no arguments prints a detailed description of its parameters; the following ones deserve particular attention:

userid -- Oracle username/password[@servicename]
control -- the control file, which may itself contain table data
log -- the log file for the import; defaults to the control file name (without extension) plus .log
bad -- the bad-records file; defaults to the control file name (without extension) plus .bad
data -- the data file, usually specified in the control file; passing it as a parameter instead of naming it in the control file is more suitable for automated jobs
errors -- the number of error records allowed; set it to 0 to require that no record fails
rows -- how many records are committed at a time; defaults to 64
skip -- the number of rows to skip, for example when the first rows of the exported data file are a header or other description

For descriptions of more sqlldr parameters, refer to the SQL*Loader usage documentation.
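
For instance, these parameters might be combined on one command line like this (the credentials and file names are placeholders, matching the users example further below):

sqlldr userid=dbuser/dbpass@dbservice control=users.ctl data=Users_data.csv log=users.log bad=users.bad errors=0 rows=500 skip=1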

Let's demonstrate sqlldr with examples. There are two ways to use it:

1. Use only a control file, with the data contained in that control file
2. Use a control file (as a template) and a separate data file

In general, to keep the template and the data separate and to fit the usual division of work, the second way is used, so we look at that usage first. The data file can be a CSV file or use any other delimiter. It can be exported with PL/SQL Developer or Toad, generated with SQL*Plus spool formatted output, or written with the utl_file package. Toad can also generate a control file that directly contains the data.
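
For example, a delimited data file can be produced from SQL*Plus with spool, roughly like this (a minimal sketch; the column list assumes the users table used below):

set heading off feedback off pagesize 0 linesize 200 trimspool on
spool Users_data.csv
select user_id || ',' || user_name || ',' || login_times || ',' ||
       to_char(last_login, 'yyyy-mm-dd hh24:mi:ss')
  from users;
spool off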

First, assume there is a users table and insert five records into it:

insert into users values (1, 'Unmi', 3, sysdate);
insert into users values (2, null, 5, to_date('2008-10-15', 'yyyy-mm-dd'));
insert into users values (3, 'Spacer Warbler', 8, to_date('2009-01-02', 'yyyy-mm-dd'));
insert into users values (4, 'Kypfos', null, null);
insert into users values (5, 'Unknown Autumn', 1, to_date('2008-12-23', 'yyyy-mm-dd'));
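
The users table itself is not defined above; judging from the inserts, it could be created roughly as follows (the column names come from the CSV export below, the types are assumptions). The user_seq sequence is needed later by the embedded-data control file example:

CREATE TABLE users (
  user_id     NUMBER,        -- numeric id (assumed type)
  user_name   VARCHAR2(50),  -- user name (assumed type and length)
  login_times NUMBER,        -- login counter (assumed type)
  last_login  DATE           -- last login time
);

CREATE SEQUENCE user_seq;    -- referenced as user_seq.nextval further below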

The second way: Use a control file (as a template) and a data file

1) Set up the data file. Here we use PL/SQL Developer to export the records of the users table as a Users_data.csv file, with the following contents:

"","user_id","user_name","login_times","last_login"
"1","1","Unmi","3","2009-1-5 20:34:44"
"2","2","","5","2008-10-15"
"3","3","Spacer Warbler","8","2009-1-2"
"4","4","Kypfos","",""
"5","5","Unknown Autumn","1","2008-12-23"

2) Create a control file, Users.ctl.
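
Its content might look roughly like this (a sketch that works with the Users_data.csv above; the truncate action matches the note below, while the quote handling and date mask are assumptions):

OPTIONS (skip=1)          -- skip the header line of the CSV export
LOAD DATA
INFILE "Users_data.csv"
truncate                  -- empty the table before loading; see the note below
INTO TABLE users
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  virtual_column FILLER,  -- skip the row-number column added by PL/SQL Developer
  user_id,
  user_name,
  login_times,
  last_login DATE "YYYY-MM-DD HH24:MI:SS"
)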

Note: in the place of the load action (truncate in the file above), one of the following values can be used:

1) insert -- the default; requires the table to be empty when loading starts
2) append -- appends new records to the table
3) replace -- deletes the old records (with a DELETE FROM table statement) and replaces them with the newly loaded ones
4) truncate -- deletes the old records (with a TRUNCATE TABLE statement) and replaces them with the newly loaded ones

3) Execute the command:

sqlldr dbuser/dbpass@dbservice control=users.ctl

Afterwards, the users table in the database indicated by dbservice contains the same records as in the data file.

After running sqlldr, pay attention to the files it generates, such as the Users.log log file and the Users.bad bad-records file. Look at the log file in particular: it helps you understand SQL*Loader better, since it contains the parsed control file, the type of each field, load statistics, the causes of errors, and other information.

The first way: use only a control file, with the data contained in that control file.

1) Move the contents of Users_data.csv into Users.ctl, joined on with BEGINDATA, and change INFILE "Users_data.csv" to INFILE *. At the same time, to better illustrate the possibilities, the data handling is made a bit more elaborate. The complete Users.ctl file now reads:


OPTIONS (skip=1, rows=128) -- sqlldr command-line options can be written here; skip=1 skips the first line of the data
LOAD DATA
INFILE *   -- the data is in the control file itself, hence *
append     -- append is used here, attaching the records to the users table
INTO TABLE users
when login_times <> '8'  -- a WHEN clause can be used to import only the records that match a condition
FIELDS TERMINATED BY ","
TRAILING NULLCOLS
(
  virtual_column FILLER,       -- skips the first column of row numbers generated by PL/SQL Developer
  user_id "user_seq.nextval",  -- this column takes the next value of a sequence instead of the value supplied in the data
  user_name "'Hi '||upper(:user_name)",  -- data can also be transformed with SQL functions or expressions
  login_times terminated by "," NULLIF (login_times='NULL'),  -- a delimiter can be specified for an individual column
  last_login DATE "YYYY-MM-DD HH24:MI:SS" NULLIF (last_login="NULL")  -- the field becomes NULL when its value is "NULL"
)
BEGINDATA  -- the data starts here
,user_id,user_name,login_times,last_login
1,1,Unmi,3,2009-1-5 20:34
2,2,Fantasia,5,2008-10-15
3,3,Spacer Warbler,8,2009-1-2
4,4,Kypfos,NULL,NULL
5,5,Unknown Autumn,1,2008-12-23

2) Execute the same command:

sqlldr dbuser/dbpass@dbservice control=users.ctl

For example, this information is displayed in the console:

C:\> sqlldr dbuser/dbpass@dbservice control=users.ctl

SQL*Loader: Release 9.2.0.1.0 - Production on Wednesday January 7 22:26:25 2009

Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.

Commit point reached - logical record count 4
Commit point reached - logical record count 5

The control file above is made deliberately complex for demonstration purposes; please refer to the comments for the meaning of each parameter. You can discover more usages yourself.

Finally, a few notes on SQL*Loader performance and concurrent operation.

1) The default value of rows is 64; you can set the rows parameter to control how many records are committed at a time, according to your actual needs. (Have you experienced executing a large number of INSERT statements at once in PL/SQL Developer?)

2) An ordinary load imports data with INSERT statements. A direct-path load (direct=true) skips much of the database's processing and writes the data straight into the data files, which greatly improves load performance. Of course, in many cases this parameter cannot be used (and if primary keys are duplicated, the index ends up in an UNUSABLE state!).
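
For example, a direct-path load might be started like this (result1.ctl is a placeholder control file name, as in the concurrency example below):

sqlldr dbuser/dbpass@dbservice control=result1.ctl direct=true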

3) By specifying the unrecoverable option, you can turn off database logging for the load (think of ALTER TABLE table1 NOLOGGING). This option can only be used together with direct.

4) For very large data files, concurrent operation is needed, i.e. running several import tasks at the same time:

sqlldr userid=/ control=result1.ctl direct=true parallel=true
sqlldr userid=/ control=result2.ctl direct=true parallel=true
sqlldr userid=/ control=result2.ctl direct=true parallel=true

When loading large amounts of data (approximately 10GB), it is best to suppress the generation of logs:

SQL> ALTER TABLE resultxt NOLOGGING;

This avoids generating REDO LOG, which improves efficiency. Then add the line UNRECOVERABLE before LOAD DATA in the control file; this option must be used together with direct.
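
In the control file, that looks roughly like this (the data file name and the column list are placeholders; the table resultxt is the one used above):

UNRECOVERABLE
LOAD DATA
INFILE 'result1.dat'
APPEND
INTO TABLE resultxt
FIELDS TERMINATED BY ','
(c1, c2)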

With concurrent operation, Oracle claims to be able to process 100 GB of data per hour! In practice, 1-10 GB is probably already quite good. Start with a file of the same structure that contains only a small amount of data, and load the large volume only after that succeeds, to avoid wasting time.
