One thing to note about this place:

export LoadFileName=$data_dir"/"$file

This exports LoadFileName (the absolute path of the text file to import into Oracle) into the Linux environment, so that the value can be picked up when the sqlldr control file is prepared for the import.

2. sqlldr control file dayflow.log.ctl:

LOAD DATA
CHARACTERSET AL32UTF8
INFILE '$LoadFileName'
APPEND INTO TABLE User_1.bil_flux_high_cur
FIELDS TERMINATED BY '|'
Importing the data into Oracle

When importing data into Oracle this way, the control file has the suffix *.ctl and the command is sqlldr:

sqlldr username/password control=tbl_emp.ctl
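The variable export and control file above can be tied together in one script. Note that sqlldr does not expand shell variables inside a .ctl file, so a common pattern (an assumption here about how the original script worked) is to let the shell substitute the value while generating the control file; the paths below are hypothetical stand-ins for $data_dir/$file.

```shell
# Hypothetical local paths standing in for the article's $data_dir and $file.
data_dir=/tmp/demo_data
file=dayflow.log
mkdir -p "$data_dir"
printf '13800000000|1024\n' > "$data_dir/$file"

# Export the absolute path, as in the article.
export LoadFileName="$data_dir/$file"

# Generate the control file; the shell expands $LoadFileName here,
# so sqlldr sees a literal path rather than a variable reference.
cat > /tmp/dayflow.log.ctl <<EOF
LOAD DATA
CHARACTERSET AL32UTF8
INFILE '$LoadFileName'
APPEND INTO TABLE User_1.bil_flux_high_cur
FIELDS TERMINATED BY '|'
EOF

cat /tmp/dayflow.log.ctl
```

Actually running the load (sqlldr user/pass control=/tmp/dayflow.log.ctl) then requires an Oracle client and the target table.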
Exporting part of the data from PostgreSQL

psql Saison -A -F ',' -t -c 'select user_id, user_name from user order by 1, 2' -o user_list.txt

(-A: unaligned output; -F ',': comma as the field separator; -t: tuples only, no headers; -o: output file)

Generated file user_list.txt:

100001,Xiao
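Before feeding a file like user_list.txt to sqlldr, it is worth sanity-checking the field layout first. A minimal sketch (the sample record is the one shown above, and the two-fields-per-record rule is an assumption based on this particular export):

```shell
# Recreate the expected export locally so the check is self-contained.
printf '100001,Xiao\n' > /tmp/user_list.txt

# Every record should have exactly two comma-separated fields.
awk -F',' 'NF != 2 { bad++ } END { print (bad ? "BAD" : "OK") }' /tmp/user_list.txt
# prints OK
```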
NLS_NCHAR_CHARACTERSET    AL16UTF16
NLS_RDBMS_VERSION         10.2.0.4.0

20 rows selected.

1. Confirm that the character set selected when the database was created is UTF-8. Execute the following SQL to get the Oracle server character set:

SQL> select userenv('language') from dual;

USERENV('LANGUAGE')
----------------------------------------------------
SIMPLIFIED CHINESE_CHINA.UTF8

2. Check the setting of NLS_LANG on the client that executes sqlldr.
The full control file in this project additionally specifies TRAILING NULLCOLS and per-field expressions, e.g. (ACCS_NBR "trim(:ACCS_…
This specifies the data loading method. There are four values:

(1) INSERT: the default; the target table must be empty when the load starts.
(2) APPEND: appends the newly loaded records to whatever is already in the table.
(3) REPLACE: deletes the old records (with a DELETE FROM table statement) and replaces them with the newly loaded records.
(4) TRUNCATE: deletes the old records (with a TRUNCATE TABLE statement) and replaces them with the newly loaded records.

Here I specify APPEND.

FIELDS TERMINATED BY:
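As a sketch of where the load method sits in a control file (the table, data file, and column names below are hypothetical), a TRUNCATE load looks like this:

```
LOAD DATA
INFILE 'emp.dat'
TRUNCATE
INTO TABLE emp_stage
FIELDS TERMINATED BY ','
(empno, ename)
```

Swapping TRUNCATE for INSERT, APPEND, or REPLACE selects the other three behaviours described above.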
Notes:

1. The command is written on one line, for example:

sqlldr sh/…@connect_string control=ctl_file data=dat_file log=log_file direct=yes rows=100000

2. Watch the data file's newline characters: on Windows it is CR/LF, on classic Mac CR, on Unix LF; when needed, express the delimiter in hex, for example RECORDS DELIMITED BY 0x'0A'.

3. Make sure the file path is written correctly; otherwise a file-not-found error is reported.

4. When th…
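Point 2 can be checked directly: od -c prints CR as \r and LF as \n, which tells you whether a hex record delimiter such as 0x'0A' (or 0x'0D0A' for CR/LF) is needed. A small sketch with throwaway files:

```shell
# A Windows-style (CR/LF) and a Unix-style (LF) sample record.
printf 'row1\r\n' > /tmp/win.txt
printf 'row1\n'   > /tmp/unix.txt

# od -c renders the line endings visibly.
od -c /tmp/win.txt
od -c /tmp/unix.txt
```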
2. Check the setting of NLS_LANG on the client that executes sqlldr. On the Oracle client:

[oracle@localhost hx]$ echo $NLS_LANG
AMERICAN_AMERICA.UTF8

3. Make the character set settings checked in the preceding three steps consistent, then start importing the text through sqlldr and check the results. There are th…
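Step 3 amounts to setting one environment variable on the client so it matches the server character set found above. The value below assumes the UTF-8 database from this example and must be adapted to your own database:

```shell
# Client-side character set; AMERICAN_AMERICA.UTF8 matches a UTF8 database.
export NLS_LANG=AMERICAN_AMERICA.UTF8
echo "$NLS_LANG"
```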
-- Create table
CREATE TABLE dept2 (
  deptno NUMBER(2) NOT NULL,
  dname  VARCHAR2(14),
  loc    VARCHAR2(13)
);
ALTER TABLE dept2 ADD CONSTRAINT dept_pk PRIMARY KEY (deptno);

-- demo.ctl
LOAD DATA
INFILE *                  -- * indicates that the data is inline in the control file
INTO TABLE dept2
INSERT                    -- the default load mode
FIELDS TERMINATED BY ','  -- separator
(deptno, dname, loc)      -- the default type is CHAR(255)
BEGINDATA
10,Sales,Virginia
20,Accounting,Virginia
30,Consulting,Virginia
40,Finance,Virginia

sqlldr userid=scott/Dcjet control=/…
Objective

A recent project required importing a batch of more than 30 million POI records into an Oracle database. A simple INSERT-based import was too slow, while a SQLLDR bulk import loaded the 30-million-plus rows in 20 minutes, which is acceptable. The specific method is shared below:
1. Create a new import control file input.ctl with the following contents:

LOAD DATA
CHARACTERSET UTF8
INFILE 'H:\POI\baidu.txt'
APPEND INTO TABLE …
… REPLACE deletes the old data and inserts the new; TRUNCATE first clears the table, then inserts.

A common error, for example:

column not found before end of logical record (use TRAILING NULLCOLS)

indicates that the number of columns in your control file does not match the number of columns in the data file, and the TRAILING NULLCOLS parameter was not specified, so some columns could not be inserted. First check the control file against the data file, then consider whether you need to add the parameter.
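To make the error concrete, here is a minimal hypothetical control file (inline data, made-up table t_demo with three columns). The second record has only two fields: without TRAILING NULLCOLS it triggers exactly the "column not found before end of logical record" error; with it, column c is simply loaded as NULL.

```
LOAD DATA
INFILE *
APPEND INTO TABLE t_demo
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(a, b, c)
BEGINDATA
1,x,y
2,x
```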
Title printing, on by default:

SET HEADING ON/OFF

That is, whether the header line from step 1 (id, username, date1, money) is displayed.

The number of blank lines before a new page; if it equals 0, a form feed appears before the first character of the new page:

SET NEWPAGE n

Write the displayed content to a file:

SPOOL file_name
SPOOL OFF

Determine whether SQL*Plus displays output on the screen. The default value is ON, and OFF indicates t…
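These settings are typically combined when spooling a flat file for a later sqlldr load. A hypothetical SQL*Plus sketch (the users table and its columns are made up, and PAGESIZE 0 is added here to suppress page breaks entirely):

```
SET HEADING OFF
SET PAGESIZE 0
SET TERMOUT OFF
SPOOL user_list.txt
SELECT user_id || ',' || user_name FROM users ORDER BY 1;
SPOOL OFF
```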
sqlldr is used to quickly bulk-import data. The example steps are as follows:

1. Determine the connection for Oracle: username/password@SID. I am using system/world@LOCALORCL; the SID is the connection instance, i.e. the service naming in Net Manager.

2. Create a table with the following statement:

CREATE TABLE tt (
  id   INTEGER,
  name VARCHAR2(…),
  con  VARCHAR2(… BYTE),
  dt   DATE
)

3. Create the data file under the D drive: test.…
Command:

sqlldr userid=[username]/[password]@[database] control=[filepath]

username: user name
password: password
database: DB instance
filepath: the physical path of the .ctl file

-----------------------------------------------------------------------

File:

LOAD DATA
INFILE *
APPEND INTO TABLE [TableName]
FIELDS TERMINATED BY "," TRAILING NULLCOLS
([ID] sequence(), [col1], …)

TableName: table name
ID: primary key
col1…col6: field names

Oracle uses SQLLDR (with sequence)
SELECT * FROM V$OPTION WHERE PARAMETER = 'Oracle Database Vault';
select comp_id, comp_name, version, status from dba_registry;
chopt disable dv

If you do not disable it:

SQL> select * from prod_master;
select * from prod_master
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04094: preprocessing cannot be performed if Database Vault is installed

2. Creating a table

CREATE TABLE PROD_MASTER ( "EMPNO" NUMBER, "ENAM…
Sqlldr is a recommended method for processing large data volumes. It has many performance switches to minimize redo and undo generation and to control the data processing method (insert, append, replace, truncate).
The project needed a performance comparison with Data Pump; since Data Pump was still not ideal, we still wanted to use sqlldr, so I did a simple test.

According to Thomas Kyte, the fastest way to load is parallel exec…
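A typical high-volume invocation combining these switches might look like the following. The connect string, file names, and table are placeholders, and actually executing it requires an Oracle client (the command is only composed and printed here); note that PARALLEL=TRUE direct loads also require the APPEND method in the control file.

```shell
# Compose a direct-path, parallel sqlldr command line.
# direct=true bypasses conventional INSERTs (minimal redo/undo);
# rows= controls the interval between data saves in direct mode.
cmd="sqlldr userid=scott/tiger@orcl control=poi.ctl data=poi.txt \
log=poi.log bad=poi.bad direct=true parallel=true rows=100000"
printf '%s\n' "$cmd"
```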