Loading text data into Oracle with sqlldr is something many people do, but what if there are hundreds or thousands of data files, or the text data keeps arriving in real time? How do you bulk-import into Oracle in that scenario? Here is how I handle it.
1. The shell script loaddata.sh is as follows:
#!/bin/bash
data_dir=/home/bea/datasource/warn-log/dayflow          # text data directory
data_dir_bak=/home/bea/datasource/warn-log/dayflow_bak  # backup directory
shell_dir=/home/bea/script

while [ 1 ]
do
    cd $data_dir
    if [ `pwd` == $data_dir ]
    then
        filelist=`ls -rt | grep dayflow.log_.`          # match the data file names
    fi
    #echo $filelist
    cd $shell_dir
    if [ `pwd` == $shell_dir ]
    then
        for file in $filelist
        do
            date=`date "+%Y%m%d %H:%M:%S"`
            export LoadFileName=$data_dir"/"$file       # very important: export the variable into the environment
            echo "time is $date, starting to load $LoadFileName"
            # password and TNS alias are redacted in the original source
            sqlldr user_1/password@xxx_db control=dayflow.log.ctl bad=dayflow.log.bad
            echo "time is $date, $LoadFileName import complete"
            echo "start backing up file $file"
            mv $LoadFileName $data_dir_bak
            echo "$file backup complete"
        done
    fi
done
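Since the `while [ 1 ]` loop above never exits, one common way to run such a script is in the background with nohup so it survives logout. This is a hypothetical launch sketch, not from the original post; a stub script stands in for loaddata.sh here, and all file paths are assumptions.

```shell
#!/bin/bash
# Hypothetical launch sketch: run the loader loop detached from the terminal.
# A trivial stub stands in for the real loaddata.sh.
cat > /tmp/loaddata.sh <<'EOF'
#!/bin/bash
echo "loader started"
EOF
chmod +x /tmp/loaddata.sh

# nohup + & keeps the loop running after logout; stdout/stderr go to a log file.
nohup /tmp/loaddata.sh > /tmp/loaddata.out 2>&1 &
wait
cat /tmp/loaddata.out
```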
The most important line here is:

export LoadFileName=$data_dir"/"$file

This exports LoadFileName (the absolute path of the text file to import into Oracle) into the Linux environment, so that the sqlldr control file can read the variable and load that file.
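The reason `export` matters can be shown with a minimal sketch (not from the original post): only exported variables are visible to child processes, and sqlldr runs as a child process of the shell script. The variable names and path below are invented for illustration.

```shell
#!/bin/bash
# Demonstrates export visibility: a child bash stands in for sqlldr.
plain_var=/tmp/data.txt            # shell-local: children will not see it
export exported_var=/tmp/data.txt  # exported: children will see it

bash -c 'echo "plain:    [$plain_var]"'      # prints "plain:    []"
bash -c 'echo "exported: [$exported_var]"'   # prints "exported: [/tmp/data.txt]"
```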
2. The sqlldr control file dayflow.log.ctl:
LOAD DATA
CHARACTERSET AL32UTF8
INFILE '$LoadFileName'
APPEND INTO TABLE user_1.bil_flux_high_cur
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
    ACCS_NBR   "trim(:ACCS_NBR)",
    DATE_ID    "trim(:DATE_ID)",
    TOTAL_FLUX "trim(:TOTAL_FLUX)",
    SYS_DATE   "sysdate"
)
Note the INFILE '$LoadFileName' line: this is the variable the shell just exported. Because the file name is passed in through a variable, the bulk import can run as a loop, loading one file per iteration.
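Given the '|' delimiter and the three data columns named in the control file (ACCS_NBR, DATE_ID, TOTAL_FLUX; SYS_DATE is filled from sysdate at load time), a matching data file might look like the sketch below. The file name and values are invented for illustration and are not from the original post.

```shell
#!/bin/bash
# Hypothetical sample data file in the format the control file expects:
# ACCS_NBR|DATE_ID|TOTAL_FLUX, one record per line.
cat > /tmp/dayflow.log_sample <<'EOF'
13800000001|20170325|10240
13800000002|20170325|2048
EOF

# The loop in loaddata.sh would export this path before calling sqlldr:
export LoadFileName=/tmp/dayflow.log_sample
echo "$LoadFileName"
```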
This article is from the "Change the World" blog, please be sure to keep this source http://wongsong.blog.51cto.com/6040075/1906436