The Oracle sqlldr data import tool can be used to load large-volume txt files, and the import is very fast. The control file and the shell script are shown directly below.

BlackListDay.ctl (SQL code):

LOAD DATA
INFILE '${DATAFILE}'                                   -- data file, i.e. the txt file
BADFILE 'ct_blacklistday_info.bad'
APPEND INTO TABLE fujz_blacklist_temp                  -- target table
FIELDS TERMINATED BY '|'                               -- fields are separated by |
TRAILING NULLCOLS
(
  ID_TYPE,
  ID_ICCID,
  OP_TIME "to_date(:OP_TIME, 'yyyy-MM-DD HH24:MI:ss')"
)

BlackListDay.sh (Shell code):

#!/usr/bin/ksh

im_data()
{
  DATAFILE="$HOME/work/fujza/BLACK13_201211.txt"       # data file
  DATAFILE35="$HOME/work/fujza/BLACK35_201211.txt"
  export DATAFILE35
  sqlldr ${DBNAME}/${DBPWD}@${SID} log=${HOME}/work/fujza/BlackListDay2.log control=${HOME}/work/fujza/BlackListDay35.ctl streamsize=25600000
  echo "sqlldr2 end"
  export DATAFILE
  sqlldr ${DBNAME}/${DBPWD}@${SID} log=${HOME}/work/fujza/BlackListDay.log control=${HOME}/work/fujza/BlackListDay.ctl streamsize=25600000
  echo "sqlldr1 end"
  exit
}

im_data
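The script assumes that DBNAME, DBPWD, and SID (database user, password, and connect identifier) are already set in the environment before it runs. A minimal usage sketch follows; the connection values and the sample data line are placeholders for illustration, not taken from the original article.

#!/usr/bin/ksh
# Hypothetical wrapper: export the connection settings that BlackListDay.sh
# expects, then run the import. All values below are placeholders.
export DBNAME=scott                     # database user (placeholder)
export DBPWD=tiger                      # database password (placeholder)
export SID=orcl                         # connect identifier / SID (placeholder)

# Each line of the data file is pipe-delimited to match the control file,
# e.g. (hypothetical): 1|89860012345678901234|2012-11-01 10:20:30

chmod +x $HOME/work/fujza/BlackListDay.sh
$HOME/work/fujza/BlackListDay.sh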