The database has the following table structure (the VARCHAR2 lengths were garbled in the original; 100 is assumed here):

User_info (user_id NUMBER PRIMARY KEY, user_name VARCHAR2(100) NOT NULL, user_age VARCHAR2(100), create_date DATE, create_order NUMBER)
There are 1 million records, and every 1,000 records must be exported as one comma-separated CSV file, giving 1,000 files in total.
The file names follow the pattern output1.csv, output2.csv ... output1000.csv.
/*==================== Step 1: Create the table ====================*/
create table User_info (
  user_id      number primary key,
  user_name    varchar2(100) not null,  -- length assumed; garbled in the original
  user_age     varchar2(100),           -- length assumed; garbled in the original
  create_date  date,
  create_order number
);
/*==================== Step 2: Initialize the data (paste into a command line and execute) ====================*/
declare
  v_age number;
begin
  for i in 1..1000000 loop
    -- randomly generate a number in the 0..100 range
    select round(mod(dbms_random.value * 100, 100)) into v_age from dual;
    insert into User_info (user_id, user_name, user_age, create_date, create_order)
    values (seq_all_table.nextval, 'onlin' || i, v_age, sysdate, i);
    -- commit every 1000 rows
    if mod(i, 1000) = 0 then
      commit;
    end if;
  end loop;
  commit;
end;
/
/*==================== Step 3: The procedure itself. It queries all 1 million records into a cursor at once and writes one file per 1,000 rows.
In testing, this cursor-based approach took roughly 40s on the data initialized above. ====================*/
create or replace procedure export_to_csv (
  -- 1. input parameters
  p_dir varchar2
) is
  -- 2. internal variables
  v_errorcode varchar2(100);
  v_errormsg  varchar2(1000);  -- length assumed; garbled in the original
  -- explicit cursor that reads all the data in one pass.
  -- Without the ORDER BY, the rows would not be written in the order you expect.
  cursor mycur is select * from User_info order by user_id;
  myrecord      User_info%rowtype;  -- one row
  csv_output    utl_file.file_type;
  out_file_name varchar2(100);      -- length assumed
  begin_time    number;
  end_time      number;
  count_num     number;
begin
  -- 3. main body
  begin_time := dbms_utility.get_time;
  open mycur;
  for i in 1..1000 loop
    out_file_name := 'output' || i || '.csv';
    csv_output := utl_file.fopen(p_dir, out_file_name, 'W');
    count_num := 0;
    while count_num < 1000 loop  -- write 1000 lines per file
      fetch mycur into myrecord;
      utl_file.put_line(csv_output,
        myrecord.user_id || ',' ||
        myrecord.user_name || ',' ||
        myrecord.user_age || ',' ||
        to_char(myrecord.create_date, 'yyyy-mm-dd hh24:mi:ss') || ',' ||
        myrecord.create_order);
      count_num := count_num + 1;
    end loop;
    utl_file.fclose(csv_output);
  end loop;
  close mycur;
  end_time := dbms_utility.get_time;
  dbms_output.put_line('Total time = ' || (end_time - begin_time) * 10 || ' ms.');
-- 4. exception handling
exception
  when others then
    v_errorcode := sqlcode;
    v_errormsg  := sqlerrm;
    rollback;
    -- print the exception information
    dbms_output.put_line(v_errorcode || v_errormsg);
end export_to_csv;
/
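If fetching the 1,000,000 rows one at a time proves slow, a common variation is to fetch in batches with BULK COLLECT ... LIMIT, which reduces the number of context switches between the PL/SQL and SQL engines. A minimal sketch, assuming the same User_info table and the MYDIR directory object created in the next step:

```sql
-- Sketch only: assumes the User_info table and MYDIR directory from this article.
declare
  type t_rows is table of User_info%rowtype;
  v_rows    t_rows;
  v_file    utl_file.file_type;
  v_file_no number := 0;
  cursor mycur is select * from User_info order by user_id;
begin
  open mycur;
  loop
    fetch mycur bulk collect into v_rows limit 1000;  -- one batch = one file
    exit when v_rows.count = 0;
    v_file_no := v_file_no + 1;
    v_file := utl_file.fopen('MYDIR', 'output' || v_file_no || '.csv', 'W');
    for i in 1..v_rows.count loop
      utl_file.put_line(v_file,
        v_rows(i).user_id || ',' || v_rows(i).user_name || ',' ||
        v_rows(i).user_age || ',' ||
        to_char(v_rows(i).create_date, 'yyyy-mm-dd hh24:mi:ss') || ',' ||
        v_rows(i).create_order);
    end loop;
    utl_file.fclose(v_file);
  end loop;
  close mycur;
end;
/
```

This version also handles a row count that is not an exact multiple of 1,000, since the last batch simply comes back shorter.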
/*==================== Step 4: Create the output directory object and grant it to the relevant user ====================*/
create or replace directory MYDIR as 'd:\tmp\';
grant read, write on directory MYDIR to wenglin;

/*==================== Step 5: Execute on the command line and check the output files.
To see the printed results on the command line, first run: set serveroutput on ====================*/
exec export_to_csv('MYDIR');
Additional note: this example is based on: http://blog.csdn.net/chjttony/article/details/7466159
Bulk export of Oracle data to multiple CSV files using stored procedures