PL/SQL development sometimes involves processing large tables, for example converting the rows of a table with over a million records into one or more other tables.
A conventional approach works, but the cost in time, disk I/O, and redo log generation is high. Oracle provides an advanced feature that can push the performance of this kind of data processing much further: the pipelined function.
In real projects, pipelined functions are combined with table functions, streaming (that is, table functions fed by a CURSOR), collections, and parallelism to reach peak big-data processing performance.
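Before walking through the methods, here is a minimal sketch of what a pipelined function looks like. The type and function names below are hypothetical, chosen only for illustration; the point is that PIPE ROW streams each row to the caller as it is produced, instead of building the whole result set in memory first.

```sql
-- Hypothetical row and collection types for the demo
CREATE OR REPLACE TYPE t_demo_row AS OBJECT (id NUMBER, label VARCHAR2(10));
/
CREATE OR REPLACE TYPE t_demo_tab AS TABLE OF t_demo_row;
/
-- A pipelined function emits rows one at a time via PIPE ROW
CREATE OR REPLACE FUNCTION f_demo_rows(p_count IN NUMBER)
  RETURN t_demo_tab PIPELINED
IS
BEGIN
  FOR i IN 1 .. p_count LOOP
    PIPE ROW (t_demo_row(i, 'row' || i));  -- row is available to the caller immediately
  END LOOP;
  RETURN;  -- a pipelined function returns no value, only ends the stream
END;
/
-- Consumed with the TABLE operator, like an ordinary table:
-- SELECT * FROM TABLE(f_demo_rows(5));
```

Because the caller can start consuming rows before the function finishes, memory stays flat and the result can feed an INSERT ... SELECT directly.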
In the following example, each record of the t_ss_normal table is inserted into the t_target table, with some transformation performed during the insert.
I implemented this data processing operation in four different ways.
The first method is also the most common one. The code is as follows:
CREATE TABLE t_ss_normal
(
  owner           VARCHAR2(30),
  object_name     VARCHAR2(128),
  subobject_name  VARCHAR2(30),
  object_id       NUMBER,
  data_object_id  NUMBER,
  object_type     VARCHAR2(19),
  created         DATE,
  last_ddl_time   DATE,
  timestamp       VARCHAR2(19),
  status          VARCHAR2(7),
  temporary       VARCHAR2(1),
  generated       VARCHAR2(1),
  secondary       VARCHAR2(1)
);
/
CREATE TABLE t_target
(
  owner       VARCHAR2(30),
  object_name VARCHAR2(128),
  comm        VARCHAR2(10)
);
/
These are the structures of the source and target tables. The source table currently holds a million rows, populated from the dba_objects view.
CREATE OR REPLACE PACKAGE pkg_test IS
  PROCEDURE load_target_normal;
END pkg_test;
/
CREATE OR REPLACE PACKAGE BODY pkg_test IS
  PROCEDURE load_target_normal IS
  BEGIN
    INSERT INTO t_target (owner, object_name, comm)
    SELECT owner, object_name, 'xxx' FROM t_ss_normal;
    COMMIT;
  END;
BEGIN
  NULL;
END pkg_test;
/
A single INSERT INTO ... SELECT statement handles the processing; it is simple and easy to read.
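With the package compiled, the first method is run from an anonymous block like the one below (a sketch; in SQL*Plus you could precede it with SET TIMING ON to measure the elapsed time for comparison with the later methods):

```sql
-- Run the conventional load
BEGIN
  pkg_test.load_target_normal;
END;
/
-- Sanity check: the target row count should match the source
SELECT COUNT(*) FROM t_target;
```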