Discussion of PL/SQL Batch Processing: The Contribution of BULK COLLECT and FORALL to Optimization

Source: Internet
Author: User

We know that running SQL statements from a PL/SQL program carries an overhead, because each SQL statement must be handed to the SQL engine for processing.
The transfer of control between the PL/SQL engine and the SQL engine is called a context switch, and each switch incurs additional overhead.


FORALL and BULK COLLECT, however, let the PL/SQL engine compress many context switches into a single one, which dramatically reduces the time a PL/SQL program needs to execute SQL statements that process many rows.

Let's take a closer look at the two.
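To see why this matters, here is a minimal sketch; the demo table t with a single NUMBER column n is an assumption for illustration, not part of the original article. The row-by-row loop performs one context switch per row, while the FORALL version sends the whole collection to the SQL engine in a single switch.

```sql
-- Assumes a demo table: CREATE TABLE t (n NUMBER);
DECLARE
  TYPE numtab IS TABLE OF NUMBER INDEX BY PLS_INTEGER;
  v_nums numtab;
BEGIN
  -- Fill the collection with 1000 values
  FOR i IN 1 .. 1000 LOOP
    v_nums(i) := i;
  END LOOP;

  -- Row-by-row: 1000 separate context switches between PL/SQL and SQL
  FOR i IN 1 .. v_nums.COUNT LOOP
    INSERT INTO t (n) VALUES (v_nums(i));
  END LOOP;

  -- Batch: one context switch for all 1000 rows
  FORALL i IN v_nums.FIRST .. v_nums.LAST
    INSERT INTO t (n) VALUES (v_nums(i));
END;
```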

(I) Accelerating queries with BULK COLLECT

(1) Usage of BULK COLLECT

BULK COLLECT loads an entire query result into a collection at once, instead of processing the results row by row with a cursor.
You can use BULK COLLECT in SELECT INTO, FETCH INTO, and RETURNING INTO statements.
Note that when using BULK COLLECT, all INTO variables must be collections.

Here are some simple examples:

① Using BULK COLLECT in a SELECT INTO statement

The code is as follows:

DECLARE
  TYPE sallist IS TABLE OF employees.salary%TYPE;
  sals sallist;
BEGIN
  SELECT salary BULK COLLECT INTO sals FROM employees WHERE ROWNUM <= 50;
  -- Use the data in the collection.
END;

② Using BULK COLLECT in a FETCH INTO statement

The code is as follows:

DECLARE
  CURSOR cur IS SELECT department_id, department_name FROM departments WHERE department_id > 10;
  TYPE deptrectyp IS TABLE OF cur%ROWTYPE;
  dept_recs deptrectyp;
BEGIN
  OPEN cur;
  FETCH cur BULK COLLECT INTO dept_recs;
  CLOSE cur;
  -- Use the data in the collection.
END;

③ Using BULK COLLECT in a RETURNING INTO statement

The code is as follows:

CREATE TABLE emp AS SELECT * FROM employees;

DECLARE
  TYPE numlist IS TABLE OF employees.employee_id%TYPE;
  enums numlist;
  TYPE namelist IS TABLE OF employees.last_name%TYPE;
  names namelist;
BEGIN
  DELETE FROM emp WHERE department_id = 30
  RETURNING employee_id, last_name BULK COLLECT INTO enums, names;
  DBMS_OUTPUT.PUT_LINE('Deleted ' || SQL%ROWCOUNT || ' rows:');
  FOR i IN enums.FIRST .. enums.LAST LOOP
    DBMS_OUTPUT.PUT_LINE('Employee #' || enums(i) || ': ' || names(i));
  END LOOP;
END;

The output is:

Deleted 6 rows:
Employee #114: Raphaely
Employee #115: Khoo
Employee #116: Baida
Employee #117: Tobias
Employee #118: Himuro
Employee #119: Colmenares


(2) Using BULK COLLECT to optimize large DELETE and UPDATE operations

Here we use DELETE as the example; UPDATE works the same way.

Suppose we need to delete 10 million rows from a large table of 100 million rows,
and the requirement is to run as fast as possible while keeping the impact on other database applications to a minimum.

If the business cannot be stopped, you can refer to the following approach:
fetch the target ROWIDs with a cursor sorted by ROWID, then delete in batches.
This method is indeed the best option when the business cannot be stopped.
Generally, committing every 10,000 rows or fewer will not put too much pressure on the rollback segment.
When I do DML like this, I usually commit every 1,000 or 2,000 rows.
Run it outside the business peak and the application will not be noticeably affected.
The code is as follows:

DECLARE
  -- Cursor sorted by ROWID
  -- The deletion condition oo = xx depends on the actual situation
  CURSOR mycursor IS SELECT ROWID FROM t WHERE oo = xx ORDER BY ROWID;
  TYPE rowid_table_type IS TABLE OF ROWID INDEX BY PLS_INTEGER;
  v_rowid rowid_table_type;
BEGIN
  OPEN mycursor;
  LOOP
    FETCH mycursor BULK COLLECT INTO v_rowid LIMIT 5000; -- process 5,000 rows per batch
    EXIT WHEN v_rowid.COUNT = 0;
    FORALL i IN v_rowid.FIRST .. v_rowid.LAST
      DELETE FROM t WHERE ROWID = v_rowid(i);
    COMMIT;
  END LOOP;
  CLOSE mycursor;
END;

(3) Limiting the number of rows BULK COLLECT fetches

FETCH cursor BULK COLLECT INTO ... [LIMIT rows];
Here rows can be a constant, a variable, or any expression that evaluates to an integer.

Suppose you need to query and process a large number of rows. You could use BULK COLLECT to retrieve them all at once and fill a very large collection,
but this consumes a large amount of PGA memory for the session, and the application may suffer performance degradation from PGA paging.

At this point the LIMIT clause becomes very useful: it lets us control how much memory the program uses to process the data.


The code is as follows:

DECLARE
  CURSOR allrows_cur IS SELECT * FROM employees;
  TYPE employee_aat IS TABLE OF allrows_cur%ROWTYPE INDEX BY BINARY_INTEGER;
  v_emp employee_aat;
BEGIN
  OPEN allrows_cur;
  LOOP
    FETCH allrows_cur BULK COLLECT INTO v_emp LIMIT 100;

    /* Process the data by scanning the collection */
    FOR i IN 1 .. v_emp.COUNT LOOP
      upgrade_employee_status(v_emp(i).employee_id);
    END LOOP;

    EXIT WHEN allrows_cur%NOTFOUND;
  END LOOP;

  CLOSE allrows_cur;
END;

Note that the EXIT WHEN test comes after the collection is processed: with LIMIT, the final, partially filled batch already sets %NOTFOUND, so testing before processing would silently skip the last rows.

(4) Fetching multiple columns in batch

Fetch all rows from the transportation table with mileage less than 20, that is, the gas guzzlers.
The code is as follows:

DECLARE
  -- Declare the collection type
  TYPE vehtab IS TABLE OF transportation%ROWTYPE;
  -- A collection of this type
  gas_guzzlers vehtab;
BEGIN
  SELECT * BULK COLLECT INTO gas_guzzlers
  FROM transportation
  WHERE mileage < 20;
END;

(5) Using the RETURNING clause with batch operations

With the RETURNING clause, we can easily see the results of a just-completed DML operation without any additional query work.
For an example, see point ③ under the usage of BULK COLLECT above.

(II) Accelerating DML with FORALL

FORALL tells the PL/SQL engine to first bind all the members of one or more collections to the SQL statement, and then send the statement to the SQL engine in a single operation.

(1) Syntax

Not complete...
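Since the source breaks off here, the following is only a hedged sketch of the basic FORALL form as commonly documented for Oracle PL/SQL: an index range followed by exactly one DML statement. The table t and its id column are illustrative assumptions.

```sql
DECLARE
  TYPE idtab IS TABLE OF NUMBER INDEX BY PLS_INTEGER;
  v_ids idtab;
BEGIN
  v_ids(1) := 10;
  v_ids(2) := 20;
  v_ids(3) := 30;
  -- General form: FORALL index IN lower_bound .. upper_bound
  --                 one_dml_statement;
  FORALL i IN v_ids.FIRST .. v_ids.LAST
    DELETE FROM t WHERE id = v_ids(i);
  -- SQL%ROWCOUNT reports the total rows affected by the batch
  DBMS_OUTPUT.PUT_LINE('Rows deleted: ' || SQL%ROWCOUNT);
END;
```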
