PL/SQL Batch Processing: How BULK COLLECT and FORALL Contribute to Optimization

Source: Internet
Author: User

We know that running SQL statements from a PL/SQL program carries overhead: each SQL statement must be handed to the SQL engine for processing, and the transfer of control between the PL/SQL engine and the SQL engine is called a context switch. Every switch adds overhead. BULK COLLECT and FORALL let the PL/SQL engine compress many context switches into one, which sharply reduces the time needed to execute SQL statements that process many rows from PL/SQL.

(1) Using BULK COLLECT to speed up queries

BULK COLLECT loads an entire query result into a collection in one step, instead of processing rows one at a time with a cursor. It can be used in SELECT INTO, FETCH INTO, and RETURNING INTO statements. Note that when using BULK COLLECT, every INTO variable must be a collection. For example:

① BULK COLLECT in SELECT INTO

[SQL]
DECLARE
    TYPE sallist IS TABLE OF employees.salary%TYPE;
    sals sallist;
BEGIN
    SELECT salary BULK COLLECT INTO sals
      FROM employees
     WHERE ROWNUM <= 50;
    -- use the data in the collection next
END;
/

② BULK COLLECT in FETCH INTO

[SQL]
DECLARE
    CURSOR cur IS
        SELECT department_id, department_name
          FROM departments
         WHERE department_id > 10;
    TYPE deptrectl IS TABLE OF cur%ROWTYPE;
    dept_recs deptrectl;
BEGIN
    OPEN cur;
    FETCH cur BULK COLLECT INTO dept_recs;
    -- use the data in the collection next
END;
/

③ BULK COLLECT in RETURNING INTO

[SQL]
CREATE TABLE emp AS SELECT * FROM employees;

DECLARE
    TYPE numlist IS TABLE OF employees.employee_id%TYPE;
    enums numlist;
    TYPE namelist IS TABLE OF employees.last_name%TYPE;
    names namelist;
BEGIN
    DELETE FROM emp
     WHERE department_id = 30
    RETURNING employee_id, last_name BULK COLLECT INTO enums, names;
    DBMS_OUTPUT.PUT_LINE('deleted ' || SQL%ROWCOUNT || ' rows:');
    FOR i IN enums.FIRST .. enums.LAST LOOP
        DBMS_OUTPUT.PUT_LINE('employee #' || enums(i) || ': ' || names(i));
    END LOOP;
END;
/

Output:

deleted 6 rows:
employee #114: Raphaely
employee #115: Khoo
employee #116: Baida
employee #117: Tobias
employee #118: Himuro
employee #119: Colmenares

(2) Using BULK COLLECT to optimize large DELETE and UPDATE operations

DELETE is used here; the same applies to UPDATE. Suppose a large table holds 100 million rows and you need to delete tens of millions of them as quickly as possible, without affecting other applications in the database and without stopping the business. One workable approach is to split the work by ROWID: sort by ROWID, process in batches, and delete by ROWID. When the business cannot be stopped, keeping each commit within 10,000 rows avoids putting too much pressure on the rollback segment. For DML of this kind I usually commit every 1,000 to 2,000 rows and run outside business peak hours, so the application is not affected:

[SQL]
DECLARE
    -- cursor sorted by ROWID; the deletion condition oo = xx must be
    -- adjusted to the actual situation
    CURSOR mycursor IS
        SELECT rowid FROM t WHERE oo = xx ORDER BY rowid;
    TYPE rowid_table_type IS TABLE OF ROWID INDEX BY PLS_INTEGER;
    v_rowid rowid_table_type;
BEGIN
    OPEN mycursor;
    LOOP
        FETCH mycursor BULK COLLECT INTO v_rowid LIMIT 5000;  -- commit every 5000 rows
        EXIT WHEN v_rowid.COUNT = 0;
        FORALL i IN v_rowid.FIRST .. v_rowid.LAST
            DELETE FROM t WHERE rowid = v_rowid(i);
        COMMIT;
    END LOOP;
    CLOSE mycursor;
END;
/

(3) Limiting the number of rows BULK COLLECT extracts

Syntax: FETCH cursor BULK COLLECT INTO ... [LIMIT rows];

Here rows can be a constant, a variable, or any expression that evaluates to an integer. If you need to query and process many rows, BULK COLLECT can retrieve them all at once and fill a very large collection. However, doing so consumes a large amount of the session's PGA, and application performance may degrade as the PGA grows. The LIMIT clause is very useful here: it lets us control how much memory the program uses to process data. Example:

[SQL]
DECLARE
    CURSOR allrows_cur IS SELECT * FROM employees;
    TYPE employee_aat IS TABLE OF allrows_cur%ROWTYPE INDEX BY BINARY_INTEGER;
    v_emp employee_aat;
BEGIN
    OPEN allrows_cur;
    LOOP
        FETCH allrows_cur BULK COLLECT INTO v_emp LIMIT 100;
        /* process the data in the collection */
        FOR i IN 1 .. v_emp.COUNT LOOP
            upgrade_employee_status(v_emp(i).employee_id);
        END LOOP;
        EXIT WHEN allrows_cur%NOTFOUND;
    END LOOP;
    CLOSE allrows_cur;
END;
/

(4) Batch extraction of multiple columns

Requirement: extract all rows from the transportation table whose mileage is less than 20. The code is as follows:

[SQL]
DECLARE
    -- declare the collection type
    TYPE vehtab IS TABLE OF transportation%ROWTYPE;
    -- initialize a collection of this type
    gas_quzzlers vehtab;
BEGIN
    SELECT * BULK COLLECT INTO gas_quzzlers
      FROM transportation
     WHERE mileage < 20;
    ...
END;
/

Adding the RETURNING clause to a batch operation makes it easy to see the result of the DML that just completed; for an example, see point ③ of the BULK COLLECT examples above.

(II) Using FORALL to speed up DML

FORALL tells the PL/SQL engine to bind all members of one or more collections into a SQL statement before sending that statement to the SQL engine.
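The text breaks off before showing FORALL on its own. As a minimal sketch, reusing the emp copy of employees created earlier (the department filter and the 10% raise are illustrative assumptions, not from the source), a single FORALL statement binds every element of a collection into one UPDATE:

```sql
DECLARE
    TYPE idlist IS TABLE OF emp.employee_id%TYPE;
    ids idlist;
BEGIN
    -- collect the ids to update in one pass
    SELECT employee_id BULK COLLECT INTO ids
      FROM emp
     WHERE department_id = 60;

    -- one statement, sent to the SQL engine once, with all of ids bound:
    FORALL i IN ids.FIRST .. ids.LAST
        UPDATE emp
           SET salary = salary * 1.1
         WHERE employee_id = ids(i);
END;
/
```

Compared with a FOR loop that issues one UPDATE per iteration (one context switch per row), the FORALL version crosses the PL/SQL-to-SQL boundary only once for the whole batch.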
