I was looking at execution plans today and noticed the Cost (%CPU) column, so I wrote down what I found. I am not sure everything here is correct.
Starting with Oracle 10g, the execution plan also shows a CPU cost component, which is different from 8i and earlier (CPU costing actually first appeared in 9i).
So the cost can be expressed with a simple formula:
Cost = I/O cost + CPU cost
The Oracle version on my VM is 10.2.0.1.
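If you want to follow along, here is a minimal setup sketch. The table name DUMPTEST comes from the example, but its real contents are unknown, so using dba_objects here is only an assumption:

-- Hypothetical setup: DUMPTEST as a copy of dba_objects (assumption, not the original data)
create table dumptest as select * from dba_objects;
-- Gather statistics so the optimizer has NUM_ROWS and AVG_ROW_LEN to work with
exec dbms_stats.gather_table_stats(user, 'DUMPTEST');
-- Show the execution plan without printing all the rows
set autotrace traceonly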
SQL> select * From dumptest;
50763 rows selected.
Execution Plan
----------------------------------------------------------
Plan hash value: 3522420958
------------------------------------------------------------------------------
| Id  | Operation         | Name     | Rows  | Bytes | Cost (%CPU)| Time     |
------------------------------------------------------------------------------
|   0 | SELECT STATEMENT  |          | 50763 |  4610K|   215   (3)| 00:00:02 |
|   1 |  TABLE ACCESS FULL| DUMPTEST | 50763 |  4610K|   215   (3)| 00:00:02 |
------------------------------------------------------------------------------
In this example, the cost of the full table scan is shown as 215 (3).
So 215 is the total cost of the step. How much of that is CPU cost and how much is I/O cost? You can check through V$SQL_PLAN (3522420958 is the plan hash value shown above):
SQL> select cost, cpu_cost, io_cost
  2  from v$sql_plan
  3  where plan_hash_value = 3522420958;

      COST   CPU_COST    IO_COST
---------- ---------- ----------
       215
       215   24910764        209
As you can see, for the TABLE ACCESS FULL step IO_COST is 209 and the total COST is 215, so the CPU portion is (215 - 209) / 215, roughly 3%:
SQL> select (215-209)/215 from dual;
(215-209)/215
-------------
.027906977
So the number in parentheses in Cost (%CPU) is the percentage of the total cost that comes from CPU.
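You can also let the database do this arithmetic for every step of the plan. A small sketch, again assuming the plan hash value 3522420958 from above:

-- CPU percentage per plan step, computed directly from V$SQL_PLAN
select id, cost, io_cost,
       round((cost - io_cost) / cost * 100) as pct_cpu
  from v$sql_plan
 where plan_hash_value = 3522420958
   and io_cost is not null;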
As for where the Bytes value comes from: it is derived from AVG_ROW_LEN in the table's statistics.
SQL> select avg_row_len from user_tables where table_name = upper('dumptest');

AVG_ROW_LEN
-----------
         93
The Bytes value for the TABLE ACCESS FULL step is therefore
Rows × AVG_ROW_LEN = (50763 × 93) / 1024 ≈ 4610K
SQL> select (50763*93)/1024 from dual;
(50763*93)/1024
---------------
4610.31152
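The same estimate can be computed straight from the stored statistics; a small sketch (NUM_ROWS and AVG_ROW_LEN both live in USER_TABLES):

-- Bytes estimate in KB, derived from the table statistics
select num_rows,
       avg_row_len,
       round(num_rows * avg_row_len / 1024) as bytes_kb
  from user_tables
 where table_name = upper('dumptest');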
Some may ask: if there are no statistics at all, where does the Bytes value come from? Let's delete the statistics and look at the plan again.
SQL> exec dbms_stats.delete_table_stats(user, 'dumptest');

PL/SQL procedure successfully completed.
SQL> select avg_row_len from user_tables where table_name = upper('dumptest');

AVG_ROW_LEN
-----------

SQL> set autotrace traceonly
SQL> select * From dumptest;
Execution Plan
----------------------------------------------------------
Plan hash value: 3522420958
------------------------------------------------------------------------------
| Id  | Operation         | Name     | Rows  | Bytes      | Cost (%CPU)| Time     |
------------------------------------------------------------------------------
|   0 | SELECT STATEMENT  |          | 42043 | limit 7K   |   214   (3)| 00:00:02 |
|   1 |  TABLE ACCESS FULL| DUMPTEST | 42043 | average 7K |   214   (3)| 00:00:02 |
------------------------------------------------------------------------------
Note
-----
   - dynamic sampling used for this statement
As a result, although the cost did not change much, the estimated Rows and Bytes are quite different. This is the effect of Oracle using dynamic sampling.
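To go back to plans based on stored statistics, you can simply re-gather them; a quick sketch:

-- Re-gather statistics so the optimizer stops relying on dynamic sampling
exec dbms_stats.gather_table_stats(user, 'DUMPTEST', cascade => true);

-- The plan should again show Rows = 50763 and Bytes based on AVG_ROW_LEN
select * from dumptest;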
There is no official documentation for exactly how the CPU cost is converted into the cost figure, but the total cost works out as follows.
With noworkload system statistics:
Cost = io_cost + cpu_cost / (cpuspeednw * (ioseektim + db_block_size / iotfrspeed) * 1000)
With workload system statistics:
Cost = io_cost + cpu_cost / (cpuspeed * sreadtim * 1000)
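The system statistics these formulas use can be inspected directly; a sketch (assumes you have access to the SYS-owned table):

-- Noworkload values (CPUSPEEDNW, IOSEEKTIM, IOTFRSPEED) and, if gathered,
-- workload values (CPUSPEED, SREADTIM, MREADTIM) are stored here
select pname, pval1
  from sys.aux_stats$
 where sname = 'SYSSTATS_MAIN';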
About the note "dynamic sampling used for this statement":
It means Oracle found no statistics on the object, so it sampled the table at parse time instead.
Whether statistics are considered stale is decided by how much DML has happened on the object over a period of time; tables whose statistics have gone stale can then have them re-gathered.
http://space.itpub.net/7728585/viewspace-624471
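A sketch of how to see that DML tracking and refresh only the stale objects (this assumes table monitoring is enabled, which is the default in 10g):

-- Flush the in-memory DML counters so the view below is current
exec dbms_stats.flush_database_monitoring_info;

-- DML recorded against your tables since statistics were last gathered
select table_name, inserts, updates, deletes
  from user_tab_modifications;

-- Re-gather statistics only for objects whose statistics are stale
exec dbms_stats.gather_schema_stats(user, options => 'GATHER STALE');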
Dynamic sampling can also be used when statistics do exist; it depends on the dynamic sampling level.
http://download.oracle.com/docs/... 11/stats.htm#i1_32
Dynamic sampling is also useful when you are testing whether a SQL performance problem comes from bad statistics or from the CBO's estimates.
The dynamic sampling levels range from 0 to 10; the default is 2.
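A sketch of the two usual ways to control the level, per session and per statement:

-- Raise the dynamic sampling level for the whole session
alter session set optimizer_dynamic_sampling = 4;

-- Or apply it to a single statement with a hint (table-level form)
select /*+ dynamic_sampling(dumptest 4) */ count(*) from dumptest;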