Poptest is the only training institute in China that develops test-and-development engineers, with the goal of making trainees competent in automated testing, performance testing, and test-tool development. Poptest's LoadRunner training adds a great deal of server-optimization knowledge in order to improve the students' learning experience and lay a foundation for performance-tuning skills. Today's topic is SQL Server performance-testing knowledge. (If you are interested in the course, please add QQ: 564202718.)
Performance-test analysis has its own methods and ways of thinking; this time we look at some SQL Server performance knowledge.
Database developers routinely use local variables in stored procedures and scripts, but local variables can hurt query performance. First we create a table and insert some test data:
USE AdventureWorks
GO
CREATE TABLE TempTable
(
    tempID       UNIQUEIDENTIFIER,
    tempMonth    INT,
    tempDateTime DATETIME
)
GO
INSERT INTO TempTable (tempID, tempMonth, tempDateTime)
SELECT NEWID(), (CAST(100000 * RAND() AS INT) % 12) + 1, GETDATE()
GO 100000 -- execute this batch 100,000 times
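GO 100000 tells SQL Server Management Studio to run the preceding batch 100,000 times, one single-row insert per execution, which can be slow. As a faster alternative (a sketch, not part of the original article; sys.all_objects is used only as a convenient row source), the same test data can be generated with one set-based statement:

INSERT INTO TempTable (tempID, tempMonth, tempDateTime)
SELECT TOP (100000)
       NEWID(),
       (CAST(100000 * RAND(CHECKSUM(NEWID())) AS INT) % 12) + 1, -- per-row random month; bare RAND() is evaluated only once per query
       GETDATE()
FROM sys.all_objects AS a
CROSS JOIN sys.all_objects AS b -- cross join only to produce at least 100,000 rows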
--Create an index to support our query
CREATE NONCLUSTERED INDEX [IX_tempDateTime]
ON [dbo].[TempTable] ([tempDateTime] ASC)
INCLUDE ([tempID])
WITH (ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
ON [PRIMARY]
GO
Then we run a simple query:
SET STATISTICS IO ON
GO
SELECT *
FROM TempTable
WHERE tempDateTime > '2015-10-10 03:18:01.640'
Table 'TempTable'. Scan count 1, logical reads, physical reads 0, read-ahead reads 0, LOB logical reads 0, LOB physical reads 0, LOB read-ahead reads 0.
Examining the properties of this execution plan and its index seek, you will find that the estimated number of rows is about twice the actual number of rows. That error does not change the execution plan much, however, because the optimizer can still choose the most appropriate access method.
The query optimizer estimates the row count from the statistics histogram, namely EQ_ROWS + AVG_RANGE_ROWS (here about 88.64286). You can inspect the histogram with:
DBCC SHOW_STATISTICS ('dbo.TempTable', IX_tempDateTime)
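If you want the estimated and actual row counts side by side without opening the graphical plan, one option (a sketch, not part of the original article) is SET STATISTICS PROFILE ON, which appends Rows and EstimateRows columns to each statement's plan output:

SET STATISTICS PROFILE ON
GO
SELECT *
FROM TempTable
WHERE tempDateTime > '2015-10-10 03:18:01.640'
GO
SET STATISTICS PROFILE OFF
GO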
Now modify the SELECT statement to use a local variable, and you will find that the query optimizer picks a different, more expensive plan. Why?
DECLARE @RequiredDate DATETIME
SET @RequiredDate = '2015-10-10 03:18:01.640'

SELECT *
FROM TempTable
WHERE tempDateTime > @RequiredDate
Table 'TempTable'. Scan count 1, logical reads 481, physical reads 0, read-ahead reads 0, LOB logical reads 0, LOB physical reads 0, LOB read-ahead reads 0.
This time the difference between the estimate and the actual value is far greater, and the bad estimate keeps the query optimizer from selecting the most appropriate plan. The statistics histogram cannot be used because the optimizer does not know the value of a local variable when it compiles the plan.
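A common workaround (a sketch, not part of the original article) is to add OPTION (RECOMPILE) to the statement. The statement is then recompiled at execution time, when the variable's current value is visible, so the optimizer can use the histogram just as it would for a literal:

DECLARE @RequiredDate DATETIME
SET @RequiredDate = '2015-10-10 03:18:01.640'

SELECT *
FROM TempTable
WHERE tempDateTime > @RequiredDate
OPTION (RECOMPILE) -- recompile with the variable's runtime value, enabling a histogram-based estimate

The trade-off is a fresh compilation on every execution, so this suits infrequent or long-running queries rather than hot OLTP statements.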
The case of the inequality operator
Our query uses an inequality operator, so the query optimizer falls back to a fixed 30% selectivity guess:
Estimated rows = (total rows * 30) / 100 = (100000 * 30) / 100 = 30000
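You can check that this guess is independent of the variable's value (a sketch; the dates below are arbitrary): whatever date you assign, the plan's estimated row count stays at 30000.

DECLARE @EarlyDate DATETIME, @LateDate DATETIME
SET @EarlyDate = '2000-01-01'
SET @LateDate  = '2030-01-01'

SELECT * FROM TempTable WHERE tempDateTime > @EarlyDate -- estimated rows: 30000
SELECT * FROM TempTable WHERE tempDateTime > @LateDate  -- estimated rows: 30000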
The case of the equality operator

DECLARE @RequiredDate DATETIME
SET @RequiredDate = '2012-07-10 03:18:01.640'

SELECT *
FROM TempTable
WHERE tempDateTime = @RequiredDate
If you use the equality operator with a local variable, the query optimizer switches to yet another formula: density * total number of rows in the table. Execute the following to obtain the density:
DBCC SHOW_STATISTICS ('dbo.TempTable', IX_tempDateTime)
All density = 0.0007358352, total rows in table = 100000
Estimated rows = density * total rows = 0.0007358352 * 100000 = 73.5835
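If you only need the density vector rather than the full statistics output, DBCC SHOW_STATISTICS accepts a WITH DENSITY_VECTOR option (a small sketch against the same index); the "All density" value for the leading column tempDateTime is the number the optimizer multiplies by the table's row count:

DBCC SHOW_STATISTICS ('dbo.TempTable', IX_tempDateTime) WITH DENSITY_VECTOR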