Source: What caused that to "go horribly wrong" – should you update statistics?
I have run into these problems myself, but given my level and experience I do not intend to reinvent the wheel, so I have translated Daniel's article for your reference. The translation follows:
In the past few years, I have repeatedly encountered the following situation:
A stored procedure runs well most of the time, but suddenly it doesn't: performance falls off a cliff. You don't know why, but someone says, "It must be a statistics problem." And indeed, if you take the time to examine the execution plan, you will find that the estimated number of rows and the actual number of rows differ greatly. OK, so it must be a statistics problem.
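As a quick way to see that gap, you can capture the actual execution plan when running the statement. A minimal sketch (the procedure name is a placeholder, not something from the original article):

    -- Capture the actual plan, which records both estimated and actual rows:
    SET STATISTICS XML ON;
    EXEC dbo.YourProcedure;  -- hypothetical procedure name
    SET STATISTICS XML OFF;
    -- In the returned plan, compare "Estimated Number of Rows" with
    -- "Actual Number of Rows" on each operator.

In SSMS you can get the same information by enabling "Include Actual Execution Plan" before running the statement.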
But maybe not. The first thing to know is that stored procedures, parameterized statements executed via sp_executesql, and prepared statements submitted by the client all reuse cached execution plans. These plans are built using a technique called "parameter sniffing": the plan is compiled for the parameter values supplied on the first execution. Parameter sniffing is not a problem in itself, but it can become one when the same statement or stored procedure is executed again later. If the sniffed parameters cause a statement to return only one row, the plan can be simple: use a nonclustered index and a bookmark lookup to find the data. However, that same plan may be far from reasonable when the statement is executed again with parameters that return thousands of rows.
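To make this concrete, here is a minimal sketch of parameter sniffing; the table, column, index, and parameter values are hypothetical, invented for illustration:

    -- The plan is compiled for whatever @CustomerId is sniffed on first use.
    CREATE PROCEDURE dbo.GetOrdersByCustomer @CustomerId INT
    AS
    BEGIN
        SELECT OrderId, OrderDate, Amount
        FROM dbo.Orders
        WHERE CustomerId = @CustomerId;
    END;
    GO
    -- First call: a customer with one order. The cached plan may use a
    -- nonclustered index seek plus a key (bookmark) lookup.
    EXEC dbo.GetOrdersByCustomer @CustomerId = 42;
    -- Later call: a customer with 100,000 orders reuses that same plan,
    -- so the lookup now runs 100,000 times.
    EXEC dbo.GetOrdersByCustomer @CustomerId = 7;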
Execution plans are not stored on disk; a plan is created whenever one does not already exist in the cache, and there are many reasons a plan may have disappeared from the cache (it was invalidated, evicted, or the server was restarted). If an atypical set of parameters happens to be the first set used when the plan is compiled again, then the plan built for those infrequently used parameter values is the one every subsequent execution reuses, and that plan is often inefficient. When you study the plan, you will again see a significant difference between the estimated and actual row counts. This is not a statistics problem.
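One way this "wrong first caller" scenario arises can be sketched with the same hypothetical procedure as above. DBCC FREEPROCCACHE is used here only to simulate the plan leaving the cache; do not run it casually on a production server:

    DBCC FREEPROCCACHE;  -- simulate eviction (restart, memory pressure, etc.)
    EXEC dbo.GetOrdersByCustomer @CustomerId = 7;   -- atypical "big" customer compiles the plan
    EXEC dbo.GetOrdersByCustomer @CustomerId = 42;  -- typical callers now inherit the "big" plan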
So if you think it really is a statistics problem, what do you do?
Typically, you run UPDATE STATISTICS <table name>, or UPDATE STATISTICS <table name> <index name> for the index the execution plan uses. Then you execute the stored procedure again, and this time it is fast. So you conclude it was a statistics problem.
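Sketched with the same hypothetical names as above:

    UPDATE STATISTICS dbo.Orders;                       -- all statistics on the table
    UPDATE STATISTICS dbo.Orders IX_Orders_CustomerId;  -- or just the index the plan uses
    EXEC dbo.GetOrdersByCustomer @CustomerId = 7;       -- fast again... but why?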
But you may be seeing only part of the picture: when you update statistics, SQL Server usually invalidates the related execution plans, so the plan in the cache is thrown out as well. On the next execution a new plan is created, generated using the current parameter values. It may therefore be the recompilation, not the fresher statistics, that made things fast again.
So how should you deal with this type of problem?
First, do not rush to update statistics. If the problem occurs in a stored procedure, try forcing a recompile first and see whether you get a better execution plan. Recompiling, for example with sp_recompile, invalidates the corresponding plans in the cache. This is not only quick and easy, it also tests whether statistics really are the problem. If recompiling helps, you should consider improving your code; on this, see the article "Stored procedures, recompilation and .NetRocks". If it does not help, then consider updating statistics. Before that, however, make sure the parameter values the plan was compiled with match the values used at execution time. You can find both in the Properties window when you include the actual execution plan.
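For example, sp_recompile marks the procedure so that its plan is rebuilt on the next execution (the procedure name is the hypothetical one from the sketch above):

    EXEC sp_recompile N'dbo.GetOrdersByCustomer';

And to compare compiled and runtime parameter values outside the Properties window, you can pull the cached plan XML; its ParameterList element records ParameterCompiledValue (an actual execution plan also records ParameterRuntimeValue). A sketch:

    SELECT st.text, qp.query_plan
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
    CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
    WHERE st.text LIKE '%GetOrdersByCustomer%';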
If the compiled values match the runtime values and the estimates are still wrong, it may indeed be a statistics problem. But although statistics are often blamed, they are usually not the real cause; the cached execution plan is.
Does updating statistics always invalidate execution plans? The answer is no. On this question you can refer to two articles: http://erinstellato.com/2012/01/statistics-recompilations/ and "Statistics and Recompilations, Part II". In short, starting with SQL Server 2012, updating statistics no longer invalidates execution plans by itself.
Summary
When you encounter this phenomenon, first check whether parameter sniffing is the cause rather than updating statistics immediately. Also note that, in addition to stored procedures, this can happen with parameterized statements executed via sp_executesql and with prepared statements submitted by the client.
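The same sniffing behavior for a parameterized statement can be sketched like this (same hypothetical table as above):

    EXEC sp_executesql
        N'SELECT OrderId FROM dbo.Orders WHERE CustomerId = @CustomerId',
        N'@CustomerId INT',
        @CustomerId = 42;  -- the plan compiled here is reused for later parameter values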
Another article in my (the translator's) series: Understanding the Mysteries of Performance: Slow in the Application, Fast in SSMS (1): Introduction