Ken Simmons asked

Determine cause of a Bad Execution Plan

I had a server with some issues the other day (SQL 2005 SP3). I tracked the problem down to a job that runs every 5 minutes and was taking 3 minutes to complete. Looking at the history, it had previously been running in under a second.

Looking at the plan, the query was doing a clustered index scan on a large table. I run sp_updatestats every morning at 3:00 AM. When the job ran at 7:40 AM it took under a second; when it ran at 7:45 AM it started taking 3 minutes. From what I can tell, no major changes or data loads occurred in that window. When I ran sp_updatestats again, the plan changed to an index seek and execution time dropped to under a second again.

I am trying to determine what may have caused the query to start using the "Bad Plan" at that time, especially since the stats had just been updated and no major changes to the data occurred. Any ideas?
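In case it helps for context, here is a rough sketch of the kind of check I can run to compare when the statistics were last updated with when the currently cached plan was compiled (dbo.BigTable is just a placeholder for the real table):

-- When each statistics object on the table was last updated
SELECT s.name AS stats_name,
       STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM sys.stats AS s
WHERE s.object_id = OBJECT_ID('dbo.BigTable');

-- When the currently cached plans for statements touching the table were compiled
SELECT qs.creation_time,
       qs.execution_count,
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time,
       st.text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%BigTable%';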

sql-server-2005 execution-plan statistics
2 comments

Blackhawk-17 commented:
What kind of Page Life Expectancy are you getting on the server?
Ken Simmons commented:
The server is due for a refresh and the PLE is averaging 191. Just trying to do a root cause now for the disruption. Short of adding a USE PLAN hint, I can't guarantee that the query will not resort to the bad plan again.
sp_lock answered

Does the job have IF ... ELSE statements?

Is it using a proc?

I believe that sp_updatestats causes any cached plans that depend on the updated statistics to be recompiled, so that would explain why performance recovered: a new plan was compiled using the fresh stats.
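If it is a proc, a quick way to test that theory (a sketch only; dbo.MyJobProc is a placeholder for the actual procedure) is to force a recompile by itself and see whether the fast plan comes back without touching the statistics:

-- Mark the procedure so its cached plan is thrown away and recompiled on next execution
EXEC sp_recompile 'dbo.MyJobProc';

-- Then run the job again; if it is fast, the issue is the cached plan, not stale statistics.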

Tom Staab answered

Based on the various comments, this may not be the answer in your case, but I'd like to describe a similar problem we had with a stored procedure and how we resolved it.

A certain procedure was running fine until one day it kept timing out for one of our customers. Upon investigation, we learned that their dataset was roughly twice the average size. That said, the query still should have executed in less than 30 seconds. We ran sp_recompile for that sproc and tried again. This time, it worked fine. Unfortunately, the same problem occurred again 2 days later when the same customer had another large dataset.

We determined the problem was caused by a nightly reindex job. (This was added by a Microsoft consultant and was written to reindex only if fragmentation was over a certain threshold.) We tried adding sp_recompile for the problem stored procedure after the reindex, but it didn't help. Then the consultant came up with a plan that worked: we ran sp_recompile and then immediately ran the procedure for a large dataset, which forced SQL Server to cache the plan generated for the large dataset. That change was made months ago, and we haven't had a single complaint since.

It turns out the execution plan generated for a large dataset was much better for large datasets and only very slightly slower for small ones. On the other hand, the plan generated for a small dataset was optimized for small datasets but terrible for a large one.
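For what it's worth, the sequence we now run after the reindex job looks roughly like this (the procedure and parameter names here are made up for illustration):

-- Throw away the existing cached plan
EXEC sp_recompile 'dbo.usp_GetCustomerOrders';

-- Immediately execute the procedure for a customer known to have a large dataset,
-- so the plan that gets cached is the one optimized for the large case
EXEC dbo.usp_GetCustomerOrders @CustomerId = 12345;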

I don't know if this will help for the original question, but it may help others with similar issues.

Grant Fritchey answered

From the sounds of it, it's probable that some parameter values passed to your query produce a bad plan and others produce a good plan, independent of the statistics. You could run a server-side trace to try to capture recompile events, or look at sys.dm_exec_query_stats to see when the plan was compiled. To fix it, you would want to treat it like a parameter sniffing problem (which it sort of is): either use the OPTIMIZE FOR query hint and supply a known good parameter value, or copy the parameters into local variables to get a more generic plan. You might even consider creating the query WITH RECOMPILE to get a new plan each time. It really depends on whether you can afford the recompile cost.
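Rough sketches of those three options (the dbo.Orders table, its columns, and the procedure names are made up for illustration):

-- 1. OPTIMIZE FOR: compile the plan as if a representative "good" value were passed
CREATE PROCEDURE dbo.usp_OrdersOptimizeFor
    @CustomerId INT
AS
    SELECT OrderId, Amount
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId
    OPTION (OPTIMIZE FOR (@CustomerId = 12345));
GO

-- 2. Local variables: the optimizer cannot sniff the value, so it builds a more
--    generic plan based on the average density statistics
CREATE PROCEDURE dbo.usp_OrdersLocalVariable
    @CustomerId INT
AS
BEGIN
    DECLARE @LocalCustomerId INT;
    SET @LocalCustomerId = @CustomerId;

    SELECT OrderId, Amount
    FROM dbo.Orders
    WHERE CustomerId = @LocalCustomerId;
END
GO

-- 3. WITH RECOMPILE: a fresh plan on every execution, at the cost of a compile each time
CREATE PROCEDURE dbo.usp_OrdersRecompile
    @CustomerId INT
WITH RECOMPILE
AS
    SELECT OrderId, Amount
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
GO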
