stevesl asked

Is this a parameter sniffing issue?

How do you know when you have a parameter sniffing issue, and how do you best tackle it?

A) We know there is a problem because they are monitoring execution times of sprocs. Their SLA is sub-10-seconds across the board. When an alert is fired, what they find is a great big bottleneck on the server - blocking - and they can't tell why it started. (That last part is something I have a great big problem with, but I'm not in the system, so it is what I have to go with.)
B) This happens at apparently random times, on business days and during business hours.
C) They have gone so far as to create a bandaid: a scheduled hourly, 24/7/365 UPDATE STATISTICS (tablename).
D) Without this bandaid, things degrade in short order - within the same day.
E) Even with the bandaid in place, when it still occurs, the fix is to execute sp_recompile (tablename).

Remember, they can't tell which sprocs are at issue. Until recently they would not even tell me which table. Now that I know the table, I find it has 20 dependent sprocs, 1 statistic, and 3 views.

I'm considering adding WITH RECOMPILE to each sproc, given the remark "When I sp_recompile, everything clears up." I'm also considering capturing the parameter values sent to the sprocs as step one in each of the 20 sprocs. Any thoughts?
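For what it's worth, here is roughly what I have in mind for the parameter-capture step. The logging table, the sproc, and the parameter names (@CustomerId, @FromDate) are all made up purely for illustration:

-- Hypothetical logging table (names are illustrative only).
CREATE TABLE dbo.SprocParameterLog (
    LogId       int IDENTITY(1,1) PRIMARY KEY,
    SprocName   sysname        NOT NULL,
    ParamValues nvarchar(4000) NULL,
    LoggedAt    datetime2      NOT NULL DEFAULT SYSUTCDATETIME()
);
GO

-- First statement inside each of the 20 dependent sprocs, e.g. one that happens
-- to take @CustomerId and @FromDate (placeholder parameter names).
-- CONCAT needs SQL Server 2012+ and conveniently treats NULLs as empty strings.
INSERT dbo.SprocParameterLog (SprocName, ParamValues)
VALUES (OBJECT_NAME(@@PROCID),
        CONCAT('@CustomerId=', @CustomerId,
               '; @FromDate=', CONVERT(varchar(23), @FromDate, 121)));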
performance, plan-cache, parameter-sniffing
Fatherjack answered
Parameter sniffing will potentially show itself as a very different execution time depending on the values provided from the application to the SQL Server. Sometimes users will get a fast response, other times not. You may also see a difference in execution between environments - e.g. fast in test, slow in prod. Here are some resources that explain it and how to approach mitigating its effects:
http://blogs.technet.com/b/mdegre/archive/2012/03/19/what-is-parameter-sniffing.aspx
http://www.brentozar.com/archive/2013/06/the-elephant-and-the-mouse-or-parameter-sniffing-in-sql-server/
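To make the symptom concrete, here is a minimal, entirely hypothetical sketch: assume dbo.Orders has a heavily skewed CustomerId column, where customer 1 owns millions of rows and most other customers own only a handful.

CREATE PROCEDURE dbo.GetOrdersByCustomer
    @CustomerId int
AS
BEGIN
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END
GO

-- The first execution compiles and caches a plan tailored to the sniffed value 42
-- (a small customer, so most likely a narrow seek-plus-lookup plan)...
EXEC dbo.GetOrdersByCustomer @CustomerId = 42;

-- ...and this execution reuses that same cached plan, even though a scan would
-- probably be cheaper for the big customer. That reuse is the sniffing effect.
EXEC dbo.GetOrdersByCustomer @CustomerId = 1;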
7 comments
Would you expect blocking? This guy is saying that once a specific execution of a proc hits, everything gets blocked.
It would be worth adding this information to the question - give us all the details and we can give more specific advice. Blocking is always possible. If he can tell you when it happens, you can start working on the investigation and fix.
Both answers are very good, but I'm confused by Steve's latest comment. If the problem only happens with a "specific execution of a proc", that sounds more like an issue with specific parameters at that time rather than a cached plan always using "bad" parameters.
@Tom. Good performance when the supplied values closely match the values the cached plan was compiled for, versus poor performance when the supplied values don't suit the optimized cached plan, is basically the definition of the "parameter sniffing" problem. Resolving it can be approached from two directions: 1) build a better application that constrains values to a range that fits the cached plan's optimization, or 2) build a better sproc whose plan is compiled for ranges more suitable to what it will actually receive from the application. Unfortunately, in this case option #1 is off the table per the owner.
and this is where more information from the OP would help us further ...
Kev Riley answered
Parameter sniffing is not always an issue. In fact, this behaviour is by design and in most cases parameter sniffing is advantageous, giving you performance improvements over designs that do not use it. It can become an issue, however, when the values that are 'sniffed' are atypical - this then appears as performance that is worse than 'normal' and is simply rectified by recompiling the stored procedure.

So you start by identifying a performance issue, which may then lead you to find that parameter sniffing is generating plans based on atypical parameter values. You can see which values are being used by looking at the XML execution plan - search for the ParameterList section and look at the ParameterCompiledValue values.

How do you tackle this? It depends on why it is happening. Is the first run of the day of the procedure an 'odd' execution? Is that plan being cached? Maybe force a recompile with 'good' values at the start of the day (or whenever the cache seems to be empty). Maybe the procedure has too wide-ranging a function - one proc that does many things - so break it down into smaller, more precise pieces of code.

Guy Glantser has done a good blog series on parameterization, covering parameter sniffing in Part 3: http://www.madeirasql.com/parameterization-part-3-parameter-sniffing/ but be sure to read the rest of the posts to follow the story.
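If it helps, here is a sketch of pulling those compiled ("sniffed") values straight out of the plan cache rather than opening plans one by one; the LIKE filter on the procedure name is just a placeholder:

-- Lists the parameter values each cached procedure plan was compiled with.
WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT
    OBJECT_NAME(st.objectid, st.dbid)                       AS procedure_name,
    pr.n.value('@Column', 'nvarchar(128)')                   AS parameter_name,
    pr.n.value('@ParameterCompiledValue', 'nvarchar(200)')   AS compiled_value,
    cp.usecounts
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
CROSS APPLY qp.query_plan.nodes('//ParameterList/ColumnReference') AS pr(n)
WHERE cp.objtype = 'Proc'
  AND OBJECT_NAME(st.objectid, st.dbid) LIKE N'usp%';   -- placeholder name filter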
2 comments
@Kev. You are hitting the nail on the head in your first paragraph. Your explanation so closely matches the DBA's it isn't funny.
Grant Fritchey answered
With so little to go on, it's hard to make suggestions. The most important thing is to first identify which queries are having issues. Then capture an execution plan for that query (or those queries) both when they're performing well and when they're performing badly. Understand what it is about the parameters being passed that is causing the optimizer to make poor choices. This means looking at the statistics and their state, and comparing them with the parameters being passed.

Keeping the statistics up to date is important. Hourly updates are probably overkill in most cases (although I once had a design that was so bad we were updating statistics every 5 minutes until we redesigned the queries).

You're generally going to find you need either a very specific plan that meets the majority of cases in the data set, a generic plan that works well enough, or a specific plan for each execution. Implement the solution that addresses your case: OPTIMIZE FOR (@Parm = <value>) will get you a specific plan, OPTIMIZE FOR UNKNOWN will get you a generic plan, and RECOMPILE will get you a new plan each time. Knowing which of these you need is a matter of understanding the data and the statistics, as well as testing.
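For reference, a rough sketch of where those hints would sit inside one of the sprocs. The proc, table, column, and the value 42 are made up for illustration; which option is right depends entirely on the data and statistics:

CREATE PROCEDURE dbo.GetOrdersByCustomer   -- hypothetical proc, illustration only
    @CustomerId int
AS
BEGIN
    -- Option 1: a specific plan, compiled as though @CustomerId were a representative value.
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId
    OPTION (OPTIMIZE FOR (@CustomerId = 42));

    /* Option 2: a generic plan built from average density, ignoring the sniffed value:
           OPTION (OPTIMIZE FOR UNKNOWN);
       Option 3: a fresh plan on every execution, paying the compile cost each time:
           OPTION (RECOMPILE);
    */
END
GO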
