
Diagnose SQL Performance Issue

Hi all, I have an app that runs an advanced search query, shown below:

CREATE PROC [dbo].[CustomerAdvancedSearch]
    @FirstName NVARCHAR(50) = '' ,
    @LastName NVARCHAR(50) = '' ,
    @City NVARCHAR(50) = '' ,
    @Postcode NVARCHAR(20) = '' ,
    @Country NVARCHAR(50) = '' ,
    @Name NVARCHAR(50) = '' ,
    @DateOfBirth NVARCHAR(12) = ''
AS 
    SET nocount ON

--DECLARE @FirstName NVARCHAR(50) = '' ,
--    @LastName NVARCHAR(50) = '' ,
--    @City NVARCHAR(50) = '' ,
--    @Postcode NVARCHAR(10) = '' ,
--    @Country AS NVARCHAR(2) = '' ,
--    @Name AS NVARCHAR(50) = '' ,
--    @DateOfBirth AS NVARCHAR(12)= ''

    SELECT TOP 100
            ISNULL(CU.FirstName, 'Unknown') AS FirstName ,
            ISNULL(CU.LastName, 'Unknown') AS LastName ,
            ISNULL(CU.City, 'Unknown') AS City ,
            ISNULL(CU.Postcode, 'Unknown') AS Postcode ,
            CO.Name ,
            CL.CompanyID ,
            ISNULL(CN.Country, 'Unknown') ,
            CU.CustomerID ,
            CASE WHEN CU.PrimaryCustomer = '1' THEN 'Yes'
                 ELSE 'No'
            END AS [Primary] ,
            ISNULL(CU.MiddleName, 'Unknown') AS MiddleName ,
            ISNULL(REPLACE(CONVERT(NVARCHAR, DateOfBirth, 103), '/', '-'),
                   'Unknown') AS DateOfBirth ,
            CASE WHEN CU.Active = '1' THEN 'Active'
                 WHEN CU.Active = '0' THEN 'Inactive'
                 ELSE 'Unknown'
            END AS ActiveStatus
    FROM    crm.dbo.Customer CU
            JOIN crm.dbo.CompanyLevel CL ON cl.CompanyLevelID = cu.CompanyLevelID
            JOIN crm.dbo.Company CO ON co.CompanyID = cl.CompanyID
            LEFT OUTER JOIN dbo.Country CN ON CN.CountryCode = CU.Country
    WHERE   ( @FirstName = ''
              OR FirstName LIKE @FirstName + '%'
            )
            AND ( @LastName = ''
                  OR LastName LIKE @LastName + '%'
                )
            AND ( @City = ''
                  OR City LIKE @City + '%'
                )
            AND ( @PostCode = ''
                  OR Postcode LIKE @PostCode + '%'
                )
            AND ( @Country = ''
                  OR CN.Country = @Country
                )
            AND ( @Name = ''
                  OR CO.Name LIKE @Name + '%'
                )
            AND ( @DateOfBirth = ''
                  OR @DateOfBirth = CONVERT(DATE, DateOfBirth, 103)
                )

I've tuned this search query and it is as optimised as possible. I have looked to see what the app is running, and it is literally this stored proc with various parameters. I have run the proc with the same parameters in SSMS and it is blazing quick, sub-second. However, when the app runs it, it is really slow or hangs. How do I rule out that the database is the issue? I'm not very familiar with the various diagnostic DMVs in SQL, or whether what I'm looking at is high etc., so could someone help me and I'll post stats etc.? Thanks!

I have run the following wait-stats script, which I found:

WITH Waits AS
(SELECT
    wait_type,
    wait_time_ms / 1000.0 AS WaitS,
    (wait_time_ms - signal_wait_time_ms) / 1000.0 AS ResourceS,
    signal_wait_time_ms / 1000.0 AS SignalS,
    waiting_tasks_count AS WaitCount,
    100.0 * wait_time_ms / SUM (wait_time_ms) OVER() AS Percentage,
    ROW_NUMBER() OVER(ORDER BY wait_time_ms DESC) AS RowNum
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (
    'CLR_SEMAPHORE', 'LAZYWRITER_SLEEP', 'RESOURCE_QUEUE', 'SLEEP_TASK',
    'SLEEP_SYSTEMTASK', 'SQLTRACE_BUFFER_FLUSH', 'WAITFOR', 'LOGMGR_QUEUE',
    'CHECKPOINT_QUEUE', 'REQUEST_FOR_DEADLOCK_SEARCH', 'XE_TIMER_EVENT',
    'BROKER_TO_FLUSH', 'BROKER_TASK_STOP', 'CLR_MANUAL_EVENT', 'CLR_AUTO_EVENT',
    'DISPATCHER_QUEUE_SEMAPHORE', 'FT_IFTS_SCHEDULER_IDLE_WAIT', 'XE_DISPATCHER_WAIT',
    'XE_DISPATCHER_JOIN', 'BROKER_EVENTHANDLER', 'TRACEWRITE', 'FT_IFTSHC_MUTEX',
    'SQLTRACE_INCREMENTAL_FLUSH_SLEEP', 'BROKER_RECEIVE_WAITFOR', 'ONDEMAND_TASK_QUEUE',
    'DBMIRROR_EVENTS_QUEUE', 'DBMIRRORING_CMD', 'BROKER_TRANSMITTER',
    'SQLTRACE_WAIT_ENTRIES', 'SLEEP_BPOOL_FLUSH', 'SQLTRACE_LOCK')
    )
SELECT
    W1.wait_type AS WaitType,
    CAST (W1.WaitS AS DECIMAL(14, 2)) AS Wait_S,
    CAST (W1.ResourceS AS DECIMAL(14, 2)) AS Resource_S,
    CAST (W1.SignalS AS DECIMAL(14, 2)) AS Signal_S,
    W1.WaitCount AS WaitCount,
    CAST (W1.Percentage AS DECIMAL(4, 2)) AS Percentage,
    CAST ((W1.WaitS / W1.WaitCount) AS DECIMAL (14, 4)) AS AvgWait_S,
    CAST ((W1.ResourceS / W1.WaitCount) AS DECIMAL (14, 4)) AS AvgRes_S,
    CAST ((W1.SignalS / W1.WaitCount) AS DECIMAL (14, 4)) AS AvgSig_S
FROM Waits AS W1
    INNER JOIN Waits AS W2 ON W2.RowNum <= W1.RowNum
GROUP BY W1.RowNum, W1.wait_type, W1.WaitS, W1.ResourceS, W1.SignalS, W1.WaitCount, W1.Percentage
HAVING SUM (W2.Percentage) - W1.Percentage < 95; -- percentage threshold
GO

which gives me:

[screenshot: capture.png, wait-stats output]

asked Jul 12, 2012 at 11:18 AM by jhowe

and I bet if you run the alter script (as above) the app suddenly reports quick run times?
Jul 12, 2012 at 11:26 AM Kev Riley ♦♦
This is what the app is running: EXEC CustomerAdvancedSearch with various parameters.
Jul 12, 2012 at 11:30 AM jhowe
I totally agree with Kev Riley. Such a search procedure cannot have a good plan for every scenario. It could run well for a few scenarios and be awful for others, depending upon the data.
Jul 12, 2012 at 11:34 AM Usman Butt
The DB isn't big, under 6 GB. I also reset DBCC SQLPERF ('sys.dm_os_wait_stats', CLEAR); fairly recently. @Usman, yes, but that is the issue: I'm running the EXACT same query as the app in SSMS and it runs sub-second. The app is HANGING; I can't figure it out...
Jul 12, 2012 at 11:36 AM jhowe
Have you tried the OPTION(RECOMPILE) hint?
Jul 12, 2012 at 11:37 AM Usman Butt

4 answers

I would take a punt that you've got bad plans, caused by parameter sniffing.

One 'cure', as well as OPTION(RECOMPILE), is to copy the parameters into local variables and then use those in the query instead.
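A minimal sketch of the local-variable approach, using a hypothetical cut-down version of your proc (only two of the search predicates shown):

ALTER PROC [dbo].[CustomerAdvancedSearch]
    @FirstName NVARCHAR(50) = '' ,
    @LastName NVARCHAR(50) = ''
AS
    SET NOCOUNT ON

    -- Copy the parameters into local variables. The optimizer cannot sniff
    -- local variables, so it compiles a generic plan instead of one tailored
    -- to whatever values happened to arrive on the first call.
    DECLARE @LocalFirstName NVARCHAR(50) = @FirstName
    DECLARE @LocalLastName NVARCHAR(50) = @LastName

    SELECT TOP 100
            CU.FirstName ,
            CU.LastName
    FROM    crm.dbo.Customer CU
    WHERE   ( @LocalFirstName = ''
              OR CU.FirstName LIKE @LocalFirstName + '%'
            )
            AND ( @LocalLastName = ''
                  OR CU.LastName LIKE @LocalLastName + '%'
                )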

answered Jul 12, 2012 at 02:14 PM by Kev Riley ♦♦

I don't know what you mean by parameter sniffing, and what do you mean by local variables? :) There is a search screen in the application which passes the values to the stored procedure.
Jul 12, 2012 at 02:18 PM jhowe

Ah sorry! http://www.simple-talk.com/sql/t-sql-programming/parameter-sniffing/

Option 2 talks about using local variables - if you're still stuck I can mock up your proc....
Jul 12, 2012 at 02:22 PM Kev Riley ♦♦

+1. I would prefer OPTION(RECOMPILE), as the caveat with local variables is that the engine starts with a plan that assumes roughly 30 percent of the data relates to the local variable. But, as always, testing should give the final word.

BTW, this was the same thing I thought of letting jhowe know, but only after he had digested the multiple execution plans ;)
Jul 12, 2012 at 02:42 PM Usman Butt
I've used sp_recompile on the Customer table, which seems to have resolved the issue. How do I stop SQL from using a bad query plan in the future?
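For reference, that call would have been along these lines (assuming the table lives in the dbo schema); note it is a one-off flush that marks every cached plan referencing the table for recompilation on its next use, not a permanent fix:

EXEC sp_recompile N'dbo.Customer'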
Jul 12, 2012 at 04:51 PM jhowe

BTW, what is the output of the following?

 SELECT qs.plan_handle, a.attrlist
 FROM   sys.dm_exec_query_stats qs
 CROSS  APPLY sys.dm_exec_sql_text(qs.sql_handle) est
 CROSS  APPLY (SELECT epa.attribute + '=' + convert(nvarchar(127), epa.value) + '   '
   FROM   sys.dm_exec_plan_attributes(qs.plan_handle) epa
   WHERE  epa.is_cache_key = 1
   ORDER  BY epa.attribute
   FOR    XML PATH('')) AS a(attrlist)
 WHERE  est.objectid = object_id ('dbo.CustomerAdvancedSearch')
   AND  est.dbid     = db_id()

I guess there could be two plans for the same procedure.

answered Jul 12, 2012 at 11:53 AM by Usman Butt

Sorry, I didn't realise I had it on master.

plan_handle

0x050005002ADB1A0D40018B00010000000000000000000000
0x050005002ADB1A0D4001D8C3010000000000000000000000

attrlist

acceptable_cursor_options=0 compat_level=100 date_first=7 date_format=1 dbid=5 dbid_execute=5 is_replication_specific=0 language_id=0 merge_action_type=0 objectid=219863850 optional_clr_trigger_dbid=0 optional_clr_trigger_objid=0 optional_spid=0 required_cursor_options=0 set_options=251 status=0 user_id=1 

acceptable_cursor_options=0 compat_level=100 date_first=7 date_format=1 dbid=5 dbid_execute=5 is_replication_specific=0 language_id=0 merge_action_type=0 objectid=219863850 optional_clr_trigger_dbid=0 optional_clr_trigger_objid=0 optional_spid=0 required_cursor_options=0 set_options=4347 status=0 user_id=1 
Jul 12, 2012 at 12:15 PM jhowe
As I guessed, there are two plans because of the different settings. Please make sure that your application has the same settings as your SSMS: settings like ANSI_NULL_DFLT_ON, ANSI_NULLS, ANSI_PADDING, ANSI_WARNINGS, ARITHABORT, etc.
Jul 12, 2012 at 12:33 PM Usman Butt
What options would make a difference performance-wise, and how can I see what options I have set, or what options the app server is using?
Jul 12, 2012 at 02:05 PM jhowe
This can make a big difference, as the plan may not be efficient. You can run SQL Profiler to see the settings used by the application as well as by SSMS. For SSMS, you can also see these settings through Tools -> Options -> Query Execution -> Advanced/ANSI.
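For example, DBCC USEROPTIONS lists the SET options in effect for the current session, so you can run it in SSMS and compare against what Profiler captures for the app's connection:

-- Run from SSMS; compare with the connection settings Profiler shows for the app
DBCC USEROPTIONS

-- Note: the two set_options values above differ by 4347 - 251 = 4096, which is
-- the ARITHABORT bit of the plan-cache key. SSMS sets ARITHABORT ON by default,
-- while most client libraries leave it OFF, which alone is enough to produce
-- two separate plans for the same procedure.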
Jul 12, 2012 at 02:17 PM Usman Butt

@jhowe - It seems that you are experiencing what @Kev Riley has correctly identified as parameter sniffing. SQL Server decided on a plan the first time the sproc was called, and this plan is optimal only for searches with values very similar or identical to those of the first call.

When you issued sp_recompile, the plan that was in cache was kicked out, and the next call of the procedure was compiled afresh and ran really fast. I would bet that a subsequent call with completely different parameter values would result in really slow execution.

The idea that @Usman Butt offered up, OPTION(RECOMPILE), will effectively stop SQL Server from caching a plan. This results in each call of the procedure generating a one-time plan that fits exactly the parameters supplied, which means you have a much better chance of getting a good execution plan and faster execution. However, you have to consider the downside of compiling a plan each and every time the stored procedure is called. Compiling is not a trivial part of execution and can be quite taxing on the server, especially if the stored procedure is called very often. Compiling is especially CPU intensive, and if you are calling the procedure hundreds of times a second, you may well run into performance problems from compiling. You also need to consider that compiling may end up taking more time than execution with an "average" plan.
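As a sketch (hypothetical, trimmed to a single predicate), the hint simply goes on the end of the SELECT inside the procedure:

SELECT TOP 100
        CU.FirstName ,
        CU.LastName
FROM    crm.dbo.Customer CU
WHERE   ( @LastName = ''
          OR CU.LastName LIKE @LastName + '%'
        )
OPTION  ( RECOMPILE )   -- fresh plan for the actual parameter values, never cached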

Another alternative to OPTION(RECOMPILE) is OPTIMIZE FOR, supplying values for each parameter that produce a plan that is consistent in execution (guaranteeing an average execution rather than ultra-fast and ultra-slow), or even OPTIMIZE FOR UNKNOWN. These hints were designed to tackle exactly the issue you are having and are a step up from OPTION(RECOMPILE), which is a bit of a sledgehammer solution to parameter sniffing.
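A sketch of both forms, again trimmed down (N'Smith' is purely hypothetical; you would pick a representative value from your own data):

-- Pin the plan to a chosen, representative value:
SELECT TOP 100 CU.FirstName , CU.LastName
FROM    crm.dbo.Customer CU
WHERE   ( @LastName = '' OR CU.LastName LIKE @LastName + '%' )
OPTION  ( OPTIMIZE FOR ( @LastName = N'Smith' ) )

-- Or have the optimizer use average density statistics instead of sniffing:
SELECT TOP 100 CU.FirstName , CU.LastName
FROM    crm.dbo.Customer CU
WHERE   ( @LastName = '' OR CU.LastName LIKE @LastName + '%' )
OPTION  ( OPTIMIZE FOR UNKNOWN )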

If you have any more questions, let us know.

answered Jul 13, 2012 at 06:50 AM by WilliamD

Wow, that turned out a little longer than I expected!
Jul 13, 2012 at 06:50 AM WilliamD

@WilliamD +1, and I do not think the content bores the reader ;) I totally agree that OPTIMIZE FOR works better in most cases, but I do not think it would be as beneficial as it should be in this case. The reason is that this is a dynamic search: each parameter may or may not be supplied a value. So I would still feel that OPTION(RECOMPILE) has a higher probability of getting a better execution plan (unless the recompile cost proves to be too much). As always, testing would yield the final answer.

Moreover, although I hate dynamic SQL, it seems like a more viable solution than any other.

@jhowe An excellent article by Erland Sommarskog on such dynamic searches is here. You will find very useful information there and could optimize your search further.
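The core idea from that article, sketched here for just two of the parameters (a hypothetical cut-down; the article covers the full treatment): build only the predicates that were actually supplied, then run the string with sp_executesql so each distinct search shape gets its own well-fitted cached plan.

DECLARE @sql NVARCHAR(MAX) = N'
SELECT TOP 100 CU.FirstName , CU.LastName
FROM   crm.dbo.Customer CU
WHERE  1 = 1'

-- Append a predicate only when the caller supplied a value for it
IF @FirstName <> '' SET @sql = @sql + N' AND CU.FirstName LIKE @FirstName + N''%'''
IF @LastName <> ''  SET @sql = @sql + N' AND CU.LastName LIKE @LastName + N''%'''

EXEC sp_executesql @sql ,
    N'@FirstName NVARCHAR(50), @LastName NVARCHAR(50)' ,
    @FirstName , @LastName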

Jul 13, 2012 at 08:07 AM Usman Butt
Thanks everyone for looking at this. I'm definitely suffering from parameter sniffing. I just tested again this morning: through SSMS one query takes under a second; through the app it takes 30+ seconds. Let me describe the environment: it's a smallish DB, about 6 GB, but we are running what is effectively a small call centre, with the agents using the advanced search on a regular basis. My worry is that with the agents supplying random parameters I will run into trouble again. The server is an 8-core CPU with 8 GB RAM and is very underutilised, so I think it could easily take compiling this stored proc on the fly. The maximum number of times this procedure would be run in a minute is probably 5-10. My options are OPTIMIZE FOR vs OPTION(RECOMPILE) vs local variables. The problem is I don't have the data in a test environment, for security reasons, so I can't replicate this and test it; the decision I make has to be made on the LIVE environment. Opinions? @WilliamD how can I use OPTIMIZE FOR? I can't supply parameters, as it's dynamic and they will change all the time... I'm leaning towards OPTION(RECOMPILE) at the moment... @Usman that was a great article!
Jul 13, 2012 at 08:34 AM jhowe

@jhowe I guess the article link was posted by me ;)

Now back to the point: you have more than one problem, and you need to take care of all of them.

  1. Multiple plans. You must make sure that you have the same settings as the application. Before sending the request, the application sets up its environment, which can be seen in SQL Profiler.

  2. For OPTIMIZE FOR and OPTION(RECOMPILE) usage, you have to test according to your workload and data. But if you are only running it 5-10 times a minute, then I would tilt towards OPTION(RECOMPILE).

It happens that sometimes you do not have access to the real data, but that does not mean you should not do any testing. Build a test bed with test data. Do your testing with different small/huge data sets etc. Only then will you know what the best solution could be.
Jul 13, 2012 at 09:23 AM Usman Butt

@jhowe - With OPTIMIZE FOR you would look at your data spread and what the most common searches are, and hopefully find a set of values that makes a nice plan. Note that this method pushes towards an average but more consistent execution.

As you are only calling it 5-10 times a minute, I would agree with @Usman Butt and go for recompiling on each call. Just be careful to monitor the server when the change is made, to make sure you are not getting killed by recompiles.

The change to OPTION (RECOMPILE) is relatively safe and can be rolled back easily, so I wouldn't stress about the lack of testing too much (unlike a major refactoring).
Jul 13, 2012 at 09:56 AM WilliamD
Please run SQL Profiler and find out whether the query execution is the same or different. My feeling is that the query needs to be corrected. If you can reply back with the query execution times, for both SSMS and the EXE, with the same parameters and values, then we can proceed further.

answered Jul 13, 2012 at 07:33 AM by sqlmoorthy

So I've run Profiler on the server, and it is literally just executing EXECUTE CustomerAdvancedSearch '', 'howe', '', '', '', '', ''
Jul 13, 2012 at 11:47 AM jhowe
@jhowe - as the other answers already discussed, the issue is one of parameter sniffing, not a direct query syntax/design issue. Implementing the OPTION (RECOMPILE) solution should solve it.
Jul 13, 2012 at 11:57 AM WilliamD
@WilliamD I have implemented OPTION (RECOMPILE) and it is not even really touching the CPU. The query is running really fast, so I'm happy with that. Thanks!
Jul 16, 2012 at 09:05 AM jhowe