Question

jhowe asked:

Diagnose SQL Performance Issue

Hi all, I have an app that runs an advanced search query as below:

```sql
CREATE PROC [dbo].[CustomerAdvancedSearch]
    @FirstName   NVARCHAR(50) = '',
    @LastName    NVARCHAR(50) = '',
    @City        NVARCHAR(50) = '',
    @Postcode    NVARCHAR(20) = '',
    @Country     NVARCHAR(50) = '',
    @Name        NVARCHAR(50) = '',
    @DateOfBirth NVARCHAR(12) = ''
AS
SET NOCOUNT ON

--DECLARE @FirstName   NVARCHAR(50) = '',
--        @LastName    NVARCHAR(50) = '',
--        @City        NVARCHAR(50) = '',
--        @Postcode    NVARCHAR(10) = '',
--        @Country     AS NVARCHAR(2)  = '',
--        @Name        AS NVARCHAR(50) = '',
--        @DateOfBirth AS NVARCHAR(12) = ''

SELECT TOP 100
    ISNULL(CU.FirstName, 'Unknown') AS FirstName,
    ISNULL(CU.LastName, 'Unknown') AS LastName,
    ISNULL(CU.City, 'Unknown') AS City,
    ISNULL(CU.Postcode, 'Unknown') AS Postcode,
    CO.Name,
    CL.CompanyID,
    ISNULL(CN.Country, 'Unknown'),
    CU.CustomerID,
    CASE WHEN CU.PrimaryCustomer = '1' THEN 'Yes' ELSE 'No' END AS [Primary],
    ISNULL(CU.MiddleName, 'Unknown') AS MiddleName,
    ISNULL(REPLACE(CONVERT(NVARCHAR, DateOfBirth, 103), '/', '-'), 'Unknown') AS DateOfBirth,
    CASE WHEN CU.Active = '1' THEN 'Active'
         WHEN CU.Active = '0' THEN 'Inactive'
         ELSE 'Unknown'
    END AS ActiveStatus
FROM crm.dbo.Customer CU
JOIN crm.dbo.CompanyLevel CL ON CL.CompanyLevelID = CU.CompanyLevelID
JOIN crm.dbo.Company CO ON CO.CompanyID = CL.CompanyID
LEFT OUTER JOIN dbo.Country CN ON CN.CountryCode = CU.Country
WHERE (@FirstName = '' OR FirstName LIKE @FirstName + '%')
  AND (@LastName = '' OR LastName LIKE @LastName + '%')
  AND (@City = '' OR City LIKE @City + '%')
  AND (@PostCode = '' OR Postcode LIKE @PostCode + '%')
  AND (@Country = '' OR CN.Country = @Country)
  AND (@Name = '' OR CO.Name LIKE @Name + '%')
  AND (@DateOfBirth = '' OR @DateOfBirth = CONVERT(DATE, DateOfBirth, 103))
```

I've tuned this search query and it is as optimised as possible. I have looked to see what the app is running, and it is literally this stored proc with various parameters.
I have run the proc with the same parameters in SSMS and it is blazingly quick, sub-second. However when the app runs it, it is really slow or hangs. How do I rule out that the database is the issue? I'm not very familiar with the various diagnostic DMVs in SQL, or whether what I'm looking at is high, so could someone help me and I'll post stats etc.? Thanks!

I have run the following wait-stats query, which I found:

```sql
WITH Waits AS
(
    SELECT
        wait_type,
        wait_time_ms / 1000.0 AS WaitS,
        (wait_time_ms - signal_wait_time_ms) / 1000.0 AS ResourceS,
        signal_wait_time_ms / 1000.0 AS SignalS,
        waiting_tasks_count AS WaitCount,
        100.0 * wait_time_ms / SUM(wait_time_ms) OVER() AS Percentage,
        ROW_NUMBER() OVER(ORDER BY wait_time_ms DESC) AS RowNum
    FROM sys.dm_os_wait_stats
    WHERE wait_type NOT IN (
        'CLR_SEMAPHORE', 'LAZYWRITER_SLEEP', 'RESOURCE_QUEUE', 'SLEEP_TASK',
        'SLEEP_SYSTEMTASK', 'SQLTRACE_BUFFER_FLUSH', 'WAITFOR', 'LOGMGR_QUEUE',
        'CHECKPOINT_QUEUE', 'REQUEST_FOR_DEADLOCK_SEARCH', 'XE_TIMER_EVENT',
        'BROKER_TO_FLUSH', 'BROKER_TASK_STOP', 'CLR_MANUAL_EVENT', 'CLR_AUTO_EVENT',
        'DISPATCHER_QUEUE_SEMAPHORE', 'FT_IFTS_SCHEDULER_IDLE_WAIT',
        'XE_DISPATCHER_WAIT', 'XE_DISPATCHER_JOIN', 'BROKER_EVENTHANDLER',
        'TRACEWRITE', 'FT_IFTSHC_MUTEX', 'SQLTRACE_INCREMENTAL_FLUSH_SLEEP',
        'BROKER_RECEIVE_WAITFOR', 'ONDEMAND_TASK_QUEUE', 'DBMIRROR_EVENTS_QUEUE',
        'DBMIRRORING_CMD', 'BROKER_TRANSMITTER', 'SQLTRACE_WAIT_ENTRIES',
        'SLEEP_BPOOL_FLUSH', 'SQLTRACE_LOCK')
)
SELECT
    W1.wait_type AS WaitType,
    CAST(W1.WaitS AS DECIMAL(14, 2)) AS Wait_S,
    CAST(W1.ResourceS AS DECIMAL(14, 2)) AS Resource_S,
    CAST(W1.SignalS AS DECIMAL(14, 2)) AS Signal_S,
    W1.WaitCount AS WaitCount,
    CAST(W1.Percentage AS DECIMAL(4, 2)) AS Percentage,
    CAST((W1.WaitS / W1.WaitCount) AS DECIMAL(14, 4)) AS AvgWait_S,
    CAST((W1.ResourceS / W1.WaitCount) AS DECIMAL(14, 4)) AS AvgRes_S,
    CAST((W1.SignalS / W1.WaitCount) AS DECIMAL(14, 4)) AS AvgSig_S
FROM Waits AS W1
INNER JOIN Waits AS W2 ON W2.RowNum
-- query truncated in the original post
```

Tags: sql-server-2008, sql-server-2008-r2
capture.png (28.2 KiB)
8 comments


Kev Riley ♦♦ commented:
and I bet if you run the alter script (as above) the app suddenly reports quick run times?
3 Likes
Usman Butt commented:
I totally agree with Kev Riley. Such a search procedure cannot have one good plan for every scenario. It could run well for a few scenarios and be awful for others, depending on the data.
3 Likes
Usman Butt commented:
Have you tried the OPTION(RECOMPILE) hint?
2 Likes
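For readers unfamiliar with the hint, here is a trimmed-down sketch of where it would go in the posted procedure (only two of the seven parameters and three of the columns shown, joins omitted, purely for illustration):

```sql
ALTER PROC dbo.CustomerAdvancedSearch
    @FirstName NVARCHAR(50) = '',
    @LastName  NVARCHAR(50) = ''
    -- ...remaining parameters unchanged...
AS
SET NOCOUNT ON;
SELECT TOP 100 CU.CustomerID, CU.FirstName, CU.LastName
FROM crm.dbo.Customer CU
WHERE (@FirstName = '' OR CU.FirstName LIKE @FirstName + '%')
  AND (@LastName = '' OR CU.LastName LIKE @LastName + '%')
-- the hint attaches to the end of the statement:
OPTION (RECOMPILE);
```

With the hint in place the optimiser builds a fresh plan from the actual parameter values on every call, instead of reusing whatever plan the first call happened to produce.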
jhowe commented:
This is what the app is running: EXEC customeradvancedsearch with various parameters.
0 Likes
jhowe commented:
The DB isn't big, under 6 GB. I also reset the wait stats fairly recently with DBCC SQLPERF ('sys.dm_os_wait_stats', CLEAR);. @Usman, yes, but that is the issue: I'm running the EXACT same query as the app in SSMS and it runs sub-second. The app is HANGING. I can't figure it out...
0 Likes
Kev Riley answered:
I would take a punt that you've got bad plans, caused by parameter sniffing. One 'cure', as well as OPTION(RECOMPILE), is to copy the parameters into local variables and then use those in the script instead.
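A minimal sketch of the local-variable pattern described above, abbreviated from the posted procedure (only two parameters shown):

```sql
ALTER PROC dbo.CustomerAdvancedSearch
    @FirstName NVARCHAR(50) = '',
    @LastName  NVARCHAR(50) = ''
AS
SET NOCOUNT ON;
-- copy the parameters into local variables
DECLARE @lFirstName NVARCHAR(50) = @FirstName,
        @lLastName  NVARCHAR(50) = @LastName;
-- the optimiser cannot sniff the values of @lFirstName/@lLastName at
-- compile time, so it builds a generic plan instead of one tailored
-- to the first call's values
SELECT TOP 100 CU.CustomerID, CU.FirstName, CU.LastName
FROM crm.dbo.Customer CU
WHERE (@lFirstName = '' OR CU.FirstName LIKE @lFirstName + '%')
  AND (@lLastName = '' OR CU.LastName LIKE @lLastName + '%');
```

The trade-off: the generic plan comes from density estimates rather than the supplied values, so it is consistent but not necessarily the best plan for any given search.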
4 comments


jhowe commented:
I don't know what you mean by parameter sniffing, and what do you mean by local variables? :) There is a search screen in the application which is passing variables to the stored procedure.
0 Likes
Kev Riley ♦♦ commented:
Ah sorry! http://www.simple-talk.com/sql/t-sql-programming/parameter-sniffing/ Option 2 talks about using local variables - if you're still stuck I can mock up your proc....
0 Likes
Usman Butt commented:
+1. I would prefer OPTION(RECOMPILE), as the caveat with local variables is that the engine starts with a plan that estimates around 30 percent of the data would match the local variable. But as always, testing should give the final word. BTW, this was the same thing I thought of letting jhowe know, but only after he had digested the multiple execution plans ;)
0 Likes
jhowe commented:
I've used sp_recompile on the Customer table, which seems to have resolved the issue. How do I stop SQL from using a bad query plan in the future?
0 Likes
Usman Butt answered:
BTW, what is the output of the following?

```sql
SELECT qs.plan_handle, a.attrlist
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) est
CROSS APPLY (SELECT epa.attribute + '=' + CONVERT(nvarchar(127), epa.value) + ' '
             FROM sys.dm_exec_plan_attributes(qs.plan_handle) epa
             WHERE epa.is_cache_key = 1
             ORDER BY epa.attribute
             FOR XML PATH('')) AS a(attrlist)
WHERE est.objectid = OBJECT_ID('dbo.CustomerAdvancedSearch')
  AND est.dbid = DB_ID()
```

I guess there could be two plans for the same procedure.
4 comments


jhowe commented:
Sorry, didn't realise I had it on master.

plan_handle: 0x050005002ADB1A0D40018B00010000000000000000000000
attrlist: acceptable_cursor_options=0 compat_level=100 date_first=7 date_format=1 dbid=5 dbid_execute=5 is_replication_specific=0 language_id=0 merge_action_type=0 objectid=219863850 optional_clr_trigger_dbid=0 optional_clr_trigger_objid=0 optional_spid=0 required_cursor_options=0 set_options=251 status=0 user_id=1

plan_handle: 0x050005002ADB1A0D4001D8C3010000000000000000000000
attrlist: acceptable_cursor_options=0 compat_level=100 date_first=7 date_format=1 dbid=5 dbid_execute=5 is_replication_specific=0 language_id=0 merge_action_type=0 objectid=219863850 optional_clr_trigger_dbid=0 optional_clr_trigger_objid=0 optional_spid=0 required_cursor_options=0 set_options=4347 status=0 user_id=1
0 Likes
Usman Butt commented:
As I guessed, there are two plans because of different settings (note set_options=251 versus set_options=4347). Please make sure that your application has the same settings as your SSMS session: settings like ANSI_NULL_DFLT_ON, ANSI_NULLS, ANSI_PADDING, ANSI_WARNINGS, ARITHABORT, etc.
0 Likes
jhowe commented:
What options would make a difference performance-wise, and how can I see what options I have set, or what options the app server is using?
0 Likes
Usman Butt commented:
This can make a big difference, as the plan may not be efficient. You can run SQL Profiler to see the settings set by the application as well as by SSMS. For SSMS, you can also see these settings under Tools -> Options -> Query Execution -> Advanced/ANSI.
0 Likes
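As an alternative to a trace, a quick sketch (assuming you can query the server directly) that reads the relevant SET options per connection from `sys.dm_exec_sessions`, so the app's rows can be compared against the SSMS session:

```sql
-- One row per connected session; compare the app's settings to SSMS's.
SELECT session_id, program_name,
       ansi_nulls, ansi_padding, ansi_warnings, arithabort,
       quoted_identifier, concat_null_yields_null
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```

ARITHABORT is the usual suspect here: SSMS sets it ON by default while many client libraries leave it OFF, which is enough to force a separate cached plan.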
WilliamD answered:
@jhowe - It seems that you are experiencing what @Kev Riley has correctly identified as parameter sniffing. SQL Server decided on a plan the first time the sproc was called, and this plan is optimal only for searches with values very similar or identical to that first call's. When you issued sp_recompile, the plan that was in cache was kicked out, and the next call of the procedure was compiled afresh and ran really fast. I would bet that a subsequent call with completely different parameter values would result in really slow execution.

The idea that @Usman Butt offered up, `OPTION(RECOMPILE)`, will effectively stop SQL Server from caching a plan. Each call of the procedure will generate a one-time plan that fits exactly the parameters supplied, which gives you a much better chance of getting a good execution plan and faster execution. However, you have to consider the downside of compiling a plan each and every time the stored procedure is called. Compiling is not a trivial part of execution and can be quite taxing on the server, especially if the stored procedure is called very often. It is especially CPU intensive, and if you are calling the procedure hundreds of times a second you may well run into performance problems from compiling. You also need to consider that compiling may end up taking more time than execution with an "average" plan.

Another alternative to `OPTION(RECOMPILE)` is `OPTIMIZE FOR`, supplying values for each variable that make for a plan that is consistent in execution (guaranteeing an average execution rather than ultra-fast and ultra-slow), or even `OPTIMIZE FOR UNKNOWN`. These hints were designed to tackle the issue you are having and are a step up from `OPTION (RECOMPILE)`, which is a little bit of a sledgehammer solution to parameter sniffing.

If you have any more questions, let us know.
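A sketch of the hint syntax against one of the posted predicates; the value `N'Smith'` is purely hypothetical and would in practice be chosen from the real data spread:

```sql
-- Compile the plan as if @LastName were 'Smith' on every call:
SELECT TOP 100 CU.CustomerID, CU.FirstName, CU.LastName
FROM crm.dbo.Customer CU
WHERE (@LastName = '' OR CU.LastName LIKE @LastName + '%')
OPTION (OPTIMIZE FOR (@LastName = N'Smith'));

-- Or ignore the sniffed value entirely and use density statistics:
-- OPTION (OPTIMIZE FOR UNKNOWN);
```

Unlike `OPTION (RECOMPILE)`, both variants still cache a single reusable plan; they just control which assumptions that plan is built on.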
5 comments


Usman Butt commented:
@WilliamD +1, and I do not think the content makes you bored while reading ;) I totally agree that OPTIMIZE FOR works better in most cases, but I do not think it would be as beneficial as it should be here. The reason is that this is a dynamic search: each parameter may or may not be supplied a value. So I would still feel that OPTION(RECOMPILE) has more probability of getting a better execution plan (unless the recompile cost proves to be too much). As always, testing would give the final word. Moreover, although I hate dynamic SQL, it seems like a more viable solution than any other. @jhowe An excellent article by Erland Sommarskog on such dynamic searches is [here][1]. You will find very useful information there and could optimise your search further. [1]: http://www.sommarskog.se/dyn-search-2008.html
2 Likes
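For reference, a heavily abbreviated sketch of the dynamic-search pattern discussed in Erland Sommarskog's article (two predicates only; a real version would cover all seven parameters). The idea is to build only the predicates that were actually supplied, then run the parameterised string so each search shape gets its own cached plan:

```sql
-- Assumes this runs inside the procedure, where @FirstName/@LastName exist.
DECLARE @sql NVARCHAR(MAX) = N'
SELECT TOP 100 CU.CustomerID, CU.FirstName, CU.LastName
FROM crm.dbo.Customer CU
WHERE 1 = 1';

IF @FirstName <> '' SET @sql += N' AND CU.FirstName LIKE @FirstName + ''%''';
IF @LastName  <> '' SET @sql += N' AND CU.LastName LIKE @LastName + ''%''';

-- sp_executesql keeps the values as parameters, so plans stay sniffable
-- and safe from SQL injection.
EXEC sp_executesql @sql,
     N'@FirstName NVARCHAR(50), @LastName NVARCHAR(50)',
     @FirstName, @LastName;
```

Each combination of supplied parameters compiles to a distinct, well-fitted plan that is then reused, which avoids both the one-plan-fits-all problem and the per-call compile cost of `OPTION (RECOMPILE)`.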
WilliamD commented:
Wow, that turned out a little longer than I expected!
0 Likes
jhowe commented:
Thanks everyone for looking at this. I'm definitely suffering from parameter sniffing. I just tested again this morning: through SSMS one query takes under a second; through the app it takes 30+ seconds.

Let me describe the environment. It's a smallish DB, about 6 GB, but we are running what is effectively a small call centre, with the agents using the advanced search on a regular basis. My worry is that with the agents supplying random parameters I will run into trouble again. The server is an 8-core CPU with 8 GB RAM and is very underutilised, so I think it could easily take compiling this stored proc on the fly. The maximum number of times this procedure would be run in a minute is probably 5-10.

My options are OPTIMIZE FOR vs OPTION (RECOMPILE) vs local variables. The problem is I don't have the data in a test environment for security reasons, so I can't replicate this and test it; whatever decision I make has to be made on the LIVE environment. Opinions?

@WilliamD how can I use OPTIMIZE FOR? I can't supply parameters as it's dynamic; they will change all the time... I'm leaning towards OPTION (RECOMPILE) at the moment... @Usman that was a great article!
0 Likes
Usman Butt commented:
@jhowe I guess the article link was posted by me ;) Now back to the point: you have more than one problem, and you need to take care of all of them.

1. Multiple plans. You must make sure that you have the same settings as the application. Before sending the request the application sets up its environment, which can be seen in SQL Profiler.
2. For OPTIMIZE FOR versus OPTION(RECOMPILE), you have to test according to the workload and the data. If you are running it 5-10 times a minute, then I would tilt towards OPTION(RECOMPILE).

It happens that sometimes you do not have access to the real data, but that does not mean you should not do any testing. Make a test bed with test data, and do your testing with different small and huge data sets. Only then will you know what the best solution could be.
0 Likes
WilliamD commented:
@jhowe - With OPTIMIZE FOR you would look at your data spread and the most common searches, and hopefully find a set of values that make a nice plan. Note that this method pushes towards an average but more consistent execution. As you are only calling 5-10 times a minute, I would agree with @Usman Butt and go for recompiling on each call. Just be careful to monitor the server when the change is made, to make sure you are not getting killed by recompiles. The change to `OPTION (RECOMPILE)` is relatively safe and can be rolled back easily, so I wouldn't stress about the lack of testing too much (unlike a major refactoring).
0 Likes
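One way to do that monitoring from T-SQL rather than Perfmon is to read the compilation counters exposed by `sys.dm_os_performance_counters` (a sketch; note the values are cumulative since server start, so sample twice and take the difference):

```sql
-- Snapshot the compile/recompile counters before and after the change.
SELECT counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%SQL Statistics%'
  AND counter_name IN ('SQL Compilations/sec',
                       'SQL Re-Compilations/sec',
                       'Batch Requests/sec');
```

Comparing compilations against batch requests gives a rough sense of how much of the workload is paying the compile cost after `OPTION (RECOMPILE)` goes in.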
sqlmoorthy answered:
Please run SQL Profiler and check whether the query execution is the same or different. My feeling is that the query needs to be corrected. If you can reply back with the query execution time from both SSMS and the app with the same parameters and values, then we can proceed further.
3 comments


jhowe commented:
So I've run Profiler on the server and it is literally just executing EXECUTE CustomerAdvancedSearch '', 'howe', '', '', '', '', ''
0 Likes
WilliamD commented:
@jhowe - as the other answers already discussed, the issue is one of parameter sniffing and not a direct query syntax/design issue. Implementing the `OPTION (RECOMPILE)` solution should solve the issue.
0 Likes
jhowe commented:
@WilliamD I have implemented OPTION (RECOMPILE) and it is barely touching the CPU. The query is running really fast, so I'm happy with that. Thanks!
0 Likes
