I am working on an application used for tracking employee utilization, and during performance testing we have found that DB utilization is high in some scenarios.
E.g.: generating reports for unsubmitted tasks.
When the set of stored procedures involved in this scenario is executed separately, performance is good. But when they run from the application, DB utilization is high. Could someone please help me locate the exact issue that is causing this problem?
Thanks, Subheesh Paul
Answer by Grant Fritchey
I agree with the Apeman: setting up a server-side trace to capture the events as they come in is a good idea.
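The easiest way to build a server-side trace is to configure it in Profiler and script it out, but as a rough illustration (the event and column IDs below are the documented ones for RPC:Completed and SQL:BatchCompleted; the output path is just an example), a hand-written trace might look like:

```sql
-- Minimal server-side trace sketch: capture RPC:Completed (event 10) and
-- SQL:BatchCompleted (event 12) with TextData, Duration, Reads and CPU.
-- The file path is an example; point it at a drive the service account can write to.
DECLARE @traceid INT, @maxfilesize BIGINT = 50, @on BIT = 1;

-- Option 2 = TRACE_FILE_ROLLOVER; size is in MB.
EXEC sp_trace_create @traceid OUTPUT, 2, N'C:\Traces\utilization_trace', @maxfilesize;

-- Columns: 1 = TextData, 13 = Duration, 16 = Reads, 18 = CPU.
EXEC sp_trace_setevent @traceid, 10, 1,  @on;
EXEC sp_trace_setevent @traceid, 10, 13, @on;
EXEC sp_trace_setevent @traceid, 10, 16, @on;
EXEC sp_trace_setevent @traceid, 10, 18, @on;
EXEC sp_trace_setevent @traceid, 12, 1,  @on;
EXEC sp_trace_setevent @traceid, 12, 13, @on;
EXEC sp_trace_setevent @traceid, 12, 16, @on;
EXEC sp_trace_setevent @traceid, 12, 18, @on;

EXEC sp_trace_setstatus @traceid, 1;  -- start the trace
-- Later: sp_trace_setstatus @traceid, 0 to stop it, then 2 to delete the definition.
```

Running it against a file, rather than watching events in the Profiler GUI, keeps the observation overhead on the server low.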
Before you get that set up, though, you can start querying the dynamic management views to capture performance metrics about stored procedures:

- sys.dm_exec_query_stats shows aggregate performance metrics for the queries that have come into the system.
- sys.dm_exec_query_plan can be joined to the query stats to see the estimated execution plans and get some idea of how the queries are being executed within SQL Server.
- sys.dm_os_wait_stats shows what has been causing the most slowdown on the system.
- sys.dm_exec_requests, queried while queries are executing, shows which processes are running long, what they're waiting on, whether they're blocked, etc.
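As a starting point for the query-stats side of that, something like the following pulls the most expensive cached statements along with their plans (adjust the ORDER BY to total_logical_reads if I/O rather than CPU is the concern):

```sql
-- Top cached statements by total CPU, with text and estimated plan.
SELECT TOP (20)
    DB_NAME(st.dbid)                        AS database_name,
    OBJECT_NAME(st.objectid, st.dbid)       AS object_name,
    qs.execution_count,
    qs.total_worker_time,
    qs.total_logical_reads,
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text,
    qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)   AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY qs.total_worker_time DESC;
```

Note these numbers are aggregated since each plan entered the cache, so compare execution_count as well as the totals when deciding what is actually expensive.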
If you see large differences between the procedures executed from Management Studio and from the application, compare the connection settings between the two. ANSI settings can affect the execution plans generated, which absolutely affects performance.
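You can see the SET options in effect on each live connection through sys.dm_exec_sessions, which makes the SSMS-versus-application comparison straightforward:

```sql
-- Compare the SET options in effect per connection. Differing ARITHABORT or
-- ANSI_NULLS settings between SSMS and the application are a classic cause of
-- two different cached plans for the same procedure.
SELECT
    s.session_id,
    s.program_name,
    s.ansi_nulls,
    s.ansi_padding,
    s.arithabort,
    s.quoted_identifier
FROM sys.dm_exec_sessions AS s
WHERE s.is_user_process = 1;
```

If the settings differ, each connection gets its own cached plan, which is why the procedure can be fast from one client and slow from the other.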