question

askmlx121 asked

How many requests does the server receive per day and per hour, on average?

Hi,

1) Is it possible to run an analysis to find out how many requests the server receives per day and per hour, on average?

2) How many requests are hitting the machine? If anyone knows, please share a script, DMV, or article.
serveranalysis
2 comments

Just remember to click on the thumbs up for any answer that was helpful and to mark the most helpful answer by clicking on the check box.
Thank you for your valuable comments, Blackhawk-17 & Grant Fritchey.
askmlx121 answered
It works for me. Is it correct? Note: the original version waited only one second (`WAITFOR DELAY '000:00:01'`) but still divided by 10.0; the delay below is corrected to ten seconds so the divisor matches the sample window.

```sql
DECLARE @BRPS BIGINT;

-- Take a snapshot of the cumulative counter.
SELECT @BRPS = cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name LIKE 'Batch Requests/sec%';

-- Sample over a 10-second window.
WAITFOR DELAY '00:00:10';

-- Difference divided by the window length gives requests per second.
SELECT (cntr_value - @BRPS) / 10.0 AS [Batch Requests/sec]
FROM sys.dm_os_performance_counters
WHERE counter_name LIKE 'Batch Requests/sec%';
```

Multiply the per-second value by 3,600 for requests per hour, and the per-hour value by 24 for requests per day.
2 comments

You're making an assumption that the data you collected in that brief sample window is indicative of your constant load.
Yeah, those counters would only give you per-day or per-hour figures if you compared today's value with yesterday's. Then you can do the math to say that for that day the counter value is X.
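The comparison described above can be sketched in T-SQL. This is a minimal sketch under my own assumptions: the snapshot table name and schedule are hypothetical, and the counter is cumulative since the last SQL Server restart, so a restart between snapshots invalidates the diff.

```sql
-- Hypothetical snapshot table: run the INSERT once a day (e.g. from a
-- SQL Agent job), then diff consecutive rows to get requests per day.
CREATE TABLE dbo.BatchRequestSnapshots (
    captured_at DATETIME NOT NULL DEFAULT GETDATE(),
    cntr_value  BIGINT   NOT NULL
);

INSERT INTO dbo.BatchRequestSnapshots (cntr_value)
SELECT cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name LIKE 'Batch Requests/sec%';

-- Requests in each period = today's snapshot minus the previous one.
-- (A negative result means SQL Server restarted in between.)
SELECT
    curr.captured_at,
    curr.cntr_value - prev.cntr_value AS requests_in_period
FROM dbo.BatchRequestSnapshots AS curr
CROSS APPLY (
    SELECT TOP 1 p.cntr_value
    FROM dbo.BatchRequestSnapshots AS p
    WHERE p.captured_at < curr.captured_at
    ORDER BY p.captured_at DESC
) AS prev;
```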
Grant Fritchey answered
The very best way to do this is to use Extended Events, which let you capture each remote procedure call and SQL batch. You can then count exactly how many calls have been made against your server. [This is the introduction in the SQL Server Books Online][1]. If you are not on SQL Server 2008 or better (i.e., you are on 2005 or 2000), you'll have to use a server-side trace to capture the same information. [Gail Shaw's article on Simple-Talk covers this][2].

The second choice, if you are running 2005 or better, is to use the Dynamic Management Object [sys.dm_exec_query_stats][3], which shows you the aggregate of all calls against queries currently in the plan cache in memory. But as plans age out of cache due to memory pressure, recompile, or run with the recompile option, their statistics go away, which makes this a spotty method. It will still give you a reasonable measure of how many calls you have in a given day, just not a perfect one, and it won't correlate calls to individual users.

You can also use [performance monitor counters][4]. SQL Statistics: Batch Requests/sec will give you a good measure of load, and Transactions: Transactions shows the number of transactions going by. The only reason I don't list these first is that you can't correlate any individual call to a particular query or an individual user.

[1]: http://msdn.microsoft.com/en-us/library/bb630282(SQL.110).aspx
[2]: http://www.simple-talk.com/sql/performance/finding-the-causes-of-poor-performance-in-sql-server,-part-1/
[3]: http://msdn.microsoft.com/en-us/library/ms189741(v=SQL.90).aspx
[4]: http://msdn.microsoft.com/en-us/library/ff727783.aspx
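For the sys.dm_exec_query_stats route, a minimal sketch of the aggregate query might look like the following. The column choices and the TOP 20 cutoff are my own illustration, not taken from the answer; the caveat in the comments is the one described above.

```sql
-- Total executions recorded for plans currently in cache.
-- Caveat: this undercounts, because plans that have aged out of cache
-- or recompiled take their execution_count with them.
SELECT SUM(qs.execution_count) AS total_executions_in_cache
FROM sys.dm_exec_query_stats AS qs;

-- Per-statement breakdown, most-executed first.
SELECT TOP 20
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.execution_count DESC;
```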
