I have a slow-performing query, and I'm planning to take a step-by-step approach to tuning it. The query started performing slowly from 12/8 onward, which is the last time three of the seven statistics were updated.
The primary key statistic was last updated on 12/8, as was a statistic on a unique (non-clustered) column and the statistics on some non-unique, non-clustered columns; none of them have been updated since. The system statistic was last updated on 12/10.
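For reference, this is the kind of check used to find those dates: a sketch that lists every statistic on a table with its last update time (the table name `dbo.YourTable` here is a placeholder, not from the original post):

```sql
-- List all statistics on one table and when each was last updated.
-- Replace dbo.YourTable with the table involved in the slow query.
SELECT s.name AS stat_name,
       STATS_DATE(s.[object_id], s.stats_id) AS last_updated
FROM sys.stats AS s
WHERE s.[object_id] = OBJECT_ID('dbo.YourTable')
ORDER BY last_updated;
```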
The query I'm referring to (a stored procedure) used to run in 30-40 minutes and now takes 5+ hours. I'm thinking updating the statistics might help.
My question is: is it feasible to update the statistics on these columns during production hours, or is it typically recommended to update them later at night? I have never updated statistics before and am not familiar with how long it takes.
Thanks in advance for all replies.
asked Dec 18 '09 at 11:56 AM in Default
Usually I update all the statistics in the database, and I do this in off-hours. However, there are instances where the cost, if any, of updating statistics is far exceeded by the benefit it offers. For example, we have a rather poorly designed table. For all sorts of reasons the business won't let us fix it. But, due to its poor design, the stats go out of date in a matter of hours. We have a special job that runs every half hour, all day long, updating the stats on that one table. It does cause a slight performance hit, but the benefits (good execution plans) are well worth it.
So, in short, depending on your situation, yes, you can, and should, run the stats updates during production hours. Just be sure to monitor to see if you're causing problems or, conversely, if you're not updating enough.
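A minimal sketch of what such a stats update could look like, assuming a hypothetical table `dbo.YourTable` and statistic name `YourStatName` (none of these names are from the question):

```sql
-- Update all statistics on one table with a full scan (heavier but thorough):
UPDATE STATISTICS dbo.YourTable WITH FULLSCAN;

-- Or update a single statistic by name, sampling a fraction of the rows:
UPDATE STATISTICS dbo.YourTable YourStatName WITH SAMPLE 25 PERCENT;

-- Or refresh statistics for every table in the current database:
EXEC sp_updatestats;
```

During production hours a `SAMPLE` update is usually the lighter option; `FULLSCAN` reads the whole table and is better left for off-hours on large tables.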
answered Dec 18 '09 at 12:02 PM
Grant Fritchey ♦♦
Great answer from Grant (as always)! I just want to make sure your performance issue really is caused by statistics. Yes, statistics are important for performance, no doubt about that, and you say you are not familiar with updating statistics; how about index fragmentation? Are you familiar with that? And what about T-SQL tuning?
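To check fragmentation, something like this sketch works (again with a placeholder table name; `LIMITED` mode keeps the scan cheap):

```sql
-- Check fragmentation of all indexes on one table.
SELECT i.name AS index_name,
       ps.avg_fragmentation_in_percent,
       ps.page_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('dbo.YourTable'), NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
  ON i.[object_id] = ps.[object_id] AND i.index_id = ps.index_id
ORDER BY ps.avg_fragmentation_in_percent DESC;
```

A common rule of thumb is to reorganize indexes above roughly 10% fragmentation and rebuild those above roughly 30%, though the right thresholds depend on your workload.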
I haven't seen your stored procedure, and I have no clue about the size of your database, but if you have a well-optimized procedure and an optimized database design, then from my experience you have a large database with 100,000,000+ records in the tables involved. If you don't have a large database, then I recommend you look at your stored procedure and your database design.
Sometimes just a small change in the code or a new index can make a procedure that took 30 minutes run in 30 seconds instead; I have done it many times.
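As an illustration of the "new index" kind of fix, a covering index sketch (all names here are hypothetical, standing in for whatever your query actually filters and reads):

```sql
-- Hypothetical covering index: key on the column the WHERE clause filters on,
-- with the columns the SELECT list reads in INCLUDE, so the query can be
-- answered from the index alone and avoids key lookups into the base table.
CREATE NONCLUSTERED INDEX IX_YourTable_FilterCol
    ON dbo.YourTable (FilterCol)
    INCLUDE (ReadCol1, ReadCol2);
```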
A procedure that goes from 30 minutes to 5 hours without any schema changes may depend on:
"Not planning for performance is planning for disaster"