What are the cases when one should run sp_recompile?
When there is a large change in the data that could affect the accuracy of the statistics that guide the optimiser's decisions when it creates a query plan. If a mass update or insertion of data takes place after a procedure's query plan has been generated, that plan could be non-optimal; running sp_recompile for the affected tables will cause the optimiser to create a new plan.
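As a minimal sketch (the table name `dbo.Orders` is a placeholder, not from the question), marking a table for recompilation looks like this:

```sql
-- Passing a table name marks every stored procedure and trigger that
-- references the table for recompilation on its next execution.
-- 'dbo.Orders' is a placeholder name.
EXEC sp_recompile N'dbo.Orders';
```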
Even without large-scale spot changes to the tables, you might see incremental changes over time that will affect the statistics on a table. You might need to schedule sp_recompile just occasionally, depending on the system requirements, load, execution plans, etc. But in this case it's usually enough to simply update the statistics, and any procedures that use those statistics will be marked for recompile the next time they execute.
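For example, refreshing the statistics on a table that has drifted (again, the table name is a placeholder) might look like:

```sql
-- Rebuild the table's statistics from a full scan rather than a sample;
-- cached plans that depend on these statistics are invalidated and
-- recompiled on their next use. 'dbo.Orders' is a placeholder name.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;
```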
For tables where we know there is a lot of activity, we do the following:
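The original list of steps didn't survive here; based on the description that follows, a typical routine for a busy table might run along these lines (table name is a placeholder, and the exact options would depend on your environment):

```sql
-- 1. Rebuild the table's indexes; this also rebuilds the index
--    statistics with the equivalent of a full scan.
ALTER INDEX ALL ON dbo.Orders REBUILD;

-- 2. Refresh any remaining (non-index) statistics on the table.
UPDATE STATISTICS dbo.Orders;

-- 3. Force the procedures that touch the table to be recompiled
--    the next time they run.
EXEC sp_recompile N'dbo.Orders';
```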
So the indexes should be in the best possible state, the query optimiser knows what state they're in, and the sprocs will definitely be recompiled the next time they're used, to take advantage of all this.
Not sure how necessary this is in SQL 2008, but we definitely found it worthwhile in SQL 2000.