We have a strategy here of reporting off a replica of the live database. The idea is that when something changes on live, it's usually reflected in reporting within a few seconds. I'm not sure this is the best strategy, and it will become less appealing as time goes on. A few of our reports are starting to run very slowly because the data are optimized for transactions, not analysis. I'd like to move a lot of our reporting to cubes, but there's a perception here that there is too much latency between the cubes and the source transactional data (say, a day to a week).
The reality is that for 90% of reporting purposes here, nobody needs the reporting data to be current to the minute, but real-time reporting is a sacred cow that they won't relinquish. I'm never going to win that argument. Where I might have some leverage, though, is if I could say that the cubes will be refreshed with minimal overhead every hour or two, say, instead of once a week or overnight. Where can I get some good doco on how to reduce the lag between the cube and the live data? What are the tradeoffs associated with keeping cubes current? What are your experiences?
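For illustration, the sort of thing I'm hoping is feasible: partition the measure groups by date so that recent rows land in one small partition, then reprocess only that partition on a frequent schedule. A minimal XMLA sketch (all the object IDs here are made up):

<!-- Reprocess only the small "current" partition; the database, cube,
     measure group and partition IDs are placeholders -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDW</DatabaseID>
    <CubeID>Sales</CubeID>
    <MeasureGroupID>Fact Sales</MeasureGroupID>
    <PartitionID>Fact Sales Current</PartitionID>
  </Object>
  <Type>ProcessFull</Type>
</Process>

Something like that could run every hour or two from a SQL Agent job step of type "SQL Server Analysis Services Command", with the full cube still processed overnight.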
GPO, keeping cubes current is the bane of most DBAs/devs who have to look after a BI implementation.
You say that reporting currently runs against the OLTP system and this is causing performance problems. Is there anything that would stop you from setting up transactional replication to push all the required tables to another machine? That would let you offload the reporting, freeing up the OLTP system, and you could add indexes that support reporting on the replication target.
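As a rough sketch, the publisher side looks something like this (it assumes a distributor is already configured and the database has been enabled for publication via sp_replicationdboption; the publication, table, and index names are placeholders):

-- On the publisher: create a continuous transactional publication
-- and add the tables the reports need.
EXEC sp_addpublication
    @publication = N'ReportingPub',
    @repl_freq   = N'continuous',
    @status      = N'active';

EXEC sp_addarticle
    @publication   = N'ReportingPub',
    @article       = N'Orders',
    @source_owner  = N'dbo',
    @source_object = N'Orders';

-- Point it at the reporting server with sp_addsubscription /
-- sp_addpushsubscription_agent, then on the subscriber add the
-- indexes that would be too expensive on the OLTP side:
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
    ON dbo.Orders (OrderDate)
    INCLUDE (CustomerID, TotalDue);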
We use this setup quite successfully, but we're looking at the next release of SQL Server for further improvements through readable database mirrors.
answered Dec 15, 2010 at 10:38 PM