Currently we are using SQL Server 2005 in production, and we also have a testing server with SQL Server 2008 R2. The same set of reports is deployed on both servers. The problem I am seeing is that some of the complex reports run slower on the 2008 R2 server, and the main difference is in processing time (the TimeProcessing column of the ExecutionLog table).
One report I am running has a single dataset whose query returns around 86K rows; the report has 9 matrices (each with a Sum aggregation) and a chart.
2005 Server: TimeDataRetrieval: 6 sec, TimeProcessing: 51 sec, TimeRendering: 0 sec
2008 R2 Server: TimeDataRetrieval: 13 sec, TimeProcessing: 234 sec, TimeRendering: 0 sec
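For reference, the timings above come from the execution log in the ReportServer catalog database. A query like the following (run against each instance's ReportServer database) pulls the same three columns; note that ExecutionLog is a table in SSRS 2005 and a compatibility view in 2008 R2, and the values are in milliseconds:

```sql
-- Run against the ReportServer catalog database on each instance.
SELECT TOP 20
       TimeStart,
       TimeDataRetrieval,   -- ms spent executing the dataset query
       TimeProcessing,      -- ms spent grouping, aggregating, and laying out
       TimeRendering        -- ms spent producing the output format
FROM   dbo.ExecutionLog
ORDER BY TimeStart DESC;
```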
Note: The index structure is not the same on both servers, which is why there is a difference in data retrieval. I am not worried about that for now.
Here is the server comparison:
2005 Server: Quad-Core AMD Opteron, Memory: 16 GB, OS: Windows Server 2008
2008 R2 Server: Quad-Core AMD Opteron, Memory: 32 GB, OS: Windows Server 2008 R2
The size of ReportServerTempDB is almost the same (~600 MB) on both, and both servers are used for reporting only.
I have Globals!TotalPages in the report, so there is no on-demand processing on the 2008 R2 server.
So my question is: what am I missing here? Why is there such a huge difference in processing time? I know I could do all the aggregation on the SQL side and reduce the number of rows; my only concern is the difference in processing time between the two servers for the same report.
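(For completeness, the SQL-side workaround I mentioned would look something like the sketch below. The table and column names here — dbo.Sales, Region, SaleDate, Amount — are hypothetical placeholders, not my actual schema; the point is just that a GROUP BY collapses the 86K detail rows before SSRS ever has to process them.)

```sql
-- Hypothetical sketch: aggregate in T-SQL instead of in the report's matrices,
-- so SSRS receives pre-summarized rows rather than 86K detail rows.
SELECT   Region,
         DATEPART(month, SaleDate) AS SaleMonth,
         SUM(Amount)               AS TotalAmount
FROM     dbo.Sales
GROUP BY Region, DATEPART(month, SaleDate);
```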
asked Jun 02 '10 at 03:41 PM in Default