Implications of a dynamically created table with a large number of users
I am creating the data for a report with 19 sets of entities that require pivoting by day to produce a result that I can send to an ASP page. A start date and an end date are given, so as you can imagine the columns need generating dynamically. So far I have this:

```sql
DECLARE @QueryText varchar(max)   -- was missing from the original snippet
DECLARE @ColumnSchema varchar(max)
DECLARE @StartDate datetime
DECLARE @EndDate datetime

SET @StartDate = '2010-10-01'
SET @EndDate = '2010-10-16'

SELECT @ColumnSchema = COALESCE(@ColumnSchema + ', '
                                    + QUOTENAME(CONVERT(varchar(10), ReportDate, 120)) + ' varchar(max)',
                                '' + QUOTENAME(CONVERT(varchar(10), ReportDate, 120)) + ' varchar(max)')
FROM AgentDetailETLTest
WHERE ReportDate >= @StartDate
  AND ReportDate < @EndDate + 1
ORDER BY ReportDate

SET @QueryText = 'CREATE TABLE AgentDetailTempResult ([EntityName] varchar(max), ' + @ColumnSchema + ')'
EXEC(@QueryText)
```

This generates a table using the same parameters that I am going to pivot my data with. However, I would prefer it to be a table variable or a temporary table so that it is handled in memory; otherwise I am going to have problems when a large number of people start running the script, with the same table being dropped and recreated all the time. I then use the following code 19 times to create the different entity lines (the snippet is truncated here):

```sql
SET @QueryText = 'SELECT ''Percentage Free'', ' + @ColumnNames + '
                  FROM (
                      SELECT ReportDate,
                             CONVERT(decimal(3, 2), PercentageFree) [PercentageFree]
                      FROM AgentDetailETLTest
                      WHERE ReportDate >= ''' + CONVERT(varchar(10), @StartDate, 120) + '''
                        AND ReportDate
```
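One caveat worth knowing if you do move to a `#temp` table: a temporary table created inside `EXEC()` is dropped as soon as that dynamic batch ends, so it has to be created, populated, and read within the same dynamic batch. A minimal sketch of that pattern, with a hard-coded column list and dummy row standing in for the dynamically built ones above:

```sql
DECLARE @QueryText varchar(max)
DECLARE @ColumnSchema varchar(max)

-- Illustrative stand-in for the @ColumnSchema built from ReportDate above
SET @ColumnSchema = '[2010-10-01] varchar(max), [2010-10-02] varchar(max)'

-- Create, fill, and select from the #temp table in ONE dynamic batch,
-- because the table no longer exists once EXEC() returns.
SET @QueryText = '
    CREATE TABLE #AgentDetailTempResult ([EntityName] varchar(max), ' + @ColumnSchema + ');
    INSERT INTO #AgentDetailTempResult ([EntityName], [2010-10-01], [2010-10-02])
    VALUES (''Percentage Free'', ''0.50'', ''0.75'');
    SELECT * FROM #AgentDetailTempResult;'
EXEC(@QueryText)
```

Because `#temp` tables are scoped per session, this also removes the concurrency problem: each user gets their own copy rather than fighting over one shared permanent table.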
I can see why you are using a dynamic pivot, but I can't see why you are putting this information into a table. Is there anything preventing you from using the metadata available with `SqlDataReader` to build your ASP page report without an intermediate table? That is, instead of `INSERT AgentDetailTempResult EXEC(@QueryText)`, just `EXEC(@QueryText)` and interpret the result set directly, rather than reading it back from a table.
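On the SQL side that change is tiny: the dynamic pivot query is executed as-is and its result set goes straight back to the client, where the reader's column metadata (`FieldCount`, `GetName(i)`) describes the date columns. A sketch, with the pivot body abbreviated since the question's snippet is truncated:

```sql
DECLARE @QueryText varchar(max)

-- Same dynamic pivot as in the question, body elided for illustration.
-- No CREATE TABLE / INSERT: the SELECT itself is the report's result set.
SET @QueryText = 'SELECT ''Percentage Free'', ' + @ColumnNames + '
                  FROM ( /* ... pivot source as in the question ... */ ) src'
EXEC(@QueryText)
```

Nothing is created or dropped per run, so concurrent users simply each receive their own result set.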