Temp table vs. Data Flow task on a physical table

Here is the scenario:

I have a CSV file as my source, which I load into a physical staging table. Later in the package I perform transformations on that staging data, and at that point I need the data fresh (exactly as it came from the source).

Should I do the transformations in a temp table, or should I use another Data Flow task to reload the staging table?

The data volume is small, just under a million rows.


asked Aug 31, 2012 at 04:47 PM in Default

1 answer

The better practice is to use a permanent staging table when you load from a file. You can then perform all of your transformations against the data in that table. Also, when there is a failure, you don't have to load the data from the file again; you can restart from where you left off (a recovery point). One million rows is not huge for a temp table, but you never know what other processes are using tempdb.
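A minimal T-SQL sketch of the staging-table pattern this answer describes. The table and column names (dbo.stg_SourceData, Id, Amount) are hypothetical, not from the question; the point is that the staged copy persists across failures, so the transform step can be rerun without re-reading the CSV:

```sql
-- Hypothetical permanent staging table (names are illustrative).
IF OBJECT_ID('dbo.stg_SourceData') IS NULL
    CREATE TABLE dbo.stg_SourceData (
        Id       INT           NOT NULL,
        Amount   DECIMAL(18,2) NULL,
        LoadedAt DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
    );

-- Step 1: truncate only immediately before a fresh file load
-- (the Data Flow task's destination is this table).
TRUNCATE TABLE dbo.stg_SourceData;
-- ... Data Flow task inserts the CSV rows here ...

-- Step 2: transform in place against the staging table.
-- If this step fails, restart from here; the staged rows
-- are the recovery point, so the CSV need not be reloaded.
UPDATE dbo.stg_SourceData
SET    Amount = ROUND(Amount, 2)
WHERE  Amount IS NOT NULL;
```

By contrast, a #temp table lives in tempdb and disappears when the session ends, so a package failure after the load would force a full reload from the file.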


answered Dec 16, 2016 at 06:44 PM







Copyright 2018 Redgate Software.