Here is the scenario:

I have a CSV file as my source, which I load into a physical staging table. Later in the package I run transformations on the staging table data, but at that point I need the data fresh, exactly as it came from the source.

Should I do the transformations in a temp table, or should I use another Data Flow task to reload the staging table?

The data isn't large, just under a million rows.
The better practice is to use a staging table when you load from a file. You can then do all the transformations on the data in that table. Also, when there is a failure you don't have to load the data from the file again; you can restart from where you left off (a recovery point). One million rows is not huge for a temp table, but you never know what other processes are using tempdb.
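To make the recovery-point idea concrete, here is a minimal sketch of the pattern using Python's `sqlite3` as a stand-in database (in SSIS this would be a Data Flow task loading a SQL Server table; the table names `stg_orders` and `wrk_orders` are hypothetical). The staging table keeps the raw file data untouched, and the transformations run in a separate work table, so a failed transformation can restart from staging without re-reading the source file:

```python
import sqlite3

# In-memory database for the sketch; a real package would use SQL Server.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# 1. Staging table: holds the raw CSV rows exactly as loaded.
#    This is the recovery point, so it is never modified afterwards.
cur.execute("CREATE TABLE stg_orders (id INTEGER, amount TEXT)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?)",
    [(1, " 10.50 "), (2, "7.25")],
)

# 2. Transformations go into a separate work table, leaving the
#    staging data pristine. If this step fails, restart from here
#    instead of reloading the file.
cur.execute("""
    CREATE TABLE wrk_orders AS
    SELECT id, CAST(TRIM(amount) AS REAL) AS amount
    FROM stg_orders
""")

rows = cur.execute("SELECT id, amount FROM wrk_orders ORDER BY id").fetchall()
print(rows)  # [(1, 10.5), (2, 7.25)] -- staging rows remain raw text
```

The same separation works whether the second table is a permanent work table or a temp table; a permanent table just avoids contention in tempdb.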
answered Dec 16, 2016 at 06:44 PM