All, I currently have a few tables which are linked as follows:

    tableA (Aid int identity(1,1) primary key, Startdate date, key1 varchar(50))
    tableB (Bid int identity(1,1) primary key, Aid int, attribute1 varchar, attribute2 varchar)
    tableC (Cid int identity(1,1) primary key, Bid int, value)

Note that tableA, tableB, and tableC are all related to each other with one-to-many relationships.

Currently I have a manual process — manual in the sense that I manually create the staging tables and manually execute the queries to load data into each of the tables while managing the identity keys. This was done in pure SQL code because it was a one-time load. But the new requirement says that the data needs to be synced in every day, with the target being the same structure. That means I need to automate this solution. How can I automate it, and how can I dynamically create the staging tables based on the date?

I am currently joining all three tables and dumping the result into a table, and that is used as my source. A sample of the code I am currently using:

    DECLARE @IDENT_GEN INT = (SELECT IDENT_CURRENT('DATABASE.dbo.TableA'));

    SELECT @IDENT_GEN, [date], ...
    INTO stage1.[table]
    FROM sourcetable
    WHERE sourcetable.[date] = CAST(GETDATE() AS date);

I then load the three tables in the same fashion, joining with the source table and also with the staging tables. How can I automate this scenario, and how can I dynamically create those stage tables? I am not sure if I should use temp tables, because if I automate it there could be too many rows (around a million) to process in memory. Any ideas? Thanks
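One way to sidestep reading `IDENT_CURRENT` (which is unsafe if anything else inserts concurrently) is to let the database hand back the generated keys with an `OUTPUT` clause. The sketch below is only illustrative — `dbo.SourceTable`, its `LoadDate` column, and the assumption that `key1` uniquely identifies a tableA row are all hypothetical stand-ins for your real schema. An insert-only `MERGE` is used because a plain `INSERT ... OUTPUT` cannot reference source columns, and you need both the new `Aid` and the source key to build the mapping:

```sql
-- Sketch only: SourceTable, LoadDate, and the key1-uniqueness assumption
-- are illustrative, not taken from the question.
DECLARE @AMap TABLE (Aid int, key1 varchar(50));

-- Insert-only MERGE: ON 1 = 0 never matches, so every source row inserts,
-- and OUTPUT can capture the generated identity alongside the source key.
MERGE INTO dbo.tableA AS tgt
USING (SELECT DISTINCT Startdate, key1
       FROM dbo.SourceTable
       WHERE LoadDate = CAST(GETDATE() AS date)) AS src
ON 1 = 0
WHEN NOT MATCHED THEN
    INSERT (Startdate, key1) VALUES (src.Startdate, src.key1)
OUTPUT inserted.Aid, src.key1 INTO @AMap (Aid, key1);

-- tableB picks up the generated Aid values by joining the captured map;
-- the same OUTPUT pattern repeats for the tableB -> tableC step.
INSERT INTO dbo.tableB (Aid, attribute1, attribute2)
SELECT m.Aid, s.attribute1, s.attribute2
FROM dbo.SourceTable AS s
JOIN @AMap AS m ON m.key1 = s.key1
WHERE s.LoadDate = CAST(GETDATE() AS date);
```

With this shape there is no manual identity bookkeeping at all, so the same script can run unattended every day regardless of what the current identity values are.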
I would say the easiest thing would be to use an SSIS package for this task. You can set a variable to grab the current date, or enter the date you want to run the package against each time, then add an Execute SQL Task to execute the T-SQL code you use in the manual process.

With the large number of rows, and since this will be a repeated process, you can probably create static tables for your staging, if the data types and columns will be the same every time. That should help keep performance under control rather than letting it get out of hand with per-run table creation.

Another option would be to adjust your code to run every time as a SQL Agent job, but I believe SSIS would handle it better and provide somewhat better logging.
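To make the "static staging table, parameterized by date" idea concrete, the stored procedure below is a rough sketch of what the Execute SQL Task (or a SQL Agent job step) could call once per day. Every object name here — `dbo.usp_DailySync`, `stage.SourceDaily`, `dbo.SourceTable`, and the `LoadDate` column — is a hypothetical placeholder, not the poster's real schema:

```sql
-- Hypothetical daily-sync wrapper; all names are illustrative assumptions.
CREATE OR ALTER PROCEDURE dbo.usp_DailySync
    @LoadDate date = NULL   -- pass a date from SSIS, or default to today
AS
BEGIN
    SET NOCOUNT ON;
    SET @LoadDate = COALESCE(@LoadDate, CAST(GETDATE() AS date));

    BEGIN TRANSACTION;

    -- Static staging table: truncate and reload the same structure each
    -- run instead of dynamically creating a date-named table every day.
    TRUNCATE TABLE stage.SourceDaily;

    INSERT INTO stage.SourceDaily (Startdate, key1, attribute1, attribute2, value)
    SELECT Startdate, key1, attribute1, attribute2, value
    FROM dbo.SourceTable
    WHERE LoadDate = @LoadDate;

    -- ... the three INSERTs into tableA / tableB / tableC go here,
    -- exactly as in the manual process ...

    COMMIT TRANSACTION;
END;
```

Keeping the date as a parameter also gives you a free rerun mechanism: if one night's load fails, you execute the procedure again with that date instead of editing any table names.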