Hi all,

We have a front-end server with Management Studio and SSIS installed. I use this server to connect to three SQL servers (Live, Live_Replicated and Reporting). The Live_Replicated and Reporting servers are on different subnets, but the front-end server can access them both.

Every day, after data from Live is replicated to the Live_Replicated server, I run an SSIS package to sync tables between Live_Replicated and the Reporting server. The problem I have is that the Transactions table gets about 400 000 new transactions daily and takes too long to import via the SSIS package. So instead we generate an INSERT script, save it to a .sql file directly on the Live_Replicated server, then open the file on the Reporting server and run it there, inserting in batches of 50 000. Below is an example of the script:

EXEC usp_INS_Transactions 246248974,'2013-10-18 12:29:28.963',72959,4464,5,19,20.00,22,73505,6,16
EXEC usp_INS_Transactions 246248975,'2013-10-18 12:29:29.087',69135,4464,5,28.5,30.00,22,69680,3,16

Is there any way this can be added to the SSIS package, so the transactions don't have to be inserted into the Reporting database manually? Any ideas will be appreciated. Thanks
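To show what the batching above amounts to, here is a hypothetical sketch (in Python, just for illustration) of splitting the generated EXEC statements into batches of 50 000 by emitting a GO separator after each batch, so every batch is submitted on its own. The function name and the abbreviated EXEC text are illustrative, not from our actual generator:

```python
BATCH_SIZE = 50_000

def batch_script(statements, batch_size=BATCH_SIZE):
    """Return the statements with a 'GO' after every `batch_size` lines,
    so each 50 000-row batch runs as its own batch on the server."""
    out = []
    for i, stmt in enumerate(statements, start=1):
        out.append(stmt)
        if i % batch_size == 0:
            out.append("GO")
    # Close the final partial batch, if any.
    if out and out[-1] != "GO":
        out.append("GO")
    return out

# 120 000 statements -> two full batches plus one partial batch.
lines = batch_script(
    [f"EXEC usp_INS_Transactions {246248974 + i},..." for i in range(120_000)]
)
print(lines.count("GO"))  # 3 batches
```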
400 000 records is no problem for SSIS. I would recommend using the Bulk Insert task in SSIS and configuring a batch size of 50 000. If you really want to increase the speed in SSIS, you can use a Script Component as a "producer" and multiple Script Components as "consumers" that run in parallel. If you want to know more about the producer/consumer technique, you could tweet @sqlrunner; he gave an excellent presentation about this at PASS Summit 2013.
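To make the producer/consumer idea concrete, here is a minimal sketch of the pattern in plain Python (in SSIS the same idea would live in .NET Script Components). The batch size, consumer count and queue depth are illustrative assumptions, and the "insert" is simulated:

```python
import queue
import threading

BATCH_SIZE = 50_000
NUM_CONSUMERS = 4
ROW_COUNT = 400_000           # roughly the daily transaction volume

work = queue.Queue(maxsize=8)  # bounded queue throttles the producer
inserted = []
lock = threading.Lock()

def producer():
    """Read the source in batches and hand each batch to the queue."""
    for start in range(0, ROW_COUNT, BATCH_SIZE):
        batch = list(range(start, min(start + BATCH_SIZE, ROW_COUNT)))
        work.put(batch)
    for _ in range(NUM_CONSUMERS):
        work.put(None)         # one stop sentinel per consumer

def consumer():
    """Each consumer processes batches in parallel with the others."""
    while True:
        batch = work.get()
        if batch is None:
            break
        # A real consumer would bulk-insert `batch` into the target table.
        with lock:
            inserted.extend(batch)

threads = [threading.Thread(target=producer)]
threads += [threading.Thread(target=consumer) for _ in range(NUM_CONSUMERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(inserted))  # all 400 000 rows accounted for
```

The bounded queue keeps the producer from running too far ahead of the consumers, which is the same back-pressure you get from buffer limits in an SSIS data flow.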