Dynamically change connection data prior to a Data Flow Task
Hi,

At first it sounded easy... doesn't it always? I want to perform the database task of importing and exporting tables within an SSIS project. I prompt the user for the database name, table name, and direction; at that point I am trying to dynamically change the connection information of a Data Flow Task. I need to export the table to Excel, and then import the Excel file back into the table. Since I can do this easily from SSMS, I figured it would be just as easy to create a package. Whoa nelly, was I wrong! I tried to use expressions on the source of the data flow, but no luck there. Any suggestions? (I am hoping to stay away from a script and use the Data Flow Task; I need the column headers.)

My goal is to have a package that a colleague can execute, entering the DB, table, and direction, so that colleague can update the table and then import it back into the DB. We are on the same server, but different DBs and tables need manual updating, which means different numbers of columns to map as well.

Appreciate any help suggested!

paws27284
First, regarding connections: you can easily make the connection to SQL Server dynamic by using an expression for the connection string. Just check that you have not set **`RetainSameConnection`** to True; it needs to be set to False.

The second part concerns multiple tables with different columns. This cannot be handled dynamically in a Data Flow Task. A data flow always has a fixed set of columns mapped from source to destination, and the metadata are validated at run-time. The only way to get truly dynamic columns is to script the transformation completely. However, if you have tables with several different structures, you can design multiple data flows (one per table structure) and then decide in the control flow which one will be executed. Alternatively, you can create a child package for each table structure and a master package which executes the appropriate child package based on the table structure.
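As a rough sketch, the expression on the `ConnectionString` property of the OLE DB connection manager, and the precedence constraint expression that routes control flow to the matching data flow, could look something like this (the variable names `User::DatabaseName` and `User::TableName`, the server name, and the provider are assumptions — substitute your own):

```
// Expression on the ConnectionString property of the OLE DB connection manager
// (User::DatabaseName is a package variable filled in by the user):
"Data Source=MyServer;Initial Catalog=" + @[User::DatabaseName] +
";Provider=SQLNCLI11.1;Integrated Security=SSPI;"

// Expression on a precedence constraint, to execute only the data flow
// (or Execute Package Task) that matches the chosen table structure:
@[User::TableName] == "Customers"
```

Set the precedence constraint's evaluation operation to "Expression" so that only the branch whose expression evaluates to True runs.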