In a data flow task, around 10 million rows are fetched from a staging table by a query, and after some data conversion the records are loaded into another table.
The following errors are encountered at the source:
[SSIS.Pipeline] Error: Unspecified error
[OLE DB Source ] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
The server is a 64-bit machine with 8 GB RAM.
When the query is limited to TOP 20000 rows, the data flow task works fine.
Please suggest a solution.
asked Sep 23, 2011 at 10:01 AM in Default
What are the DefaultBufferMaxRows and DefaultBufferSize values set on the data flow task?
As always, the error messages are vague, but the problem can often be fixed by tweaking the values above.
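If you want to experiment without reopening the package in the designer, you can override these properties at run time with dtexec's /Set option. This is a sketch only: "Package.dtsx" and "Data Flow Task" are placeholder names (substitute your own package and task), and the values shown are starting points, not recommendations. No test is attached because the command needs an installed SSIS runtime.

```shell
REM Windows command prompt. Override the data flow task's buffer
REM properties for a single run: fewer rows per buffer (5000) and
REM an explicit 10 MB buffer size (10485760 bytes).
dtexec /File "Package.dtsx" ^
  /Set "\Package\Data Flow Task.Properties[DefaultBufferMaxRows]";5000 ^
  /Set "\Package\Data Flow Task.Properties[DefaultBufferSize]";10485760
```

If lowering DefaultBufferMaxRows makes the package complete, the failure is very likely memory pressure in the pipeline rather than a data problem.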
answered Sep 25, 2011 at 09:53 PM
Could it be related to memory problems?
You wrote that you have 8 GB of RAM, but how much of it is free when the SSIS package executes?
When a data flow is executed, buffers are allocated based on DefaultBufferMaxRows and DefaultBufferSize. An appropriate buffer size is then chosen based on the row size. Keep in mind, though, that depending on the complexity of the data flow, and especially on the transformation components used, several execution trees can be created, and each execution tree has its own buffers.
When a lot of data flows through the pipeline, the buffers for all the execution trees are allocated, and you can run out of memory.
With a smaller amount of data coming from the source, it can happen that all the data is read from the source before the other parts of the data flow's execution trees finish, so the memory used by the source component's buffers can be reused later in the data flow and the package completes successfully.
As @Daniel Ross mentioned, proper tuning of DefaultBufferMaxRows and DefaultBufferSize can help you avoid this problem.
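To get a feel for the numbers involved, here is a rough back-of-the-envelope model of how many rows fit in one buffer. The 500-byte row size is a made-up figure for illustration, and this min() rule is a simplification of the engine's sizing logic (which also applies minimum and maximum buffer-size limits), not the exact algorithm:

```shell
# Simplified model of SSIS buffer sizing (assumption: rows per buffer is
# roughly the smaller of DefaultBufferMaxRows and how many rows fit in
# DefaultBufferSize bytes).
default_buffer_size=10485760     # DefaultBufferSize: 10 MB, the SSIS default
default_buffer_max_rows=10000    # DefaultBufferMaxRows default
estimated_row_size=500           # bytes per row -- hypothetical figure

rows_by_size=$(( default_buffer_size / estimated_row_size ))
if [ "$rows_by_size" -lt "$default_buffer_max_rows" ]; then
  rows_per_buffer=$rows_by_size
else
  rows_per_buffer=$default_buffer_max_rows
fi
echo "Rows per buffer: $rows_per_buffer"

# With ~10 million source rows, this many buffers must cycle through
# the pipeline over the run (each execution tree holding its own).
buffers_for_10m=$(( 10000000 / rows_per_buffer ))
echo "Buffers over the run: $buffers_for_10m"
```

The point is that each in-flight buffer is only around 10 MB; what hurts is how many of them the execution trees hold concurrently when memory is already scarce.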
You can find some useful information here: Integration Services: Performance Tuning Techniques.
answered Sep 25, 2011 at 11:18 PM
Thanks for the input. I tried to solve the problem by tweaking the DefaultBufferMaxRows and DefaultBufferSize values on the task, but it did not help.
I got a breakthrough when I changed the server memory configuration of the SQL Server instance the task was accessing. The minimum server memory was 6168 MB and the maximum was 7168 MB. After changing the minimum memory for the instance to 0 MB, the task completed successfully.
Since SSIS has its own buffer manager, which manages memory for the data flow pipeline engine and is separate from the SQL Server buffer manager, the package was not getting enough memory: 6 GB had been reserved specifically for the SQL Server instance. So I set the minimum server memory to 0 MB.
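For anyone who wants to make the same change from a script rather than SSMS, this is a sketch of the sp_configure route. It assumes sqlcmd is installed, you have sysadmin rights, and -S localhost -E (local server, Windows auth) are placeholders for your own connection; min server memory is an advanced option, so show advanced options must be enabled first. No test is attached because it needs a live instance.

```shell
# Lower SQL Server's minimum server memory to 0 MB so the instance can
# release memory back to the OS (and thus to the SSIS buffer manager).
sqlcmd -S localhost -E -Q "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; EXEC sp_configure 'min server memory (MB)', 0; RECONFIGURE;"
```

Note this only removes the floor below which SQL Server will not release memory; if SSIS and the engine compete on the same box, capping max server memory as well is the more common recommendation.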
answered Sep 26, 2011 at 12:27 AM