question

paws27284 asked

additional criteria results in longer processing time

I am running an SSIS package in which 16 data flow tasks reading from an Oracle DB complete in about an hour. I found one task that was returning 54M rows that we were reprocessing every day. This is the original query, running against SQL 2005 using the Native OLE DB\Microsoft OLE DB Provider for Oracle:

    SELECT COMPANY, EMPLOYEE, CHECK_ID, OBJ_ID, DEPARTMENT, RECORD_TYPE, CHECK_TYPE,
           DST_ACCT_UNIT, DST_ACCOUNT, DIST_COMPANY, DED_CODE, DIST_AMT, GL_DATE,
           JOB_CODE, POSITION
    FROM LAWSON.PRDISTRIB
    WHERE COMPANY = 30
      AND DED_CODE <> ' '
      AND DED_CODE IS NOT NULL

We are moving to a new set of servers, and I decided we did not need to reprocess history every time, so I added date criteria to narrow the results. The query now looks like this, running against SQL 2008 R2 (on a machine with a much faster processor) using the Native OLE DB\Oracle Provider for OLE DB:

    SELECT COMPANY, EMPLOYEE, CHECK_ID, OBJ_ID, DEPARTMENT, RECORD_TYPE, CHECK_TYPE,
           DST_ACCT_UNIT, DST_ACCOUNT, DIST_COMPANY, DED_CODE, DIST_AMT, GL_DATE,
           JOB_CODE, POSITION
    FROM LAWSON.PRDISTRIB
    WHERE COMPANY = 30
      AND GL_DATE > '2012-09-30 00:00:00'
      AND DED_CODE <> ' '
      AND DED_CODE IS NOT NULL

The package now takes over 3 hours to pull the data. Both are running in 32-bit mode. Would adding the date make that much of a difference? I thought it would be better to reduce the number of rows coming into the pipe.

Thoughts? Suggestions? Thanks!

Paws
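For what it's worth, one variation I have been wondering about (not tested yet, and the format string is only my assumption about how the literal reaches Oracle) is making the date conversion explicit instead of relying on implicit conversion of the string:

    SELECT COMPANY, EMPLOYEE, CHECK_ID, OBJ_ID, DEPARTMENT, RECORD_TYPE, CHECK_TYPE,
           DST_ACCT_UNIT, DST_ACCOUNT, DIST_COMPANY, DED_CODE, DIST_AMT, GL_DATE,
           JOB_CODE, POSITION
    FROM LAWSON.PRDISTRIB
    WHERE COMPANY = 30
      AND GL_DATE > TO_DATE('2012-09-30 00:00:00', 'YYYY-MM-DD HH24:MI:SS')  -- explicit conversion rather than a string comparison
      AND DED_CODE <> ' '
      AND DED_CODE IS NOT NULL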
sql-server-2005 · ssis · sql-server-2008-r2 · oracle · 32-bit
13 comments

Could be down to indexes. Can you post your query plan?
And check your statistics. But a query plan would be best.
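If the statistics on the Oracle side do turn out to be stale, something along these lines would refresh them (a sketch only; the owner and table names are taken from the query above):

    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname => 'LAWSON',     -- schema from the query
        tabname => 'PRDISTRIB',  -- table the data flow reads
        cascade => TRUE);        -- gather index statistics as well
    END;
    /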
Wait. Is that query getting data from Oracle or from SQL Server?
The query is getting data from Oracle. There are no indexes on the SQL side; I don't know about the Oracle side. Last night I ran the original query in the new environment and it took almost 6 hours. How do I look at a query plan if the query is hitting an Oracle database?
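(For reference, on the Oracle side a plan can be generated with EXPLAIN PLAN and displayed with DBMS_XPLAN, run directly against the source database; the abbreviated column list below is just for illustration:)

    EXPLAIN PLAN FOR
    SELECT COMPANY, EMPLOYEE, DED_CODE, DIST_AMT, GL_DATE
    FROM LAWSON.PRDISTRIB
    WHERE COMPANY = 30
      AND GL_DATE > TO_DATE('2012-09-30', 'YYYY-MM-DD')
      AND DED_CODE <> ' '
      AND DED_CODE IS NOT NULL;

    -- show the plan that was just generated
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);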
How many rows are returned with the date filter?
Around 26M
Half the rows, 3x the duration... sounds like the date column could use an index.
The date column on the SQL destination table needs an index? Also, in the old environment it was hitting Oracle 10g; the new environment is Oracle 11g.
Since you are limiting your results by date, every row has to be checked to see whether it passes that test. An index on the Oracle side would eliminate many rows en masse.
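Something along these lines on the Oracle side, as a sketch (the index name is made up; the column choice assumes the COMPANY = 30 and GL_DATE filters from the query):

    CREATE INDEX PRDISTRIB_CO_GLDATE_IX        -- hypothetical index name
      ON LAWSON.PRDISTRIB (COMPANY, GL_DATE);  -- supports the COMPANY + GL_DATE predicates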
OK, here's what I found: source = Oracle 11g, destination = SQL 2008 R2, using the Native OLE DB\Microsoft OLE DB Provider for Oracle. I tried the index and there is no difference. So I thought it had to do with the Native OLE DB\Oracle Provider for OLE DB (not anymore). However, let me add that I have other data flow tasks using the new driver and passing a date as criteria without any issues, but 600K is the largest result set of those.
Have you tried the Attunity connector? I've seen significant performance improvements when using that to get data from Oracle. It's also free! Check out http://msdn.microsoft.com/en-us/library/ee470675(v=SQL.100).aspx
Have not tried the Attunity connector yet, but here are some things I did that didn't help: on the destination database I set the recovery model to simple and allocated 100 GB for the transaction log, so autogrowth would hopefully not be necessary.
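For reference, the destination-side changes were roughly the following (the database and log file names are placeholders, not the real ones):

    ALTER DATABASE StagingDB SET RECOVERY SIMPLE;           -- simple recovery model for the bulk load
    ALTER DATABASE StagingDB
      MODIFY FILE (NAME = StagingDB_log, SIZE = 100GB);     -- pre-size the log so autogrowth isn't needed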
Tried the Attunity connector and it still ran for 3.5 hours. Tried source Oracle 11g to destination SQL 2005 and it still ran for 3.5 hours. My thought now is that maybe the source table has changed; going down that road now.

0 Answers
