pradyumin asked

SQL Server 2012: quickly INSERT millions of rows from SELECT

Apologies if I am annoying the forum with a repetitive question; I couldn't find the right solution here, hence posting it.

I need to fetch 129,991,763 rows into a cursor, a temp table, or a staging table quickly and process them into another table, and that destination table is also huge. Currently I am using an INSERT ... SELECT statement (the SELECT is nested 4 levels deep) with hints like OPTION (FAST 1000), MAXDOP 1, RECOMPILE, etc. The procedure consumes a lot of time and either shows no results or never completes. Previously I used a cursor with the same hints, but as that was also running for more than 22 hours, I switched to INSERT ... SELECT. I literally have to stop the execution with both methods. To be honest, I am a beginner with SQL Server.

Even if I specifically filter the records in the SELECT based on criteria, the process still needs to be broken into 4 or 5 chunks, and each of those chunks also takes more than 4-5 hours to complete.

Please help. Thanks, Pradyumna
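For reference, the pattern described above looks roughly like the following sketch. All table and column names here are made up for illustration; the real query nests the SELECT 4 levels deep.

```sql
-- Hypothetical sketch of the hinted INSERT ... SELECT described above.
-- dbo.StagingTable / dbo.Orders are illustrative names, not the real schema.
INSERT INTO dbo.StagingTable (OrderID, Amount)
SELECT src.OrderID, src.Amount
FROM (
    SELECT o.OrderID, o.Amount
    FROM dbo.Orders AS o          -- in the real query this subquery is nested 4 levels
) AS src
OPTION (FAST 1000, MAXDOP 1, RECOMPILE);
```

Note that FAST 1000 only optimizes for returning the first 1000 rows quickly, and MAXDOP 1 forces a serial plan, so neither hint is an obvious win for a 129-million-row bulk copy.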
sql-server-2012 · plsql

ThomasRushton ♦♦ commented:
I'm confused: you've tagged your query as plsql and sql-server-2012. Also, I'm not sure why you're using the query hints you've indicated. What did you expect to gain from them, rather than using SQL Server's own built-in knowledge of your data?
Kev Riley answered
There are two things to consider here: the read of the data (the SELECT) and the write of the data (the INSERT). It would be useful to know which part is running slowly (it could be both), as that gives you a starting point. You've made a good start, as fetching 129 million rows with a cursor will always take a LOT longer than a set-based approach.

To know whether it's the insert or the select, look at the execution plan (can you post it here as a .sqlplan file?). This will also help you determine how to start speeding things up.

If it's the read, then you'd be looking at tuning the SELECT query: are there better indexes that could be used, are the statistics up to date, and is the optimizer getting a good plan? (You mention it is nested 4 levels deep; sometimes the optimizer cannot simplify complex queries enough.) Or you may just be bound by the sheer data size: reading 129 million rows can take time. How big are the rows? Do you need all the columns? Is the read limited by the underlying disk subsystem and hardware? Are you trying to read from the same disk that you are also writing to (including log files)?

For the write, we've already mentioned checking whether the destination table is on the same disk as the read. What about the log file for that database? If you are using temp tables, is tempdb on its own disk? How many tempdb files are there? Are the tempdb log files on the same disks again? If it's a staging table, can it be put into a separate database or filegroup?

How long do you want this to take? How long can you wait?

Welcome to the world of SQL Server; everything you learn here will help you countless times in the future!
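One common set-based alternative to a cursor is to copy the rows in keyed batches, which keeps each transaction (and therefore log growth) small. A minimal sketch, assuming the source table has an ascending integer key column; all names (dbo.SourceTable, dbo.DestTable, ID, Col1) are illustrative:

```sql
-- Hedged sketch: batched INSERT ... SELECT keyed on an integer column.
-- Table/column names are hypothetical, not from the original post.
DECLARE @BatchSize int    = 1000000,
        @MaxID     bigint,
        @StartID   bigint = 0;

SELECT @MaxID = MAX(ID) FROM dbo.SourceTable;

WHILE @StartID < @MaxID
BEGIN
    INSERT INTO dbo.DestTable (ID, Col1)
    SELECT ID, Col1
    FROM dbo.SourceTable
    WHERE ID >  @StartID
      AND ID <= @StartID + @BatchSize;   -- one small range per iteration

    SET @StartID = @StartID + @BatchSize;
END;
```

Each iteration commits on its own, so a failure partway through loses only the current batch, and the transaction log can be reused between batches (in SIMPLE recovery, or with frequent log backups in FULL).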

pradyumin answered
The below is not an answer; I am not able to post it as a comment (the webpage is not uploading/updating comments).

Experts, first of all, apologies for being late in updating the status and responding to comments. I tackled the challenge by dividing the INSERT operation into 4 chunks, selecting the required data according to 4 different requirements. For 2 of the chunks it is a straightforward direct INSERT INTO ... SELECT; for the other 2 chunks I had to DECLARE a CURSOR or use temporary tables. Testing is currently in progress. So far the 2 chunks with the direct INSERT are behaving normally. Further data verification and validation will be done, and I will need to review the results of the CURSOR and temp-table process repeatedly for better results.

Kev Riley, your suggestions helped a lot. We've increased the RAM, and a database maintenance reboot is now scheduled. TempDB contains the 6 temporary tables, and those tempdb files need to be cleaned up; the templog file is also located in TempDB itself. The database has a separate log drive, DB_Logs. The other drives (Data, Backup, MasterDB) are configured locally on the database server machine itself.
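To answer Kev's questions about the tempdb layout concretely, the file count, locations, and sizes can be read from the sys.master_files catalog view. This is a standard query, shown here as a sketch:

```sql
-- Inspect how many tempdb data/log files exist and where they live,
-- to see whether they share a disk with user data or log files.
SELECT name,
       type_desc,                    -- ROWS (data file) or LOG
       physical_name,                -- the drive/path shows disk sharing
       size * 8 / 1024 AS size_mb    -- size is stored in 8 KB pages
FROM sys.master_files
WHERE database_id = DB_ID('tempdb');
```

The physical_name column makes it easy to confirm whether the templog file really sits on the same drive as the tempdb data files, which was one of the points raised above.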
