First, a little background:
I had an insert script that inserts a few hundred thousand rows into a categories table. It was taking a couple of hours, so I stopped it because I thought it was taking too long. That crashed the DB, and it went into recovery mode. Our network admin restored from a backup to another DB, and I ran the categories script again; this time it ran fine and only took a few minutes. Now, on that same DB that ran the previous script fine, I have another script that inserts a few hundred thousand products. Again, it's running slow. I stopped it at 5 hours, but this time it didn't crash. I read somewhere to check the locks, and it was holding 34,000 of them. I don't really understand what that means, though.
So, I'm not a DB admin, and I have no idea why this is happening or what to look for. Can I get some ideas on what to check to figure out what the issue is?
HERE IS THE INSERT SCRIPT:
Lots of things could be impacting this. I will list them out in no particular order.
1) Indexes: when you perform inserts, updates, or deletes, the indexes on that table also have to be updated.
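One common workaround for that index overhead during a big one-time load is to disable the nonclustered indexes first and rebuild them afterwards, so the index is built once in bulk rather than maintained row by row. A sketch (the table and index names here are made up for illustration, not from your script):

```sql
-- Hypothetical names: dbo.Products table with a nonclustered index.
-- Disabling pauses index maintenance during the bulk insert.
ALTER INDEX IX_Products_CategoryId ON dbo.Products DISABLE;

-- ... run the big INSERT here ...

-- REBUILD re-enables the index and builds it in one pass.
ALTER INDEX IX_Products_CategoryId ON dbo.Products REBUILD;
```

Note: don't disable the clustered index this way, since that makes the whole table inaccessible until it's rebuilt.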
FYI, killing a job midway will cause a rollback. If you have a single transaction inserting 500,000 rows and you kill it at row 300,000, it will have to roll back the insert of those 300,000 records, which can take as long as (or longer than) the insert itself.
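To avoid that giant single rollback, you can commit in batches so that stopping the job only loses the current batch. A rough sketch, assuming your products are staged in a table (`dbo.ProductsStaging`, `dbo.Products`, and the column names are all assumptions for illustration):

```sql
-- Insert in batches of 10,000; killing the job only rolls back one batch.
WHILE 1 = 1
BEGIN
    BEGIN TRANSACTION;

    -- Copy the next batch of rows that aren't in the target table yet.
    INSERT INTO dbo.Products (ProductId, Name, CategoryId)
    SELECT TOP (10000) s.ProductId, s.Name, s.CategoryId
    FROM dbo.ProductsStaging AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.Products AS p
                      WHERE p.ProductId = s.ProductId);

    -- Stop once there is nothing left to copy.
    IF @@ROWCOUNT = 0
    BEGIN
        COMMIT TRANSACTION;
        BREAK;
    END;

    COMMIT TRANSACTION;
END;
```

Smaller transactions also keep the lock count per transaction down, which may be related to the 34,000 locks you saw.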
answered Nov 14 '12 at 04:30 PM
Hi... you can check in Activity Monitor (for SQL Server 2008, right-click the instance name in SSMS and select that option). Sort the DB name column and check which processes are running against the database you executed the insert script on. If there are simultaneous SELECTs or UPDATEs going on in that DB, they might be blocking your inserts. The best time to insert bulk data is during server idle time, when very few other processes/queries are running against the DB.
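If you'd rather check from a query window than Activity Monitor, the standard DMV for locks is `sys.dm_tran_locks`, and `sp_who2` shows blocking. A quick sketch to see who is holding all those locks on the current database:

```sql
-- Count locks per session on the current database.
SELECT request_session_id,
       COUNT(*) AS lock_count
FROM sys.dm_tran_locks
WHERE resource_database_id = DB_ID()
GROUP BY request_session_id
ORDER BY lock_count DESC;

-- Shows each session's status and, in the BlkBy column, who is blocking it.
EXEC sp_who2;
```

If your insert's session ID shows up in another session's BlkBy column, that other session is what's holding your insert up.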
answered Nov 14 '12 at 04:25 PM