suraj asked

How to shrink the log file of a SQL database at run time

I am using the BCP utility to import a large text file (2 GB), but the log file grows to around 70 GB, while the mdf file is only around 10 GB.

I need to know whether I can shrink the log file during this BCP process.

Tags: t-sql, transaction-log, import-data

Håkan Winther answered

If your database is in the full recovery model, you need to perform a transaction log backup to free up the log file, but that will only reclaim space from completed transactions. If you run bcp without the -b switch, the whole file is imported in one transaction. To split the import into smaller transactions, specify the -b switch with a number to set a batch size.

If you are using the simple recovery model, you do not need to take any transaction log backups during the load.
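For reference, a minimal sketch of such a log backup, run (or scheduled) from a separate connection while batches commit; the database name and backup path here are hypothetical:

    BACKUP LOG MyDatabase
    TO DISK = N'D:\Backups\MyDatabase_log.trn';

Each backup frees log space used by transactions that have already committed, so it only helps once bcp starts committing batches.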

Quote from Books Online:

Specifies the number of rows per batch of imported data. Each batch is imported and logged as a separate transaction that imports the whole batch before being committed. By default, all the rows in the data file are imported as one batch. To distribute the rows among multiple batches, specify a batch_size that is smaller than the number of rows in the data file. If the transaction for any batch fails, only insertions from the current batch are rolled back. Batches already imported by committed transactions are unaffected by a later failure.
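For example, a bcp call that commits every 10,000 rows might look like this (the server, database, table, and file names are placeholders; -c imports character data and -T uses a trusted connection):

    bcp MyDatabase.dbo.MyTable in C:\data\bigfile.txt -c -T -S MyServer -b 10000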


Rob Farley answered

Everything that Håkan says is good. But you can also run the CHECKPOINT command during the import, which may give the system a chance to truncate the log more often, thereby keeping it smaller. You can put the CHECKPOINT command in a loop so that it runs repeatedly - in a separate connection, of course.

Edited: this will only make a difference on a system using the Simple recovery model.
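A minimal sketch of such a loop, assuming the Simple recovery model and run from a second connection (the 30-second delay is arbitrary; stop the loop once the import finishes):

    -- Run in a separate connection while the import is going on.
    WHILE 1 = 1
    BEGIN
        CHECKPOINT;
        WAITFOR DELAY '00:00:30'; -- pause between checkpoints
    END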


Grant Fritchey answered

It is unlikely that you'll be able to shrink the log during the import if the import runs as a single transaction. Until the transaction completes, it must be held in the log. If your log is growing that much, you might want to consider breaking the import into multiple steps so that you get more frequent commits and therefore smaller transactions.
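If you do want to check and reclaim log space once batches have committed (or after the load), something along these lines should work; the logical log file name is a placeholder you would look up with sp_helpfile:

    DBCC SQLPERF(LOGSPACE);                  -- show log size and percent used per database
    USE MyDatabase;
    DBCC SHRINKFILE (MyDatabase_log, 1024);  -- shrink the log to a 1024 MB target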


Jack Corbett answered

You could change the recovery model of the database to Bulk-Logged or Simple before the BCP, if you know nothing else will be going on that would require a point-in-time restore. Then, after the import, change back to FULL and take full and log backups.
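A sketch of that sequence, with hypothetical database and backup names:

    ALTER DATABASE MyDatabase SET RECOVERY BULK_LOGGED;
    -- ... run the bcp import here ...
    ALTER DATABASE MyDatabase SET RECOVERY FULL;
    BACKUP DATABASE MyDatabase TO DISK = N'D:\Backups\MyDatabase_full.bak';
    BACKUP LOG MyDatabase TO DISK = N'D:\Backups\MyDatabase_log.trn';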

I'd also add that the growth could be due to changes in indexes. You might also want to drop and re-create the indexes on the affected table(s) around the import.
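For example, with a hypothetical nonclustered index on the target table:

    DROP INDEX IX_MyTable_Col1 ON dbo.MyTable;
    -- ... run the bcp import here ...
    CREATE NONCLUSTERED INDEX IX_MyTable_Col1 ON dbo.MyTable (Col1);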

