
Leo asked:

Disk De-fragmentation

Hi, is it possible to defrag the SAS drives on production servers where the SQL database files reside? Will SQL performance be better for doing this? SQL gurus have suggested that I should do it, but said I have to stop SQL Server while the disk defragmentation runs, because otherwise the Windows defragmenter will ignore the database files. Is that true? What if the DB needs to be online 24/7? Can you suggest any free tools to defrag the disk, if it is a good thing to do? Thanks.
Tags: sql-server-2008, sql, performance, server

Grant Fritchey commented:
Just for educational purposes, which SQL gurus are suggesting disk defrag as a performance enhancement? Can you post a link?

Blackhawk-17 commented:
@Grant - no link for you, but in environments where pre-sizing data files isn't done or shrinking is performed it can be beneficial (to correct issues that SHOULD NOT HAVE HAPPENED!!!!).

Blackhawk-17 commented:
@Leo - "What if DB is need to be online 24/7?" Then, if fragmentation is an issue, you are in for some pain.

Leo commented:
Hi Grant... I hope you were at the Red Gate event in London last week. I attended Brad McGehee's session (Managing Data Growth), and he mentioned it there. By the way, I attended one of your sessions as well. That was excellent.

Leo commented:
@Blackhawk - I do believe that all indexes are properly rebuilt every night. However, I have never done a disk defrag where the SQL DBs reside. I just want to see what happens to performance after a defrag... or maybe I shouldn't bother to tweak it.

Fatherjack answered:
As far as free tools go, I have no answers. We use PerfectDisk, and have done for years. It has an open-file agent that works magic and doesn't require the database files to be detached. We set it to run once a night to shuffle any fragments together.

Shawn_Melton answered:
The only way to defrag the database files on the physical disk is to detach the database (if the files are on a dedicated disk), or to shut down SQL Server. Defrag.exe (a free tool that comes with Windows) will not defrag an open/locked file. This is a good article to read on it: http://www.bradmcgehee.com/2011/06/do-you-ever-physically-defragment-your-sql-server-mdf-ldf-files/

Depending on the activity of your database, you probably will not see a significant increase in performance just by defragging the physical disk. I'm not sure which "SQL guru" told you this, but there are other things you can do to improve performance on your database that will last longer and return better results than defragging the disk. You can defrag the disk, but the fragmentation will return eventually. IMHO.
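
To make that concrete, here is a rough sketch of the detach / defrag / reattach sequence. The database name and file paths are placeholders, and the exact defrag command and switches depend on your Windows version and the tool you use:

```sql
-- Hypothetical example: database name and file paths are placeholders.
USE master;
GO

-- 1. Detach the database so SQL Server releases its lock on the files.
EXEC sp_detach_db @dbname = N'SalesDB', @skipchecks = 'true';
GO

-- 2. At an elevated command prompt, defragment the volume (or the
--    individual files with a file-level tool), for example:
--       defrag D: /U /V
--    Check "defrag /?" because the switches vary by Windows version.

-- 3. Reattach the database once the defrag pass has finished.
CREATE DATABASE SalesDB
    ON (FILENAME = N'D:\SQLData\SalesDB.mdf'),
       (FILENAME = N'L:\SQLLogs\SalesDB_log.ldf')
    FOR ATTACH;
GO
```

Taking the database OFFLINE with ALTER DATABASE instead of detaching it also releases the file locks and keeps the database's entry on the instance.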

Blackhawk-17 commented:
@meltondba - I assume you are making a big assumption :) Environments where the defaults are used and lots of data is added can have extremely physically fragmented files. This of course flies in the face of best practices, where data files are pre-sized to avoid growing them (especially in small increments) and shrinking is avoided (which I assume you base your answer on). Add to this a drive with numerous data files all doing this, contending for fresh contiguous space, and you arrive at a scenario where defragmentation becomes your only sensible "fix". SQL Server puts enough of a workout on the drive heads without throwing in extra seeks as file fragments must be found and accessed. Having said all this... if thought is put into the initial environment then, no, defragmentation will not produce great improvements in performance - but one size does not fit all.
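
As a hedged illustration of the pre-sizing point (the database name, logical file names, sizes and increments below are made up), this is roughly what sizing the files up front and using a fixed growth increment looks like:

```sql
-- Hypothetical example: names, sizes and growth increments are placeholders.
-- Pre-size the data and log files and use a fixed growth increment so the
-- files rarely need to grow (and fragment) on their own.
ALTER DATABASE SalesDB
    MODIFY FILE (NAME = N'SalesDB', SIZE = 20GB, FILEGROWTH = 1GB);

ALTER DATABASE SalesDB
    MODIFY FILE (NAME = N'SalesDB_log', SIZE = 4GB, FILEGROWTH = 512MB);
```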

Shawn_Melton commented:
I guess "it depends" would have been a better response :) An assumption for the most part. Environments I had worked in when we checked disk fragmentation our database files were not really fragmented everything else on the disk was so it did not benefit us much doing it all the time.

Blackhawk-17 commented:
Willing to bet your data files were pre-sized and not using a 1MB growth increment. I have been to Client environments with many multi-GB data files on the same, full, disks that started out as unaltered Model clones. We found fragmentation :)

KenJ answered:
Several products can defrag live files, such as mdf and ldf files. Diskeeper has this ability, and I've read that Contig by Mark Russinovich can as well: http://technet.microsoft.com/en-us/sysinternals/bb897428. I'm surprised that the native tool wouldn't provide this too. A "live" defrag will impact DB performance and should not be run during times of heavy load or batch processing.

Once your mdf and ldf files are defragged and properly sized, they should not require further defrag maintenance unless you allow autogrow to get out of hand through improper database sizing or inappropriate autogrow increments. It's possible you could see some performance benefit from a defrag, but it's not the first place I would look, and I wouldn't expect an order-of-magnitude performance increase like you can often realize through proper index creation and maintenance.
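
On the sizing/autogrow side, a quick way to spot files that are still on small or percentage-based growth settings is a query along these lines (assuming you have permission to see the metadata in sys.master_files; the arithmetic converts 8 KB pages to MB):

```sql
-- List each database file's current size and growth setting so that
-- files still using small or percentage-based autogrow stand out.
SELECT  DB_NAME(mf.database_id)            AS database_name,
        mf.name                            AS logical_name,
        mf.physical_name,
        mf.size * 8 / 1024                 AS size_mb,
        CASE WHEN mf.is_percent_growth = 1
             THEN CAST(mf.growth AS varchar(10)) + ' %'
             ELSE CAST(mf.growth * 8 / 1024 AS varchar(10)) + ' MB'
        END                                AS growth_setting
FROM    sys.master_files AS mf
ORDER BY database_name, mf.file_id;
```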

KenJ commented:
Completely concur on the performance aspect. I've heard similar "bad feeling" corruption concerns about database snapshots and DBCC - what if Windows breaks the copy-on-write? What if Windows makes a mistake when DBCC creates/uses a hidden alternate file stream? "We all know how reliable Windows is `*wink*`" SQL Server snapshots and these two defrag tools use Microsoft APIs to work with the files. If we can't trust the NTFS APIs to reliably implement these features, we probably shouldn't have our databases on a Windows server.

There's always a risk that a file could become corrupt for any reason, or for no apparent reason at all. I've seen so many more database problems caused by human error in configuration and maintenance than by file corruption that I'll take my chances with defrag-related corruption any day. As long as we are mitigating both human and corruption risks with regular DBCC checks and database backups (with test restores), fixing the underlying cause of a heavily fragmented database file and doing a one-time defrag on it can be a reasonable step to take.
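
A minimal sketch of that mitigation, with a made-up database name and backup path:

```sql
-- Hypothetical names and paths. Regular consistency checks...
DBCC CHECKDB (N'SalesDB') WITH NO_INFOMSGS, ALL_ERRORMSGS;

-- ...plus backups taken with checksums...
BACKUP DATABASE SalesDB
    TO DISK = N'B:\Backups\SalesDB_full.bak'
    WITH CHECKSUM, INIT;

-- ...and periodic verification (ideally a full test restore elsewhere).
RESTORE VERIFYONLY
    FROM DISK = N'B:\Backups\SalesDB_full.bak'
    WITH CHECKSUM;
```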

Leo commented:
Hi guys... Thanks for the information. My production servers are fine and they are properly indexed every day. But I am also thinking about whether there is something more I can do... something like a disk defrag. Diskeeper is a great tool, but I don't think it is a free tool.

KenJ commented:
I missed the "free" part. Fortunately, Contig is free, so 50% of the short list works :)
