
Disk De-fragmentation

Hi,

Is it possible to defragment the SAS drives on production servers where the SQL database files reside? Will SQL Server performance be better after doing this?

Some SQL gurus have suggested that I should do this, but said I would have to stop SQL Server while running the disk defragmentation; otherwise the Windows defragmenter will ignore the database files. Is that true?

What if the database needs to be online 24/7?

Can you suggest any free tools to defrag the disk, if it is worth doing?

Thanks.

asked Jul 18 '11 at 09:03 AM by Leo

Just for educational purposes, which SQL gurus are suggesting disk defrag as a performance enhancement? Can you post a link?
Jul 19 '11 at 01:33 AM Grant Fritchey ♦♦
@Grant - no link for you, but in environments where pre-sizing data files isn't done or shrinking is performed it can be beneficial (to correct issues that SHOULD NOT HAVE HAPPENED!!!!).
Jul 19 '11 at 06:36 AM Blackhawk-17

@Leo - "What if DB is need to be online 24/7?"

Then, if fragmentation is an issue, you are in for some pain.
Jul 19 '11 at 06:46 AM Blackhawk-17
Hi Grant... I hope you were at the Red Gate event in London last week. I attended Brad McGehee's session (Managing Data Growth), and he mentioned it there. By the way, I attended one of your sessions as well - it was excellent.
Jul 19 '11 at 08:51 AM Leo
@Blackhawk - I do believe that all indexes are properly rebuilt every night. However, I have never done a disk defrag where the SQL databases reside. I just want to see what happens to performance after a defrag... or maybe I shouldn't bother to tweak it.
Jul 19 '11 at 08:55 AM Leo

3 answers:
As far as free tools go, I have no answers. We use Perfect Disk, and have done for years. It has an open-files agent that works magic and doesn't require the database files to be detached. We set it to run once a night to shuffle any fragments together.

answered Jul 19 '11 at 06:53 AM by Fatherjack


Several products can defrag live files, such as mdf and ldf files.

Diskeeper has this ability, and I've read that Contig by Mark Russinovich can as well: http://technet.microsoft.com/en-us/sysinternals/bb897428. I'm surprised that the native tool wouldn't provide this too. A "live" defrag will impact DB performance and should not be run during times of heavy load or batch processing.
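
If you want to gauge how badly the data files are fragmented before touching anything, Contig can report fragment counts without moving any data. A minimal sketch, assuming Sysinternals contig.exe is on the server's PATH and xp_cmdshell has been enabled; the database file path is hypothetical, and you could just as easily run the same command from a command prompt on the server:

    -- contig -a only analyses fragmentation, it does not move anything.
    -- On first run you may be prompted to accept the Sysinternals EULA.
    EXEC master..xp_cmdshell 'contig -a "D:\SQLData\SalesDb.mdf"';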

Once your mdf and ldf files are defragged and properly sized, they should not require further defrag maintenance unless you allow auto grow to get out of hand through improper database sizing or inappropriate auto grow increments.
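
Pre-sizing the files and setting sensible growth increments is plain ALTER DATABASE work. A minimal sketch, with a hypothetical database name, logical file names, sizes, and growth increments - adjust them to your environment:

    USE master;
    GO
    -- Grow the data and log files to their expected working size up front,
    -- and use fixed-size growth increments rather than tiny or percentage-based ones.
    ALTER DATABASE SalesDb
        MODIFY FILE (NAME = SalesDb_Data, SIZE = 20GB, FILEGROWTH = 1GB);
    ALTER DATABASE SalesDb
        MODIFY FILE (NAME = SalesDb_Log, SIZE = 4GB, FILEGROWTH = 512MB);
    GO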

It's possible you could see some performance benefits from a defrag, but it's not the first place I would look and I wouldn't expect an order of magnitude performance increase like you can often realize through proper index creation and maintenance.
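
For comparison, this is the sort of index maintenance that usually pays off far more than a disk defrag. A rough T-SQL sketch of checking logical fragmentation and rebuilding a bad index; the database, table, and index names are hypothetical, and the thresholds are common rules of thumb rather than hard limits:

    USE SalesDb;  -- hypothetical database
    GO
    -- Find indexes with noticeable logical fragmentation.
    SELECT OBJECT_NAME(ips.object_id) AS table_name,
           i.name                     AS index_name,
           ips.avg_fragmentation_in_percent,
           ips.page_count
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
    JOIN sys.indexes AS i
        ON i.object_id = ips.object_id AND i.index_id = ips.index_id
    WHERE ips.avg_fragmentation_in_percent > 30
      AND ips.page_count > 1000
    ORDER BY ips.avg_fragmentation_in_percent DESC;

    -- Rebuild a heavily fragmented index (use REORGANIZE for lighter fragmentation).
    ALTER INDEX IX_Orders_OrderDate ON dbo.Orders REBUILD;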

answered Jul 18 '11 at 07:39 PM by KenJ

Hi Guys...

Thanks for the information. My production servers are fine and they are properly indexed every day, but I am also thinking about something more I can do... something like a disk defrag. Diskeeper is a great tool, but I don't think it is free.
Jul 19 '11 at 01:34 AM Leo
I missed the "free" part. Fortunately, Contig is free, so 50% of the short list works :)
Jul 19 '11 at 05:07 AM KenJ

Completely concur on the performance aspect.

I've heard similar "bad feeling" corruption concerns about database snapshots and DBCC - what if Windows breaks the copy-on-write? What if Windows makes a mistake when DBCC creates/uses a hidden alternate filestream? "We all know how reliable Windows is *wink*"

SQL Server snapshots and these two defrag tools use Microsoft APIs to work with the files. If we can't trust the NTFS APIs to reliably implement these features, we probably shouldn't have our databases on a Windows server.

There's always a risk that a file could become corrupt for any reason, or no apparent reason at all. I've seen so many more database problems due to human error with configuration and maintenance than with file corruption that I'll take my chances with defrag related corruption any day.

As long as we are mitigating both human and corruption risks with regular DBCC checks and database backups (with test restores), fixing the underlying cause of a heavily fragmented database file and doing a one-time defrag on it can be a reasonable step to take. A quick sketch of those routine checks follows below.
Jul 19 '11 at 07:54 AM KenJ
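A minimal T-SQL sketch of those routine checks; the database name and backup path are hypothetical, and RESTORE VERIFYONLY is only a quick sanity check, not a substitute for an actual test restore:

    -- Consistency check (run regularly, not just before a defrag).
    DBCC CHECKDB (SalesDb) WITH NO_INFOMSGS, ALL_ERRORMSGS;

    -- Backup with page checksums, then verify the backup file is readable.
    BACKUP DATABASE SalesDb
        TO DISK = N'D:\Backups\SalesDb.bak'
        WITH CHECKSUM, INIT;

    RESTORE VERIFYONLY
        FROM DISK = N'D:\Backups\SalesDb.bak'
        WITH CHECKSUM;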

The only way to defrag the database files on the physical disk would be to detach the database (if the files are on a dedicated disk), or you could shut down SQL Server instead. Defrag.exe (a free tool that comes with Windows) will not defrag an opened/locked file.
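
A minimal sketch of that detach / defrag / re-attach sequence; the database name and file paths are hypothetical - take a verified backup first and make sure nothing needs the database while it is offline:

    USE master;
    GO
    -- Kick out existing connections and detach so the files are closed.
    ALTER DATABASE SalesDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    EXEC sp_detach_db @dbname = N'SalesDb';
    GO
    -- ...run defrag.exe (or another defragmenter) against the drive while the files are closed...
    -- Then re-attach the files.
    CREATE DATABASE SalesDb
        ON (FILENAME = N'D:\SQLData\SalesDb.mdf'),
           (FILENAME = N'L:\SQLLogs\SalesDb_log.ldf')
        FOR ATTACH;
    GO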

This is a good article to read on it: http://www.bradmcgehee.com/2011/06/do-you-ever-physically-defragment-your-sql-server-mdf-ldf-files/

Depending on the activity of your database, you probably will not see a significant increase in performance just from defragging the physical disk. I'm not sure which "SQL Guru" told you this, but there are other things that you can do to improve performance on your database that will last longer and return better results than defragging the disk. You can defrag the disk, but the fragmentation will return eventually. IMHO.

answered Jul 18 '11 at 09:50 AM by Shawn_Melton

@meltondba - I assume you are making a big assumption :)

Environments where the defaults are used and lots of data is added can have extremely physically fragmented files. This of course flies in the face of best practices, where data files are pre-sized to avoid growing them (especially in small increments) and shrinking is avoided (which I assume you base your answer on). Add to this a drive with numerous data files all doing this, contending for fresh contiguous space, and you arrive at a scenario where it becomes your only sensible "fix".

SQL Server puts enough of a workout on the drive heads without throwing in extra seeks as file fragments must be discovered and accessed.

Having said all this... If the thought is put into the initial environment then, no, defragmentation will not produce great improvements in performance - but one size does not fit all.
Jul 19 '11 at 06:44 AM Blackhawk-17
I guess "it depends" would have been a better response :) An assumption for the most part. Environments I had worked in when we checked disk fragmentation our database files were not really fragmented everything else on the disk was so it did not benefit us much doing it all the time.
Jul 19 '11 at 07:29 AM Shawn_Melton
Willing to bet your data files were pre-sized and not using a 1MB growth increment. I have been to client environments with many multi-GB data files on the same full disks that started out as unaltered model clones. We found fragmentation :) (A quick way to check the current size and growth settings is sketched after this comment.)
Jul 19 '11 at 08:23 AM Blackhawk-17
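A minimal sketch of how to spot small or percentage-based growth increments across an instance, using sys.master_files (the arithmetic converts 8 KB pages to MB):

    -- Current size and growth setting for every data and log file on the instance.
    SELECT DB_NAME(mf.database_id)   AS database_name,
           mf.name                   AS logical_name,
           mf.physical_name,
           mf.size * 8 / 1024        AS size_mb,
           CASE WHEN mf.is_percent_growth = 1
                THEN CAST(mf.growth AS varchar(10)) + ' %'
                ELSE CAST(mf.growth * 8 / 1024 AS varchar(10)) + ' MB'
           END                       AS growth_setting
    FROM sys.master_files AS mf
    ORDER BY database_name, mf.file_id;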