
Tim asked

Backup files - limit on number of files?

Has anyone ever run into an issue where retaining too many backup files in a Windows folder caused performance problems? I have a vendor telling me that I shouldn't have more than 1,000 files in my backup folder. That is a hard number to stay under when I have hundreds of databases on a SQL instance. I am wondering how creative anyone has had to get with writing their backup files to multiple subfolders. I know it would be easy to create the folder structure and dynamically set my backup path based on database name and such, but I can see how that would be a maintenance nightmare: each time you create a new database you would have to create that subfolder. I am curious whether anyone else has experienced such a problem and how you overcame it.
Tags: backups, dumb vendors, file retention

Tim commented:
It turned out the issue was a limitation (or at least a recommendation) from the vendor of this backup de-duplication appliance. They recommend no more than 1,000 files and 1,000 subfolders per CIFS share, and we had exceeded the file limit. After purging a lot of backups, the issue cleared up. You can read about it here: [timradney.com](http://timradney.com/2013/02/08/how-to-break-a-high-end-de-duplication-backup-device/)
ThomasRushton answered
I've only seen problems when the number of files in an NTFS folder gets over about 70k. The performance of the directory gradually slows down - it won't grind to a halt, but it'll feel like it. I have seen directories with a quarter of a million files in - very ugly!

Pavel Pawlowski answered
I've never run into such issues, but I have a scenario in which I automatically back up all databases on the instance into separate folders. I use a PowerShell script for the backups which goes through all the databases to be backed up and checks whether the destination folder exists. If not, it automatically creates it, then backs up the database into that folder. It's easy to write such a script, and you do not need to take care of the folder structure creation, as it is created automatically during the backup process.
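For illustration, here's a minimal sketch of that approach using the SqlServer module's `Backup-SqlDatabase` cmdlet. The instance name and backup root are placeholders, and Pavel's actual script may well differ:

```powershell
# Minimal sketch of per-database backup folders - assumes the SqlServer
# module is installed; instance name and root path are placeholders.
Import-Module SqlServer

$instance   = 'MYSERVER\MYINSTANCE'   # hypothetical instance name
$backupRoot = 'D:\Backup'             # hypothetical backup root

$server = New-Object Microsoft.SqlServer.Management.Smo.Server($instance)

foreach ($db in $server.Databases | Where-Object { -not $_.IsSystemObject }) {
    # Create the per-database folder if it doesn't exist yet
    $folder = Join-Path $backupRoot $db.Name
    if (-not (Test-Path $folder)) {
        New-Item -ItemType Directory -Path $folder | Out-Null
    }

    # Back up the database into its own folder
    $stamp = Get-Date -Format 'yyyyMMdd_HHmmss'
    Backup-SqlDatabase -ServerInstance $instance -Database $db.Name `
        -BackupFile (Join-Path $folder "$($db.Name)_$stamp.bak")
}
```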

WilliamD answered
Tim, I have seen issues in the past where the folder structure was too deep and that caused performance problems. Single files in a folder have been fine in the tens of thousands for me (YMMV). The way that I fix this is to use the backup script from [Ola Hallengren][1] - it creates backups with a structure like `SERVERNAME\DATABASENAME\BACKUPTYPE\Server_database_type_date_time.bak`. It also creates folders as needed without further intervention. You can also decide upon retention, so old backups are cleaned up too.

[1]: http://ola.hallengren.com
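As a rough illustration, a full backup with Ola's `DatabaseBackup` procedure could be kicked off from PowerShell along these lines (server name, directory, and retention hours are placeholder values; see ola.hallengren.com for the documented parameters):

```powershell
# Hypothetical invocation of Ola Hallengren's DatabaseBackup procedure;
# server name, directory, and retention hours are placeholder values.
Invoke-Sqlcmd -ServerInstance 'MYSERVER' -Database 'master' -Query @"
EXECUTE dbo.DatabaseBackup
    @Databases   = 'USER_DATABASES',
    @Directory   = N'D:\Backup',
    @BackupType  = 'FULL',
    @Verify      = 'Y',
    @CleanupTime = 24;
"@
```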

Grant Fritchey answered
The only time I've seen anything like this was when we had a few thousand files and they all had very similar names. In fact the first 8-10 characters were identical and then large batches of them had similar characters for the next 8-10. I guess there must be some sort of index on the names and our selectivity was atrocious. It really hurt performance. Sounds similar to what @ThomasRushton was describing but with fewer files.
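One possible explanation for similar long names hurting like that (an assumption on my part, not something confirmed in this thread) is NTFS 8.3 short-name generation, which degrades when many long names share a prefix. A quick way to check it per volume, from an elevated PowerShell prompt (`C:` is a placeholder drive letter):

```powershell
# Check whether 8.3 short-name generation is enabled on a volume
fsutil 8dot3name query C:

# Disable it for that volume if your applications don't need short names
fsutil 8dot3name set C: 1
```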

ThomasRushton commented:
That's interesting. The files in my case were all named following a YYdddXXXXnnnnnn naming scheme (not my own, I hasten to add), and had all been created over a six-year period. Those bits:

- YY - last two digits of the year
- ddd - three-digit day number within the year
- XXXX - four-character code specific to the server (DIP1, if you must know...)
- nnnnnn - six-digit number looping independently of the YYddd combo

Generally it worked OK, apart from the YY bit being a pain - a four-digit year code would have been better, obviously, given the go-live date of that system was 1998, and it was still being heavily used 8 years later... The million-docs-a-day theoretical limit wasn't an issue - anyone pulling that number of documents a day would have more than one server! Sorry, irrelevant mental meandering. As you were.
WilliamD answered
A good way to test would be to try creating a folder with lots of files in it! PowerShell to the rescue:

```powershell
# Create 100,000 empty files in the current folder
for ($i = 0; $i -lt 100000; $i++) {
    New-Item -ItemType File -Path "testfile$i.txt" | Out-Null
}
```

That'll create 100,000 files in the folder that you run the command in. I tried it and managed fine, but my laptop has an SSD.
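To actually see the slowdown rather than just create the files, you could time a directory listing once the folder is full, for example:

```powershell
# Time a full enumeration of the current folder
Measure-Command { Get-ChildItem | Out-Null } | Select-Object TotalMilliseconds
```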

Tim commented:
Show-off indeed. After doing a bit of testing overnight and this morning, the file limit does appear to be my issue. I will need to incorporate a different folder structure per database for transaction log backups.
ThomasRushton commented:
"my laptop has an SSD". Show-off. ;-)
Blackhawk-17 answered
It depends... and yes, I'm late to the party... but:

- Different versions of the OS and their service packs perform differently with heavily populated folders.
- Indexing, or the lack of it, on the filesystem will impact it.
- Physical fragmentation on the drive will make it worse.
- The underlying block size of the filesystem will impact it.

I've had systems where 2K worth of files made a folder an anchor. As for the depth of the folder structure... Windows used to have a maximum path length of 260 characters, I think, but I don't know if that still stands on the newer OSes.
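If the path-length limit is a concern, here's a quick sketch to flag backup files whose full path approaches it (`D:\Backup` is a placeholder root):

```powershell
# Flag files whose full path approaches the classic 260-character MAX_PATH
Get-ChildItem -Path 'D:\Backup' -Recurse -File |
    Where-Object { $_.FullName.Length -gt 240 } |
    Select-Object FullName, @{ Name = 'PathLength'; Expression = { $_.FullName.Length } }
```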
