question

ThomasRushton asked

Windows Azure Virtual Machine Sizes, temporary storage, and buffer pool extensions

Here's the scenario:

* We're building PowerShell scripts to automatically build Azure Virtual Machine server environments. Depending on the environment being built, we set up things like the machine name and instance size.
* The instance size controls the type (and size) of temporary storage attached (very temporary - the D: drive on an Azure VM is physically located close to the VM host, and if your VM is rebooted, there's no guarantee that you'll get it back). Ideal territory for TempDB and, with newer versions of SQL Server, Buffer Pool Extensions.
* Except that enabling BPE is only recommended if you're running on an InstanceSize from the D series, as that's where the D: drive is on SSDs rather than HDDs.

So, my question is... **How do we tell, from the VM side, what InstanceSize we're running?** Or whether we've got an SSD or an HDD? This would need to be done at boot time, to enable/disable BPE as appropriate, as the InstanceSize might change.

So far, the only thing that looks promising is the PhysicalClusterSize - on the D-series VM I've just spun up, that's different (4096) from those on the Basic & A-series machines (512). But I can't find any documentation confirming that this is standard.
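For what it's worth, here's a rough sketch of what a boot-time check based on that observation might look like. It leans entirely on the undocumented assumption above (4096-byte physical sectors on D-series temp disks, 512 on HDD-backed sizes), and the BPE file path and size are placeholders:

```powershell
# Sketch only - assumes a 4096-byte physical sector indicates an SSD-backed
# temp disk (undocumented). BPE path and size below are placeholders.
$ntfsInfo = fsutil fsinfo ntfsinfo D:

# Pull the "Bytes Per Physical Sector" value out of the fsutil output.
$physicalSector = 0
if (($ntfsInfo -join "`n") -match 'Bytes Per Physical Sector\s*:\s*(\d+)') {
    $physicalSector = [int]$Matches[1]
}

if ($physicalSector -eq 4096) {
    # Looks like SSD-backed temp storage: enable BPE (SQL Server 2014+ syntax).
    sqlcmd -S . -Q "ALTER SERVER CONFIGURATION SET BUFFER POOL EXTENSION ON (FILENAME = 'D:\BPE\bpe.bpe', SIZE = 16 GB);"
}
else {
    sqlcmd -S . -Q "ALTER SERVER CONFIGURATION SET BUFFER POOL EXTENSION OFF;"
}
```

Run as a startup task, this would re-evaluate the disk every boot, which matters here because a resize (and therefore a different temp disk) only takes effect after a restart anyway.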
powershell · sql-server-2014 · azure · buffer-pool-extension
5 comments


Not sure if this would be a good answer, but hope it's a great comment... Based on this post - http://pipe2text.com/?page_id=2589 - you should be able to create a task scheduler task that runs at startup, connects to your Azure subscription and pulls the InstanceSize for your host using Get-AzureVM -name "your_host_name"
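A minimal sketch of that approach, using the classic (ASM) Azure PowerShell module current at the time - cloud service and VM names are placeholders, and it assumes the subscription connection (publish settings file or certificate) has already been configured on the VM for unattended use:

```powershell
# Sketch: classic (ASM) Azure module; names below are placeholders.
Import-Module Azure

$vm = Get-AzureVM -ServiceName "your_cloud_service" -Name "your_host_name"
$vm.InstanceSize   # e.g. "Standard_D2"

if ($vm.InstanceSize -like "Standard_D*") {
    # D-series: SSD-backed temp disk, so enabling BPE is reasonable here.
}
```

Wrapped in a task scheduler task that runs at startup, this gives an authoritative answer - at the cost of a round trip back into Azure.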
Yeah, was wondering about that. But was hoping to have a more direct answer (i.e. one that doesn't involve going back into Azure).
I haven't been managing large-scale automation like this. The person to ask is Joey D'Antoni. I know he's been neck deep in it.
@Grant - thanks. I'll hunt him down when I'm near a decent twitter client (blocked here at work...)

1 Answer

Joey_Dantoni answered
Metadata management is always a challenge in a fully automated environment like this. I think a step in the right direction might be to look at using PowerShell DSC for Azure. I can't think of a good way to get that information without going back into Azure, however, particularly the instance size; there isn't a lot of internal metadata exposed other than through PowerShell. One other place I might investigate is the telemetry data folder under C:\WindowsAzure - I haven't checked it, but I suspect they may be storing metadata in those files. If it follows a pattern, you can write some regular PowerShell to grab it.
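If anyone wants to poke at that idea, a speculative one-liner along these lines would do it - the `Standard_[A-Z]\d+` pattern is a guess at what an instance-size string might look like in those files, not something I've confirmed is there:

```powershell
# Speculative sketch: scan the Azure guest agent's files for anything that
# resembles an instance-size string. The pattern is a guess.
Get-ChildItem C:\WindowsAzure -Recurse -File -ErrorAction SilentlyContinue |
    Select-String -Pattern 'Standard_[A-Z]\d+' -List |
    Select-Object Path, Line
```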
3 comments


Thanks for the input, @Joey - I'll check it out and let you know.
C:\WindowsAzure - doesn't look helpful. (Thought I would try the easiest option first!) There's a lot of interesting logging information there, but nothing with that level of detail.
Currently thinking about a solution involving a combination of Azure Automation and the CustomScriptExtension.
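Something along these lines, perhaps - a sketch using the classic (ASM) cmdlets, where the container, script name, and cloud service/VM names are all placeholders. The script (e.g. one that checks the temp disk and toggles BPE) would be uploaded to blob storage and run inside the VM by the extension:

```powershell
# Sketch: classic (ASM) cmdlets; container, script, and VM names are placeholders.
Get-AzureVM -ServiceName "your_cloud_service" -Name "your_host_name" |
    Set-AzureVMCustomScriptExtension -ContainerName "scripts" `
        -FileName "Set-BufferPoolExtension.ps1" `
        -Run "Set-BufferPoolExtension.ps1" |
    Update-AzureVM
```

An Azure Automation runbook could trigger this after any resize, which sidesteps the "from inside the VM" problem entirely.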
