Building an Automated Tiered Storage Lab with Windows Server 2012 R2

New R2 Release adds tiered storage that combines performance benefits of SSD with low cost of HDD

I’ve been speaking with lots of IT Pros over the past several weeks about the new storage improvements in Windows Server 2012 R2, and one feature in particular has gained a ton of attention: automated Storage Tiers as part of the new Storage Spaces feature set.

Automated Storage Tiers in Windows Server 2012 R2

However, not all of us have multiple SSD and HDD disks in our demo or lab gear, so it’s sometimes challenging to evaluate and demo this new feature.  In this article, I’ll step through the process of leveraging Virtual Hard Disks ( VHDs ) to simulate SSD and HDD tiers for the purpose of evaluating and demonstrating the functionality of Storage Tiers in a Windows Server 2012 R2 lab environment.

What are “Storage Tiers”?
Storage Tiers combine the best performance attributes of solid-state drives (SSDs) with the best cost-per-capacity attributes of hard disk drives (HDDs) by providing the ability to create individual storage LUNs ( called “Virtual Disks” in Windows Server 2012 R2 – not to be confused with Hyper-V “Virtual Hard Disks” or VHDs ) that span SSD and HDD “tiers”.  Once a tiered LUN is created, Windows Server 2012 R2 analyzes disk IO requests on-the-fly, keeping the most frequently accessed data blocks on speedy SSDs while moving less frequently accessed blocks to HDDs – all transparently to applications and users.

You can learn more about Storage Improvements in Windows Server 2012 R2 in Chapter 3 of our latest free eBook, Introducing Windows Server 2012 R2.

The Net Result?
Most organizations tend to frequently access only a small subset ( generally 15% – 20% ) of their overall data storage.  Storage Tiers provide an ideal solution: all data stays online and accessible, the most frequently accessed data gets super-quick access, and administrators no longer need to manually move files between separate tiers of storage.  When using Storage Tiers, you only need to invest in enough SSD storage to hold your most frequently accessed data blocks, which dramatically reduces costs compared with buying large amounts of SSDs.  In fact, the cost-performance ratio of tiered Storage Spaces also appears to obviate the need for expensive 15K SAS hard disks – by pairing a small amount of SSD for performance-driven needs with less-expensive high-capacity 7.2K hard disks for capacity-driven needs, I’m not seeing a practical need for more-expensive 15K disks – further helping to reduce costs.

But, I don’t have lots of SSDs for my Storage Tiers lab!
For production environments, you’ll definitely want to invest in SSDs and HDDs to gain the cost-performance benefits of Storage Tiers. But, when learning, evaluating and demonstrating Storage Tiers in a lab environment, you don’t necessarily need all these physical disks to just show Storage Tiers functionality.  Instead, with a Hyper-V VM, a set of Virtual Hard Disks and a bit of PowerShell “magic”, you can create a Storage Tiers demo lab environment.

Here are the steps to begin building a Storage Tiers demo lab without dedicated physical SSDs and HDDs …

  1. Download Windows Server 2012 R2 installation bits

    Be sure to download the VHD distribution for the easiest provisioning as a virtual machine.
  2. In Hyper-V Manager, use the VHD downloaded in Step 1 above as the operating system disk to spin-up a new VM on a Hyper-V host.
  3. Once the new VM is provisioned, use Hyper-V Manager to modify the VM settings and hot-add 6 new virtual SCSI hard disks, as follows:

    - Add three 250GB Dynamic VHDs ( these will be our simulated SSDs )
    - Add three 500GB Dynamic VHDs ( these will be our simulated 7.2K HDDs )

    When completed, the settings of your VM should resemble the following:

    Hyper-V Manager: Hot-adding virtual SCSI hard disks
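
If you’d rather script Steps 2 and 3 than click through Hyper-V Manager, the hot-add can be sketched in PowerShell on the Hyper-V host.  This is a rough sketch, not the article’s official method – the VM name and VHD folder below are assumptions you’ll need to adjust for your own lab:

```powershell
# Run on the Hyper-V host ( as Administrator ).
# $vmName and $vhdPath are example values -- substitute your own.
$vmName  = "WS2012R2-Tiering"
$vhdPath = "D:\Hyper-V\Virtual Hard Disks"

# Create and hot-add three 250GB dynamic VHDs ( simulated SSDs )
1..3 | ForEach-Object {
    $vhd = Join-Path $vhdPath "SimSSD$_.vhdx"
    New-VHD -Path $vhd -SizeBytes 250GB -Dynamic | Out-Null
    Add-VMHardDiskDrive -VMName $vmName -ControllerType SCSI -Path $vhd
}

# Create and hot-add three 500GB dynamic VHDs ( simulated 7.2K HDDs )
1..3 | ForEach-Object {
    $vhd = Join-Path $vhdPath "SimHDD$_.vhdx"
    New-VHD -Path $vhd -SizeBytes 500GB -Dynamic | Out-Null
    Add-VMHardDiskDrive -VMName $vmName -ControllerType SCSI -Path $vhd
}
```

Hot-adding to the virtual SCSI controller works while the VM is running, which is why these steps don’t require shutting the VM down first.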

Now … a bit of PowerShell Magic!
After hot-adding the virtual SCSI hard disks above, you may notice that the Windows Server 2012 R2 guest operating system running inside the VM sees the new VHDs as SAS disks with an “Unknown” media type, as shown below using the Server Manager tool.

Virtual SCSI Hard Disks showing “Unknown” Media Type

Well, if the Windows Server 2012 R2 guest operating system doesn’t see these disks as “SSD” and “HDD” disks, it won’t know how to tier storage properly across them.  And, that’s where PowerShell comes in! Using PowerShell 4.0 in Windows Server 2012 R2, we can create a new Storage Pool and specifically tag the 250GB virtual hard disks with an SSD media type and tag the 500GB virtual hard disks with an HDD media type.  This will allow us to build a tiered storage lab environment with "simulated" SSD and HDD tiers.

Let’s get started with PowerShell …

  1. Launch the PowerShell ISE tool ( as Administrator ) from the Windows Server 2012 R2 guest operating system running inside the VM provisioned above.
  2. Set a variable to the collection of all virtual SCSI hard disks that we’ll use for creating a new Storage Pool:

    $pooldisks = Get-PhysicalDisk | ? { $_.CanPool -eq $true }
  3. Create a new Storage Pool using the collection of virtual SCSI hard disks set above:

    New-StoragePool -StorageSubSystemFriendlyName *Spaces* -FriendlyName TieredPool1 -PhysicalDisks $pooldisks
  4. Tag the disks within the new Storage Pool with the appropriate Media Type ( SSD or HDD ).  In the command lines below, I’ll set the appropriate tag by filtering on the size of each virtual SCSI hard disk:

    Get-PhysicalDisk | Where Size -EQ 267630149632 | Set-PhysicalDisk -MediaType SSD # 250GB VHDs

    Get-PhysicalDisk | Where Size -EQ 536065605632 | Set-PhysicalDisk -MediaType HDD # 500GB VHDs


    If you created your VHDs with different sizes than I’m using, you can determine the appropriate size values to include in the command lines above by running:

    (Get-PhysicalDisk).Size
  5. Create the SSD and HDD Storage Tiers within the new Storage Pool by using the New-StorageTier PowerShell cmdlet:

    $tier_ssd = New-StorageTier -StoragePoolFriendlyName TieredPool1 -FriendlyName SSD_TIER -MediaType SSD

    $tier_hdd = New-StorageTier -StoragePoolFriendlyName TieredPool1 -FriendlyName HDD_TIER -MediaType HDD
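
Before switching back to Server Manager, you can also sanity-check the pool from the same PowerShell session.  A quick verification sketch using standard Storage cmdlets:

```powershell
# Confirm each disk now reports the media type we tagged it with
Get-PhysicalDisk | Sort-Object Size |
    Format-Table FriendlyName, Size, MediaType, CanPool -AutoSize

# Confirm both tiers exist in the new pool
Get-StorageTier | Format-Table FriendlyName, MediaType -AutoSize
```

The 250GB disks should list as SSD and the 500GB disks as HDD; if any disk still shows “Unspecified” or “Unknown”, re-run the Set-PhysicalDisk tagging step for that size.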

At this point, if you jump back into the Server Manager tool and refresh the Storage Pools page, you should see a new Storage Pool created and the Media types for each disk in the pool listed appropriately as shown below.

Server Manager – New Tiered Storage Pool

How do I demonstrate this new Tiered Storage Pool?
To demonstrate creating a new Storage LUN that uses your new Tiered Storage Pool, follow these steps:

  1. In Server Manager, right-click on your new Tiered Storage Pool and select New Virtual Disk…
  2. In the New Virtual Disk wizard, click the Next button until you advance to the Specify virtual disk name page.
  3. On the Specify virtual disk name page, complete the following fields:
    - Name: Tiered Virtual Disk 01
    - Check the checkbox option for Create storage tiers on this virtual disk
    Click the Next button to continue.
  4. On the Select the storage layout page, select a Mirror layout and click the Next button.
    Note: When using Storage Tiers, Parity layouts are not supported.
  5. On the Configure the resiliency settings page, select Two-way mirror and click the Next button.
  6. On the Provisioning type page, click the Next button.
    Note: When using Storage Tiers, only the Fixed provisioning type is supported.
  7. On the Specify the size of the virtual disk page, complete the following fields:
    - Faster Tier (SSD): Maximum Size
    - Standard Tier (HDD): Maximum Size
    Click the Next button to continue.
  8. On the Confirm selections page, review your selections and click the Create button to continue.
  9. On the View results page, ensure that the Create a volume when this wizard closes checkbox is checked and click the Close button.
  10. In the New Volume Wizard select the default values on each page by clicking the Next button.  On the Confirm Selections page, click the Create button to create a new Tiered Storage Volume.
    Note: When using Tiered Storage, the Volume Size should be set to the total available capacity of the Virtual Disk on which it is created.  This is the default value for Volume Size when using the New Volume Wizard.
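
For repeatable demos, the wizard steps above can also be scripted.  Here’s a rough PowerShell sketch – it assumes the $tier_ssd and $tier_hdd variables from earlier, and the tier sizes are illustrative examples, so shrink them if creation fails for lack of free pool space:

```powershell
# Create a mirrored, tiered virtual disk ( Fixed provisioning is implied
# when tiers are used ). Tier sizes below are example values.
New-VirtualDisk -StoragePoolFriendlyName TieredPool1 `
    -FriendlyName "Tiered Virtual Disk 01" `
    -ResiliencySettingName Mirror `
    -StorageTiers $tier_ssd, $tier_hdd `
    -StorageTierSizes 100GB, 300GB

# Initialize, partition and format it as a single volume
Get-VirtualDisk -FriendlyName "Tiered Virtual Disk 01" | Get-Disk |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -AssignDriveLetter -UseMaximumSize |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel "TieredVol" -Confirm:$false
```

As with the wizard, the volume spans the full capacity of the virtual disk, which is the supported configuration for Tiered Storage.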

Congratulations!
You now have a new Tiered Storage Virtual Disk and Volume.  Any disk IO written to or read from this new volume will automatically be processed across Storage Tiers based on frequency of access.  Keep in mind that this configuration is intended for functional demos only – because we’re not really using SSD disks, you shouldn’t expect to see the same performance as in a production environment with real SSD and HDD disks.

What if I know a particular file is always frequently accessed?
Great question!
If there are particular files in your environment that you absolutely know are always frequently accessed – such as a Parent VHD used by multiple linked Child VHDs via Differencing Disks – an IT Pro can specifically pin those files to the SSD tier.  In this manner, Windows Server 2012 R2 doesn’t have to detect frequent access to keep the data blocks associated with those files in the fastest tier – they will always be there!

To pin a file to the SSD tier, use the new Set-FileStorageTier PowerShell cmdlet, as follows:

Set-FileStorageTier -FilePath <PATH> -DesiredStorageTier $tier_ssd

To un-pin a file from the SSD tier, use the new Clear-FileStorageTier PowerShell cmdlet, as follows:

Clear-FileStorageTier -FilePath <PATH>
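
A quick worked sketch of pinning in practice: after pinning, the file’s blocks move to the SSD tier during the next tier optimization pass, and you can trigger that pass immediately with Optimize-Volume rather than waiting for the nightly scheduled task.  The drive letter and file path below are example values, not from the article:

```powershell
# Pin an example parent VHD to the SSD tier ( path is illustrative )
Set-FileStorageTier -FilePath "E:\VMs\Parent.vhdx" -DesiredStorageTier $tier_ssd

# Run the tier optimization pass now so the pinned blocks move immediately
Optimize-Volume -DriveLetter E -TierOptimize

# Check the pin status of files on the tiered volume
Get-FileStorageTier -VolumeDriveLetter E
```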


More Stories By Keith Mayer

Keith Mayer is a Technical Evangelist at Microsoft focused on Windows Infrastructure, Data Center Virtualization, Systems Management and Private Cloud. Keith has over 17 years of experience as a technical leader of complex IT projects, in diverse roles, such as Network Engineer, IT Manager, Technical Instructor and Consultant. He has consulted and trained thousands of IT professionals worldwide on the design and implementation of enterprise technology solutions.

Keith is currently certified on several Microsoft technologies, including System Center, Hyper-V, Windows, Windows Server, SharePoint and Exchange. He also holds other industry certifications from IBM, Cisco, Citrix, HP, CheckPoint, CompTIA and Interwoven.

Keith is the author of the IT Pros ROCK! Blog on Microsoft TechNet, voted as one of the Top 50 "Must Read" IT Blogs.

Keith also manages the Windows Server 2012 "Early Experts" Challenge - a FREE online study group for IT Pros interested in studying and preparing for certification on Windows Server 2012. Join us and become the next "Early Expert"!
