Historically, data tiering's negatives outweighed its positives. Recently, that equation has been changing, and this approach to storage management is piquing the interest of enterprises.
Lower costs, improved automation and the emergence of solid-state drives (SSDs) are prodding more enterprises toward tiering deployments.
Tiering acknowledges that different data is valued differently by an enterprise: Not all information needs to be treated as high-priority.
Theoretically, tiering reduces storage costs and enables companies to collect more information by prioritizing data types and moving rarely used, noncritical information to less expensive solutions, thus increasing the size of the storage pool. This data storage technique also improves system performance by placing frequently used information on more responsive systems.
Despite the benefits, tiering was not widely adopted in the past. Enterprises encountered various roadblocks to implementation, especially cost. "Vendors often charged a premium for their tiering solutions," said Mark Peters, senior analyst at Enterprise Strategy Group Inc.
Early deployments of this storage approach required a fair amount of work by the IT department as well. Before installation, corporations had to determine the value of their data, examine their information flows and identify what data belonged where. Then IT set up tiers, placing information on different storage environments -- a practice that traditionally required technicians to touch many different devices. As a result, the old deployment process was tedious, manually intensive and time-consuming.
Given the dynamic nature of business today, data usage patterns change -- sometimes dramatically and quite quickly. This means tiers must be refreshed and rearranged at set times or intervals.
In addition, businesses need to understand data usage patterns and guard against aberrations, such as the Monday morning syndrome: If an array was quiet over the weekend, operational data gets moved to slower disk. On Monday morning, users report slow performance, and eventually the data storage group moves the information back to faster disk. This churn creates unnecessary work for often harried data center staffs.
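Automated tiering tools avoid that churn with simple recency rules. As a minimal sketch -- the tier names and thresholds here are assumptions for illustration, not any vendor's defaults -- a demotion window longer than a routine quiet period keeps a weekend of idleness from pushing operational data onto slow disk:

```python
from datetime import datetime, timedelta

# Illustrative tier rules: thresholds and tier names are assumptions.
# The demotion window (7 days) is deliberately longer than a weekend,
# so a quiet Saturday/Sunday does not demote Monday's working set.
TIER_RULES = [
    ("ssd", timedelta(days=7)),      # hot: touched within a week
    ("hdd", timedelta(days=90)),     # warm: touched within a quarter
]

def assign_tier(last_access: datetime, now: datetime) -> str:
    """Classify a piece of data by how recently it was accessed."""
    age = now - last_access
    for tier, max_age in TIER_RULES:
        if age <= max_age:
            return tier
    return "archive"                 # cold: everything older

now = datetime(2014, 3, 3)                      # a Monday morning
print(assign_tier(datetime(2014, 2, 28), now))  # quiet weekend -> still 'ssd'
print(assign_tier(datetime(2014, 1, 15), now))  # 'hdd'
print(assign_tier(datetime(2013, 6, 1), now))   # 'archive'
```

Production tools weigh more signals than last-access age -- I/O frequency, file size and business policy among them -- but the recency window is the piece that defuses the Monday morning syndrome.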
Work hard, store more
Recent advances have made tiering more attractive to businesses.
"Tiering software pricing has dropped," said George Crump, president and founder of Storage Switzerland.
Rather than push data tiering as an expensive add-on, vendors are including these capabilities as part of their standard storage management solutions. Dell Compellent Fluid Data storage, EMC's Fully Automated Storage Tiering, Hewlett-Packard's 3Par, IBM's Storwize and Oracle's Oracle Optimized Solution for Tiered Storage software are examples of bundled solutions.
These management tools are more intelligent and automate more of the data tiering process. Rather than having people manually gather data, many products do the evaluation work automatically. Recent advances in areas like big data and analytics fuel new storage performance analytics tools that provide IT managers with high-level snapshots of data usage. "The amount of time that data center staffs spend setting up and managing data tiers has been decreasing," Crump said.
SSDs are driving data tiering's growing popularity. Historically, SSD devices presented enterprises with a dilemma. On one hand, they make ideal top-tier storage because their performance is lightning fast: Customers report that SSDs deliver tenfold (or more) improvements for storage-intensive tasks, such as operating high-end database management systems, running nightly backups or bringing a new virtual machine online. In addition to speed, SSDs allow customers to reduce their storage rack space needs by 50% or more and cut power consumption by 50% to 75%.
But SSDs are expensive. Market research firm Gartner Inc. found that business-grade hard disk drive (HDD) systems cost about $0.05 per gigabyte, compared with about $1 per gigabyte for SSD solutions. For mission-critical storage, prices rise to $0.27 per gigabyte for HDD and $1.87 for SSD.
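Those per-gigabyte figures make the economics of tiering easy to check. In this back-of-the-envelope sketch, the 90/10 cold/hot split is an illustrative assumption, not a measured workload; the dollar figures are Gartner's business-grade numbers from above:

```python
# Gartner's business-grade figures; the 10% hot-data share kept on SSD
# is an illustrative assumption, not a measured workload.
hdd_cost, ssd_cost = 0.05, 1.00   # dollars per gigabyte
ssd_fraction = 0.10               # share of the pool kept on the fast tier

blended = round(ssd_fraction * ssd_cost + (1 - ssd_fraction) * hdd_cost, 3)
print(f"tiered: ${blended}/GB vs all-SSD: ${ssd_cost}/GB")
# 0.10 * 1.00 + 0.90 * 0.05 = 0.145 -- roughly a seventh of all-SSD cost
```

The wider the price gap between tiers, the more a thin SSD layer over bulk HDD pays off -- which is exactly the pressure pushing enterprises toward tiering.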
SSD purchases are rising. "[SSDs] are now gaining significant momentum in the enterprise," said Enterprise Strategy Group's Peters. In 2009, vendors shipped 280,000 SSD units, accounting for $450 million in worldwide revenue, found Gartner. Those numbers increased to 5.3 million units and $1.9 billion in revenue in 2013.
The growing use of SSD is creating a ripple effect: Enterprises are ditching HDD storage solutions entirely. SSDs are replacing high-capacity Serial ATA (SATA), serial-attached SCSI (SAS) and Fibre Channel drives as top-tier storage solutions.
Storage revenue fell 3.5% year-over-year, totaling $5.7 billion during the third quarter of 2013, according to International Data Corp., which indicates that storage prices are dropping overall since volumes are increasing by double digits. Data center hardware is a commodity with falling prices. In addition, SSD pricing dropped by double digits in the past few years, a trend that will increase its use and drive more data tiering deployments.
About the author:
Paul Korzeniowski is a freelance writer who specializes in data center issues. He has been covering technology for two decades, is based in Sudbury, Massachusetts, and can be reached at firstname.lastname@example.org.