New storage options are blurring the delimiters once evident among different data types.
Data tiering, which started as a fairly straightforward concept, has become quite convoluted -- which means enterprises might find it harder to stage data.
Data tiering has been a touted strategy for years, because different data groups have different values to the enterprise. "Not all information needs a Mercedes-level storage system," said Forrester Research senior analyst Henry Baltazar -- and the cost savings are significant.
Storage tiers place the highest-value, most performance-sensitive data on the most costly storage drives. Less time-sensitive data migrates to lower-cost hardware. Rather than pay for top-of-the-line storage across the board, corporations mix in lower-cost alternatives.
Three storage tiers are common:
- Tier 1 is fast, expensive storage, such as Fibre Channel drives. These storage systems are often reserved for important, complex applications, such as database management systems, and include sophisticated scalability and reliability features.
- Tier 2 storage consists of slower, less expensive disk systems that support applications such as email. Companies invest less in making sure this data is available 24 hours a day, seven days a week.
- Tier 3 takes care of backup and recovery systems, usually stored on low-cost disk or even tape drives. Here, information recovery can take hours or days.
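The three-tier split above amounts to a placement policy: classify each data set by business value and access recency, then map it to a tier. A minimal sketch of such a policy is below; the age thresholds, the `assign_tier` function name and the `business_critical` flag are illustrative assumptions for this example, not an industry standard.

```python
from datetime import datetime, timedelta
from typing import Optional

# Illustrative thresholds -- real policies vary by enterprise and data set.
TIER2_AGE_DAYS = 30    # data untouched this long drops out of Tier 1
TIER3_AGE_DAYS = 180   # data untouched this long moves to backup/archive

def assign_tier(last_access: datetime, business_critical: bool,
                now: Optional[datetime] = None) -> int:
    """Map a data set to a storage tier (1 = fastest/costliest, 3 = archive)."""
    now = now or datetime.utcnow()
    age = now - last_access
    if business_critical or age < timedelta(days=TIER2_AGE_DAYS):
        return 1   # Fibre Channel or SSD-class storage
    if age < timedelta(days=TIER3_AGE_DAYS):
        return 2   # slower, cheaper disk (e.g., email archives)
    return 3       # low-cost disk or tape for backup and recovery
```

In practice, this is the kind of rule automated migration tools apply continuously, rather than something IT staff evaluate by hand every six months.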
Problems with data tiering
Data tiering never really took off. "Frankly, fewer companies have adopted tiering than [storage] vendors would like to admit," said George Crump, president of Storage Switzerland, a consulting firm.
The data-tiering process requires IT departments to do a fair amount of up-front work. Corporations evaluate the value of their data, examine their information flows and determine what data belongs where -- which all takes time. Given the dynamic nature of business, these deductions must be updated periodically -- say, every six months. That's bad news for already-harried data center staff.
Creating tiered storage is also manually intensive. Allocating storage requires IT technicians to touch a lot of devices. But this is changing.
"The data migration tools vendors offer have become more sophisticated and offer much more automation than they did in the past," said Mark Peters, senior analyst at Enterprise Strategy Group. This is cutting down on data migration time.
While automation has improved the tiering process, other recent technical advances blur the distinctions among the three tiers. "Solid-state drives (SSDs) are now gaining significant momentum in the enterprise," Peters said.
Typically, SSDs are the top tier because their performance is lightning fast. But they're expensive -- sometimes 10 times the price of other storage products. Some enterprises mix SSDs with other high-cost, high-performance storage arrays to lessen the sticker shock. You will sometimes see SSD storage labeled Tier 0 or Tier 1½ to convey what a high-value, high-cost solution it is.
Clouds rolling in
"I’ve been surprised that SSDs have begun to play a very significant role in archiving, backup and recovery applications," Baltazar said. SSDs are used in cloud storage because enterprises are backing up high-bandwidth media, such as pictures and video, that tax traditional backup storage solutions. Cloud storage vendors are also adopting the technology to satisfy speed requirements as enterprises move information from their offices to the cloud.
Cloud computing is having a significant effect on data tiering. In a growing number of cases, corporations are using the cloud as a Tier 3 option to back up their information, instead of using cheap physical storage.
User behavior also muddles neat data tiers. Employees open cloud accounts and store company documents there. "Increasingly, businesses are being forced to put policies in place to determine how to deal with user-stored data," Crump said.
Enterprises also have to take into account market flux when planning a storage scheme. SSD pricing has been dropping by double digits in the past few years. The underlying economics of a data-tiering plan can quickly become outdated.
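The point about outdated economics can be made concrete with a blended-cost calculation. The per-gigabyte prices and capacity mix below are purely illustrative, not actual market figures: as the Tier 1 (SSD) price falls, the savings of a tiered layout over an all-Tier-1 layout shrink, which can undermine the original case for the plan.

```python
def blended_cost_per_gb(mix, prices):
    """Capacity-weighted average $/GB for a given tier mix (fractions sum to 1)."""
    return sum(mix[tier] * prices[tier] for tier in mix)

# Purely illustrative $/GB figures -- not real market prices.
prices_then = {"tier1": 1.00, "tier2": 0.30, "tier3": 0.05}
prices_now  = {"tier1": 0.40, "tier2": 0.25, "tier3": 0.04}  # after SSD price drops
mix = {"tier1": 0.2, "tier2": 0.3, "tier3": 0.5}  # fraction of capacity per tier

# Savings versus putting everything on Tier 1 shrink as Tier 1 gets cheaper.
savings_then = prices_then["tier1"] / blended_cost_per_gb(mix, prices_then)
savings_now = prices_now["tier1"] / blended_cost_per_gb(mix, prices_now)
```

Under these assumed numbers, tiering goes from roughly a 3x cost advantage to closer to 2x, which is why a tiering plan costed out even a year or two ago may need to be re-run.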
Corporations have more choices for stacking their storage systems than ever before -- to the point of having too many options. To make good decisions, enterprises may call in outside consulting services and pay for training courses for their employees.
If you can slog through the short-term confusion of tiers, SSDs and cloud evolution, the data center stands to benefit from greatly enhanced storage efficiency.
About the author:
Paul Korzeniowski is a freelance writer who specializes in cloud computing and data-center-related topics. He is based in Sudbury, Mass., and can be reached at firstname.lastname@example.org.