This article can also be found in the Premium Editorial Download "Virtual Data Center: The data center of the future."
Planning a modern data center is a challenging exercise. Designers must include the obvious technological considerations, such as servers, storage, management, power, cooling and space. But as organizations become more accessible to a variety of internal and external users, data center planners must also consider the social and market forces that change how employees, vendors and customers use computing resources. In this tip, we'll review some of the most important strategic issues facing data center planners.
Virtualization is still evolving in the data center
Virtualization has had an enormous effect on computing efficiency. By hosting multiple virtual machines (VMs) on the same physical host, the utilization of computing resources can be increased dramatically. The net result is lower hardware, power, cooling and space requirements for the data center. However, according to David J. Cappuccio, a Gartner Inc. analyst, typical servers run at only 25% to 30% utilization. That leaves plenty of headroom for additional virtualization growth across the data center. Virtualization adoption will also accelerate as the technology takes on more prominence in other areas of data center infrastructure, such as storage and networking.
The bigger potential for virtualization is in handling a data center's scale. Because virtualization abstracts the computing workload from the hardware running underneath, a workload is no longer tied to one computing platform. This enables workload migration between servers regardless of their physical location, so multiple physical facilities can be treated as a single "data center." Consider migrating workloads from a server in one region, where it is night and local utilization is low, to a server in another region, where it is morning and local utilization is high, and then back again, day in and day out. The offloaded servers can then shut down to save power or remain on standby for disaster recovery purposes until it's time to move the workload again. It's an efficiency concept that is catching on as organizations develop the tools and support needed for IT to "chase the daylight."
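To make the "chase the daylight" idea concrete, here is a minimal Python sketch of the scheduling decision behind it. The site names, UTC offsets, utilization figures and the "night" window are all illustrative assumptions, not part of any real migration product; a production scheduler would also weigh network cost, data locality and service-level agreements.

```python
from datetime import datetime, timezone, timedelta

# Hypothetical sites with fixed UTC offsets and current utilization (assumed values).
SITES = {
    "us-east": {"utc_offset": -5, "utilization": 0.72},
    "eu-west": {"utc_offset": 1, "utilization": 0.18},
}

def local_hour(utc_offset, now=None):
    """Return the local hour (0-23) at a site given its UTC offset."""
    now = now or datetime.now(timezone.utc)
    return (now + timedelta(hours=utc_offset)).hour

def pick_migration_target(sites, now=None):
    """Pick the site where it is night (low local demand) to host movable workloads.

    This sketch simply prefers an off-peak site, breaking ties by
    choosing the least-utilized one.
    """
    def score(name):
        site = sites[name]
        hour = local_hour(site["utc_offset"], now)
        off_peak = hour < 6 or hour >= 22  # crude "night" window
        return (not off_peak, site["utilization"])  # False sorts before True
    return min(sites, key=score)
```

Run nightly (or continuously), the chosen target receives the migrated workloads while the off-loaded servers shut down or drop to standby.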
Use your data and store it efficiently
According to Cappuccio, the amount of system data stored has increased 800% over the last decade, and about 80% of that enormous ocean of data is unstructured. Since you're already collecting that data, storing it, backing it up and replicating it, it only makes sense to work with it intelligently and use analytics tools to extract as much business value from it as possible. Data mining and analytics can help a business spot and exploit trends that may otherwise go unnoticed.
Storage is also under pressure to perform. Companies simply cannot afford Fibre Channel or other high-performance storage systems for data that is only accessed occasionally. Nor can human administrators be burdened with moving that data around the data center manually. Organizations must adopt a tiered storage architecture that can assess the relative importance of data and move it automatically to the appropriate tier. The tools to support this capability are available and improving every day.
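A simple automated tiering policy can be sketched in a few lines. The tier names and age thresholds below are illustrative assumptions (real products score data on many more signals than last-access age), but they show the basic mapping a tiered architecture automates.

```python
# Illustrative tiers: (max days since last access, tier name).
TIERS = [
    (7, "tier1-fc"),                  # touched in the last week: high-performance
    (90, "tier2-sata"),               # touched in the last quarter: capacity disk
    (float("inf"), "tier3-archive"),  # anything older: archive/tape
]

def target_tier(days_since_access):
    """Map a file's age since last access to a storage tier."""
    for max_age, tier in TIERS:
        if days_since_access <= max_age:
            return tier
    return TIERS[-1][1]  # unreachable given the inf sentinel, but explicit

def plan_moves(files):
    """Given {path: days_since_access}, return the target tier for each path."""
    return {path: target_tier(age) for path, age in files.items()}
```

A background job would run `plan_moves` over the catalog and issue the actual migrations, relieving administrators of the manual shuffling described above.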
Monitor and manage energy efficiency
Power costs a lot of money. A moderately sized 25,000-square-foot data center can consume more than $4 million worth of energy per year, Cappuccio said. At that level, investments in energy monitoring and management platforms, such as data center infrastructure management (DCIM) tools that track metrics like power usage effectiveness (PUE), and the power reduction projects that follow are often easy to justify with simple return on investment calculations.
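The two calculations behind that justification are straightforward. PUE is defined as total facility power divided by IT equipment power (1.0 is the theoretical ideal), and a rough annual energy bill follows from the load and the utility rate. The specific kilowatt and rate figures below are assumptions for illustration only.

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power usage effectiveness: total facility power / IT equipment power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

def annual_energy_cost(total_facility_kw, dollars_per_kwh):
    """Rough yearly energy bill, assuming a constant load (24 h x 365 d)."""
    return total_facility_kw * 24 * 365 * dollars_per_kwh
```

For example, an assumed 4.5 MW facility at $0.10 per kWh comes to roughly $3.9 million a year, in line with the figure Cappuccio cites, so even a modest PUE improvement translates into a six- or seven-figure saving.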
But today’s energy concerns are more than a matter of money. Government agencies around the world are starting to tax carbon emissions, so standards like carbon usage effectiveness are gaining importance in helping companies ensure environmental compliance. Public perceptions of pollution and wasted energy are also prompting companies to cut energy use as a matter of social responsibility.
Consider that a new server may provide four times the computing capability of the previous generation in the same form factor, yet use only half the power. This actually accelerates the technology refresh cycle, allowing companies to buy powerful new servers sooner, gain more computing resources, and recover the capital investment through energy savings. The power efficiency of new IT equipment has become about as important as its performance. A company can now perform many server upgrades in place, systematically multiplying its computing capability without drawing more power or taking more space.
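The refresh arithmetic is worth making explicit. Using the article's figures (four times the capability at half the power) as defaults, this sketch computes how many new servers match an old fleet's capacity and how much power the swap frees up; the fleet size and per-server draw are hypothetical.

```python
def refresh_savings(old_servers, old_kw_each, compute_factor=4, power_factor=0.5):
    """Estimate an in-place refresh: each new server delivers `compute_factor`
    times the work of an old one at `power_factor` of its power draw.

    Returns (new servers needed to match old capacity, kW saved).
    """
    new_servers = -(-old_servers // compute_factor)  # ceiling division
    old_power_kw = old_servers * old_kw_each
    new_power_kw = new_servers * old_kw_each * power_factor
    return new_servers, old_power_kw - new_power_kw
```

With an assumed fleet of 100 servers at 0.4 kW each, 25 new servers deliver the same capacity while cutting the load from 40 kW to 5 kW, which is why the energy savings can underwrite the refresh itself.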
Plan for mobile computing and mobile devices
The platforms that employees, vendors and customers use, and the way they use computing resources, are changing, and IT planners must accommodate those changes in new data centers. Consider the prevalence of tablets and hand-held devices, along with the ready availability of downloadable applications. The goal for developers is to create applications that enhance interaction through unified communication and collaboration. Rather than simply porting existing PC-based applications to hand-held devices, the real challenge is to re-imagine important tasks in ways better suited to mobile devices, ways that truly take advantage of the intelligence modern mobile devices provide.
Mobile devices are not likely to replace traditional PCs, but as new applications prompt new use cases, mobile devices will fill certain roles for the foreseeable future. However, mobile devices can carry significant amounts of data, and this poses two problems for IT. First, there are security concerns about data theft, much like earlier concerns about flash drives, that IT needs to guard the business against. Even more troubling is that mobile devices hold increasing amounts of data that is not protected at all. A mobile phone may contain contact information, calendar records, pictures, video and other data that is not duplicated on a laptop or desktop PC. IT will need to think about ways to back up and protect that kind of irreplaceable mobile device data.
Take another look at IT staffing and retention
A large concentration of baby boomers is preparing to retire, taking an irreplaceable wealth of undocumented experience and expertise with them. As more junior IT staff take over, the business will begin to see a shortage of management experience and a shrinking scope of technical skills. Few IT staff will have the breadth and depth of technical and soft skills needed to master advancing technologies without support from mentors or peers able to share what they know. Companies need to recognize that these staffing trends will become a major impediment to data center support and growth unless new emphasis is placed on training and skill exchange.
Consider ways that social networking affects IT
Some 1.9 billion people now use the Internet, and 33% of them are on Facebook, Cappuccio said. That makes social media an important factor in how businesses run and maintain their image in online communities. Just consider how often you base buying decisions on the experiences you solicit from others. Companies are actively drafting canned "corporate" responses to tweets and taking other steps to formalize the way their employees represent the company through social media. Future data centers must consider the computing needs of their social media users and implement the oversight and security needed to support business use of social media outlets.
Alternate infrastructures are taking hold
There are times when data centers are simply too much for the existing business or IT staff to handle, but there are options to streamline and simplify the data center infrastructure.
One popular alternative is the cloud, which provides agility for a business while minimizing the expense. There are certainly concerns with public cloud options, including reliability and security. But for non-critical applications and data, the public cloud is emerging as a viable tool for commodity computing. Outsourcing non-critical tasks to the public cloud can also simplify the data center by further reducing the number of servers and workloads that IT staff needs to support. When more critical tasks require a greater level of automation and agility, it might be time to consider an in-house private cloud or a hybrid cloud that combines the features of both approaches.
Integrated infrastructures, such as Cisco's UCS, are also slowly gaining traction. The concept of a highly integrated server, storage and networking infrastructure that is tuned and ready to go right out of the box is appealing in principle. Still, fear of vendor lock-in and concerns about support for heterogeneous devices have slowed adoption of this technology. As vendors begin to support one another's systems, interest in integrated infrastructures will grow.
This was first published in October 2011