Expert Predictions

Data center predictions 2008

Last year I made a list of data center trends and predictions that, when I look back now, seems kind of lame, especially in light of Bob Plankers' blog post on data center predictions. But these lists give readers a new year's trend roundup and help ease SearchDataCenter.com's editors back into work mode. So I set out to do a better job this year, got a bit bolder, and hopefully will get more right than wrong.

Table of contents
ASHRAE owns green computing
Data center construction booms, then busts
Users who can make the most of Windows Server 2008 will
Thin clients: PCs move into the data center
More to manage outside the data center
Data center automation inches forward
Government data centers catch up with green computing
Say goodbye to the super-user
Utilization is out; I/O and memory bottlenecks are in
Modular data center infrastructure

1. ASHRAE owns green computing. Move over, Green Grid, the American Society of Heating, Refrigerating and Air-Conditioning Engineers Inc. (specifically ASHRAE Technical Committee 9.9) is the organization that will shape the way companies tackle data center energy efficiency. The group is expanding the temperature and humidity range recommendation for servers (enabling companies to use less energy and potentially take advantage of air- and water-side economizers), and it will push server vendors to accept the organization's recommendations. The committee will soon publish a book to standardize the way data center pros measure energy use. And in December 2007, ASHRAE Journal published its proposed points-based system for rating the "green-ness" of data centers. These people are engineers; they solve problems. You won't see them twiddling their thumbs, signing "memorandums of understanding". To boot, ASHRAE has longevity, a publishing apparatus and large user base, and it is vendor neutral.

2. Data center construction booms, then busts. More than 22% of respondents to our November 2007 data center construction survey said they had built or renovated their data centers in 2007. More than 60% of respondents said they would be involved in data center construction in 2008. The figures are consistent with similar findings from data center real estate firm Digital Realty Trust Inc. The expansion is a boon for IT manufacturers, because many companies will use the opportunity to upgrade to new technologies. But the impetus for "green computing," consolidation and virtualization may also flatten out-of-control growth curves -- at least somewhat. And from Wall Street to data center insiders, experts expect this current frenzy to result in overbuilding, creating a glut of data center space in about two years.

3. Users who can make the most of Windows Server 2008 will. A lot of users aren't champing at the bit for Windows Server 2008 to ship this February. John Enck at Stamford, Conn.-based research firm Gartner Inc. advised users (those running Windows 2003, at least) to take their time, and Microsoft's virtualization offering, Hyper-V, will be in beta until the summer. But for the folks who do upgrade to Windows Server 2008, its user-friendly features may be worthwhile, according to Tony Iams, an analyst at Ideas International in Rye Brook, N.Y.

For starters, Microsoft has introduced the concept of "roles," which will simplify the installation and management of Windows servers. Instead of installing the proverbial kitchen sink by putting every Windows Server feature on every Windows machine, roles let users install a specific profile, such as "Web server" or "File/print," that is tailored to a particular usage scenario. "There are 18 roles you can pick from," Iams said. "You push a button, it asks a couple questions, and it will install all of the software you need for that role and leave out the unnecessary features." The benefits are twofold: The server is easier to install, and it is more secure and reliable. "There are fewer moving parts," Iams said. "You're not patching and protecting stuff you don't need."
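To give a feel for how roles work in practice, here is a minimal sketch, not taken from the article: Windows Server 2008 also ships a console counterpart to the Server Manager wizard, ServerManagerCmd.exe, which can query and install roles from an administrative PowerShell prompt. The role ID used below (Web-Server, for IIS) is the sort of name you would confirm against the -query output on your own machine before relying on it.

    # List the available roles and features, with markers next to those already installed
    ServerManagerCmd.exe -query

    # Install the Web Server (IIS) role, pulling in only the components that role needs
    ServerManagerCmd.exe -install Web-Server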

Another new feature of Windows Server 2008 is PowerShell, a scripting system that allows admins to type in powerful commands to configure the system and write scripts that can automate management tasks. "You can always get a lot more done typing commands than you can with a mouse and GUI," Iams said. "Unix and the mainframe have had this forever."
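As a small, hypothetical illustration of the kind of task Iams is describing (it is not an example from Microsoft or from the article), the pipeline below uses stock PowerShell cmdlets to find Windows services that are set to start automatically but are currently stopped, then writes them to a CSV report; the output file name is arbitrary.

    # Find services configured to start automatically that are not running,
    # then export the list to a CSV file for follow-up.
    Get-WmiObject Win32_Service |
        Where-Object { $_.StartMode -eq 'Auto' -and $_.State -eq 'Stopped' } |
        Select-Object Name, DisplayName, StartMode, State |
        Export-Csv -Path stopped-services.csv -NoTypeInformation

A Unix admin would recognize this as the same pipe-and-filter style that shell one-liners have used for decades, which is exactly Iams' point.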

Will Windows Server 2008 encourage a switch from Unix or Linux? It's hard to say, according to Iams. "Is it as flexible as Unix or Linux? No," said Iams. "But it's more flexible than before. Put PowerShell on top of it, and you have something that will make Windows much more like what you had on a Unix system. That's a very deep part of the Unix/Linux experience, and it will get the attention of people in that camp."

4. Thin clients: PCs move into the data center. As an increasing number of companies bring PCs into the data center for management, security and energy-efficiency benefits, thin clients are gaining traction. Employees who don't need mobile capabilities are potential candidates for thin-client computing, and new technologies like PC blades and desktop virtualization are making thin clients even more attractive. In our 2007 data center purchasing intentions survey, 42% of respondents had implemented thin clients, and this percentage will grow in 2008.

"People spend a lot of time and money managing their desktop units," Iams said. "Moving the desktop to the server is a good thing. This isn't a new idea; it's been growing in fits and starts for about 10 years." According to Iams, the new functionality of Microsoft Terminal Services in Windows Server 2008 will offer greater flexibility for hosting a desktop on the server.

Northern California utility Pacific Gas and Electric Co. recognizes the benefit of thin clients on the energy front. In an interview, PG&E's Mark Bramfitt, supervisor of the customer energy-efficiency program for the high-tech market, said thin clients are going to be the next big thing for the electric company's IT energy-efficiency rebates. "If you can take 1,000 PCs and virtualize them on two boxes in the data center, there is big-time energy efficiency to be had there," Bramfitt said.
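As a purely illustrative back-of-the-envelope calculation (the figures here are assumptions, not PG&E's): if a typical desktop draws about 100 watts, 1,000 PCs consume roughly 100 kilowatts; replace them with thin clients at perhaps 15 watts apiece plus a couple of heavily loaded host servers, and the same users might draw on the order of 20 kilowatts.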

5. More to manage outside the data center. While some data center managers will reel desktop users back into the data center in 2008, others will struggle to manage a greater number of applications outside of their control. The increasing use of Software as a Service, social networking sites and unified communications means data center managers must monitor more applications running outside their domain than ever before.

For data center managers, Web-based services are a double-edged sword: you may not maintain the software, but you're still responsible for service levels, security and compliance. "There is a management responsibility even if you're a consumer," said Carl Claunch, a research vice president at Gartner.

In 2008, data center managers will seek answers on this issue, and you can expect a slew of vendors to promise solutions.

6. Data center automation inches forward. According to Gartner, about one-third of data center managers attending the 2007 Gartner Data Center Conference have moved beyond the tire-kicking stage and are ready to buy into data center automation tools. While data center managers have been slow to adopt automation, the tools' capabilities have advanced substantially, and the benefits of using them have grown.

While some companies have made the leap, CA's chief technology officer, Al Nugent, said data center automation will get interesting over the next two to five years. "We work in an industry where software is none too reliable," Nugent said. "Would you really want to trust your data center to software without adult supervision?"

For now, data center managers will likely use automation tools to tell them when something needs to be fixed. In the coming year, if users see that the alerting and automated-remediation approaches work, they may get comfortable enough to put policies in place and start receiving alerts only after something has already been fixed, Nugent said.

7. Government data centers catch up with green computing. The Environmental Protection Agency (EPA) data center report called on federally operated data centers to adopt energy-efficiency measures. While the report lacked specifics about how this should happen, it set a clear expectation that government facilities pursue green computing.

News writer Mark Fontecchio dug up this excerpt from page 109 of the report:

The federal government can act as a model in encouraging improved efficiency of data centers, by reporting energy performance, conducting assessments, and implementing efficiency improvements in its own data centers.
Federal government: Commit to publicly reporting the energy performance of its data centers, once standardized metrics are available. The federal government should commit to conducting energy-efficiency assessments in all its data centers within two to three years and implement all cost-effective operational improvements.

Jay Fry, vice president of marketing at power management software vendor Cassatt Corp., said that private-sector companies are way ahead of the public sector on data center energy efficiency. But in 2008, the government will get onboard with green technologies, he predicted.

"The EPA report kicked off a process of discovery steps and policy steps," Fry said. "It's a many-year cycle, but the cycle has started." He mentioned Gartner analysts who said they had received an increasing number of inquiries about how to deal with the power problem from government agencies. "Even without an official federal mandate or legislation in place yet, some agencies are starting to investigate solutions in this space."

8. Say goodbye to the super-user. According to Rob Soderbery, the senior vice president of the Storage Foundation group at Symantec Corp., the super-user will become extinct. "In the old days, sys admins were Unix hackers who lived in the basement of the computer science building and wrote code since the day they were born. Now those people are all architects, engineers and VPs; there isn't a next generation behind them."

Thanks to increased standardization throughout the data center and the push for ITIL-based operations, companies don't need an army of specialists anymore, Soderbery said.

"In the old days, you had to have experts across the entire stack," Soderbery said. "Now instead of having guys that understand every nook and cranny, you've got an engineering team that builds out a standardized environment. The old model wasn't beneficial. Each system had its own expert, with its own scripts and processes they don't want you to mess with. If you're running a factory, you don't let everybody do what they want to do."

Nugent from CA agrees. "The notion of the single super-user with the scepter controlling the thing is as passé as the high priests of the old glass data centers," he said. "There has been an evolution."

9. Utilization is out; I/O and memory bottlenecks are in. The x86 server utilization rates of 5% to 10% have weighed on data center managers in recent years, but thanks to server virtualization, that problem is disappearing and a new one is on the horizon. In its 2008 data center predictions, Lightspeed Venture Partners nailed it: "As server racks are populated with more cores per CPU and more VMs [virtual machines] per core, memory and network I/O limitations will become priority concerns," the firm said. "How VMs share those physical resources will impact overall system performance and significantly influence the rate at which mission-critical applications are run in virtualized environments."
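A rough, hypothetical example of the squeeze (the figures are assumptions chosen for illustration, not from Lightspeed): a two-socket, quad-core host running 16 VMs at 2 GB of memory apiece needs 32 GB of RAM before hypervisor overhead, and if each VM averages even 100 Mbps of network traffic, the host's pair of Gigabit Ethernet ports is already running near 80% of capacity -- long before its eight cores run out of cycles.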

The folks with large scale-out computing environments have seen this coming for some time. Back in 2006, SearchEnterpriseLinux.com reported on the looming I/O crisis for large virtualization deployments: "The benefits of consolidating servers in the near term are obvious. … However … the industry should be laying the groundwork to address a looming traffic jam of data from overlooked I/O capacity."

10. Modular data center infrastructure. Traditionally, companies have bought more infrastructure -- more air-conditioning headroom and more backup power -- than they need. The practice is highly inefficient, though, so more people are turning to smaller, modular computer room air conditioner (CRAC) units and uninterruptible power supply (UPS) systems. Further, people are less certain about their future infrastructure needs and are thus building out data centers incrementally.

Products like Eaton Corp.'s BladeUPS, American Power Conversion Corp.'s InRow cooling units and Emerson's Liebert NX with Softscale allow users to build out infrastructure in a modular fashion rather than buying large units sized for potential future capacity and running them inefficiently for the life of the equipment. So look for data center managers to purchase more modular equipment in 2008.

These are my best guesses for 2008. I think I'm on pretty safe ground here, but if you think any of these trends are off base or have ideas for developments that I missed, send me an email at mstansberry@techtarget.com, and have a great 2008.

You can also check out our Server Specs blog.


This was first published in January 2008
