The old adage "You can't manage what you can't measure" has become a familiar mantra in data center circles, and data center managers have found myriad ways to measure data center metrics beyond the usual speeds and feeds of IT equipment.
Since energy consumption has become a major concern for data centers -- only 12% of respondents in a SearchDataCenter.com survey this year said it was not a priority -- data center pros have begun describing their facilities in terms of megawatt capacity rather than square footage.
With this new way to express energy consumption has come a greater awareness of how to measure all environmental information in a data center, including power consumption, temperature, humidity and airflow. Data center pros now combine time-tested methods, such as readouts at the uninterruptible power supply (UPS), with newer wireless sensors. And they now use that data for capacity planning, tracking their data center energy efficiency and determining their carbon footprint.
"We look at the efficiency of every data center," said Christian Belady, the data center power and cooling architect at Microsoft. "There is so much data, and we're constantly adding more columns to track and monitor, and then we can do more data mining."

Data center data gathering
There are dozens of ways that data center managers now collect power data. Some measure the power output number on UPSes; dividing that number by the number of IT equipment racks served by the UPS gives an approximate per-rack average. Drilling down further, many power distribution units (PDUs) -- essentially power strips for data centers -- can provide detailed power information for each circuit and insight into the consumption of each rack.
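The UPS-based approximation above is simple arithmetic; a minimal sketch, where the 120 kW output and 24-rack count are illustrative numbers, not figures from the article:

```python
# Approximate average rack power from a UPS readout -- a rough estimate,
# since individual racks can draw far more or less than the mean.

def avg_power_per_rack(ups_output_kw: float, rack_count: int) -> float:
    """Divide total UPS output by the number of racks it serves."""
    if rack_count <= 0:
        raise ValueError("rack_count must be positive")
    return ups_output_kw / rack_count

print(avg_power_per_rack(120.0, 24))  # 5.0 kW per rack, on average
```

Per-circuit PDU readings refine this average into actual per-rack numbers.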
But power consumption is just one piece of the data puzzle, said Alex Carroll, the co-owner of Indianapolis-based colocation provider Lifeline Data Centers. The company monitors its chilled-water plant outside, tracking the temperature of water going into and coming out of the data center, the water volume going to the cooling tower, and whether the compressor is low on oil. Inside, the company uses remote sensors from Geist Manufacturing to measure temperature, humidity and airflow at each of its air handlers.
"If these [metrics] start to fall out of range, you can run into problems downstream," he said.
Plenty of vendors offer power and environmental data collection devices. Earlier this month, Raritan Inc. introduced new power meters that can measure voltage, current and kilowatt-hours, as well as temperature and humidity via plug-in sensors.
"What we see customers doing is tracking and trending this data over time," said Herman Chan, head of Raritan's power business unit. "Then they know if they're operating within the safe range of environmental conditions that that piece of equipment requires."
Then there is Microsoft, which created its own sensors. It uses its "Scry" sensor system to gather data on power, temperature, power usage effectiveness (PUE) and carbon emissions, among many other measures.

Analyzing data center environmental data
Of course, having the data and doing something with it are different things. A data center facilities director for a major health care organization said that one of his company's data centers collects almost 20,000 environmental data points. A huge mound of data is like a jigsaw puzzle, and it won't make any sense until data center pros begin putting the pieces together.
This can be done with tools as simple as a spreadsheet. Data centers could collect power output numbers from the UPS periodically throughout the day, record them in a spreadsheet and create graphs on their own to monitor trends in the data over time.
"Don't use the fact that you don't have sophistication as an excuse," Belady said. "You should always use whatever data you have available to you."
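Even basic data supports a derived metric such as PUE, which Microsoft's Scry system tracks: total facility power divided by IT equipment power. A minimal sketch, with the 1,500 kW and 1,000 kW figures purely illustrative:

```python
# Power usage effectiveness (PUE): how much total power the facility draws
# for every watt actually consumed by IT equipment. 1.0 is the ideal floor.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

print(pue(1500.0, 1000.0))  # 1.5 -- 0.5 W of overhead per watt of IT load
```

Both inputs are numbers most facilities can already pull from the utility meter and the UPS.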
Carroll at Lifeline Data Centers said most of the company's data comes in automatically through a remote desktop, where the IT staff can then predict potential problems in the facility, such as localized hot spots, as well as look at trends to do capacity planning. That way they know if they need to bring more power into a facility, or start thinking about building a new one down the road.
There are plenty of vendors here too. Part of Hewlett-Packard's new Converged Infrastructure news this week concerns the company's "dynamic smart grid" comprising parts of HP OpenView and Insight Control software that promise to enable a full view of the data center using wireless sensors.
"The data center would know where the hot spots are and could also plan for brown-out protection for applications inside Insight Control. It could shut down noncritical apps or move them so a data center can plan for a minor disaster or brownouts," said Doug Oathout, the VP of HP enterprise servers and networking.

Hurdle: Differing protocols on IT, facilities sides
One thing data center pros still need, however, is a way to more easily connect facility environmental data with systems management software. Carroll said this task is made more difficult by diverging protocols. On the IT side, just about everything speaks SNMP, which can tie into big software packages such as IBM Tivoli, HP OpenView or CA Unicenter. On the facilities side, the protocols are different and stem from the building maintenance world. They are more general, meant to monitor comfort cooling levels or other data in any kind of building, not just a data center.
"There isn't anything to my knowledge that can bridge that over so the computer world can pick it up and do something with it," Carroll said.
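A bridge of the sort Carroll describes would, at minimum, normalize both feeds into a common record that management software could consume. A purely hypothetical sketch -- the field names, OID and sample payloads are all invented, and real SNMP or building-automation polling is elided:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    source: str   # "snmp" (IT side) or "facilities" (building side)
    metric: str   # e.g. "pdu_current_a", "chiller_supply_temp_c"
    value: float

def from_snmp(oid: str, raw: int) -> Reading:
    # Hypothetical: a PDU reporting current in tenths of an amp per OID.
    return Reading("snmp", f"pdu_current_a:{oid}", raw / 10.0)

def from_facilities(object_name: str, present_value: float) -> Reading:
    # Hypothetical: a building-automation point passed through unchanged.
    return Reading("facilities", object_name, present_value)

feed = [
    from_snmp("1.3.6.1.4.1.0.0", 47),
    from_facilities("chiller_supply_temp_c", 7.2),
]
for r in feed:
    print(r.source, r.metric, r.value)
```

Once both sides land in one schema, the same trending and alerting tools could cover the whole facility.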
Senior news director Barbara Darrow contributed to this story. Mark Fontecchio can be reached at firstname.lastname@example.org.