There are proponents on both sides of the DC vs. AC power debate. Here are a few answers specifically for data...
The global power grid distributes electricity in the form of alternating current (AC) instead of direct current (DC). The choice of AC over DC dates back to the 1800s when Thomas Edison first touted the simplicity of DC, while notables like George Westinghouse and Nikola Tesla supported the use of AC. Since AC proved easier to deliver commercially across great distances using thinner -- and far less expensive -- copper wiring, the industry ultimately adopted AC.
However, AC is not necessarily the most efficient means of delivering power, and delivering DC power directly to data center racks and systems has gained a following as rising energy prices force organizations to watch their power budgets. Let's consider several key issues in DC delivery.
What is direct current in the data center?
The problem with AC is loss. AC leaves a power plant at very high voltages. As that power is carried to cities, towns, neighborhoods and individual buildings, the voltage is stepped down several times using transformers. Even once the AC enters the building at a moderate 600 VAC or 480 VAC, it must be stepped down again to 240 VAC or 120 VAC to feed the rack servers' power supplies, which convert the AC into the several DC voltages that power server components such as processors, memory and disk drives.
The AC-to-DC conversion is not perfect, and a certain amount of power is lost at each conversion stage. But you pay for all of the electricity that enters your facility whether it's put to use or not, so those conversion losses cost the business money. DC proponents suggest that a single conversion from AC to DC would eliminate much of this loss and be far more efficient. The resulting DC would then be distributed to racks and systems throughout the data center, displacing traditional AC power cabling and subsystems.
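The arithmetic behind this argument is simple: conversion stages in series multiply, so every extra stage compounds the loss. The sketch below illustrates the idea with assumed, round-number stage efficiencies (the specific figures and the exact set of stages are hypothetical, not measurements from the article).

```python
# Illustrative comparison of cascaded conversion losses.
# All stage efficiencies below are assumed round numbers, not measured values.

def chain_efficiency(stage_efficiencies):
    """Overall efficiency of conversions in series is the product of the stages."""
    eff = 1.0
    for e in stage_efficiencies:
        eff *= e
    return eff

# Hypothetical AC path: double-conversion UPS -> PDU transformer -> server PSU (AC-to-DC)
ac_path = [0.94, 0.98, 0.90]
# Hypothetical DC path: one bulk AC-to-DC rectifier -> rack-level DC-to-DC regulation
dc_path = [0.96, 0.95]

print(f"AC path efficiency: {chain_efficiency(ac_path):.1%}")  # ~82.9%
print(f"DC path efficiency: {chain_efficiency(dc_path):.1%}")  # ~91.2%
```

With these assumed numbers, removing one conversion stage recovers several percentage points of efficiency, which is the core of the DC proponents' case.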
In the DC vs. AC power debate, what benefits should I look for?
The general benefits are efficiency and cost savings. The concept is straightforward: you save money by eliminating points where power is lost during conversions. Lawrence Berkeley National Laboratory in California performed a demonstration back in 2006 that compared AC and DC power in the data center. The laboratory claimed data centers could save up to 20 percent on power costs using DC distribution.
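To put that 20 percent figure in perspective, a rough back-of-the-envelope calculation helps. The facility load and electricity rate below are assumptions chosen for illustration; only the 20 percent upper bound comes from the LBNL claim above.

```python
# Rough annual-savings arithmetic for the "up to 20 percent" claim.
# The facility load and electricity rate are assumptions for illustration.

it_load_kw = 1_000        # hypothetical 1 MW continuous IT load
rate_per_kwh = 0.10       # assumed $0.10 per kWh
savings_fraction = 0.20   # upper bound cited by the 2006 LBNL demonstration

annual_kwh = it_load_kw * 24 * 365          # 8,760,000 kWh per year
annual_cost = annual_kwh * rate_per_kwh     # $876,000 per year
annual_savings = annual_cost * savings_fraction

print(f"Annual power cost: ${annual_cost:,.0f}")       # $876,000
print(f"Potential savings at 20%: ${annual_savings:,.0f}")  # $175,200
```

Even at a fraction of the claimed savings, the absolute dollar figures scale directly with facility size, which is why the benefit concentrates in the largest installations.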
In addition, power supplies in individual servers and other hardware systems could essentially be eliminated, since power would already arrive at the rack in DC form and need only be regulated down to lower voltages as needed. This would also remove the need for redundant power supplies along with their noisy and failure-prone cooling fans.
The actual amount of savings through DC power distribution remains a matter of debate, and later testing performed by other groups -- such as The Green Grid -- questioned the ultimate difference between AC- and DC-powered data centers. For example, The Green Grid's report concludes that there are no significant differences between power distribution approaches -- mainly because no single AC or DC configuration is more efficient under every possible load condition and because servers and electrical distribution equipment are constantly getting more efficient with each new generation.
Still, when savings are realized, the actual benefit will be greatest for the largest data center operators handling multimegawatt installations. Today, giants such as Google and Swiss hosting company green.ch are among the organizations to deploy DC-powered data centers.
What equipment or changes would be needed to support DC power in my data center?
One of the biggest obstacles to DC adoption is that the technology is highly disruptive. It's not simply a matter of switching the electricity from AC to DC. A DC-driven data center would require an entirely different electrical distribution system and wiring to the racks. The electrical distribution would also need to integrate on-site generators so that backup generator power would be converted to DC for the facility.
And the renovations go right to the servers and systems. Existing servers and other hardware systems cannot be retrofitted for DC, so an entirely different suite of hardware would be needed. Uninterruptible power supply systems, which depend on AC-to-DC conversion to charge internal batteries and on inverters to convert DC back to AC, would need to be replaced with DC-only units. Organizations will typically wait for a new data center build before deploying DC power.
With rising energy costs and the demands of constant business availability, determining who wins the DC vs. AC power struggle is important. In the end, replacing the traditional AC power distribution scheme with a DC power infrastructure can reduce energy losses by eliminating extra power conversions. When properly deployed, this can save money on the monthly power bill and simplify equipment by removing separate power supplies, lowering equipment cost and improving reliability.