
Micro data centers keep businesses competitive from next door

A fleet of micro data centers can overcome the latency and resiliency limitations of a single centralized data center. Is the edge model right for your enterprise?

Decentralized data centers can improve application performance and system availability by placing resources close to users.

The debate over whether to concentrate data center resources or spread them out in micro data centers isn't new -- and it isn't ending anytime soon. Many large enterprises consolidate their remote data centers into one or a few central locations, but a distributed group of data centers can offer better performance, increased reliability and lower costs.

Businesses opt for edge computing -- smaller data centers located close to end-user populations -- to improve speed. Employees and consumers are often scattered geographically, but all need quick access to data. When information is stored closer to the user, it is delivered faster. For example, a retailer with an online store wants to serve its website from locations close to its customers in the U.S., Russia and Brazil.
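The speed argument comes down to physics: distance alone sets a floor on response time. The sketch below is a rough, back-of-the-envelope illustration (not from the article) of that floor, assuming fiber carries signals at roughly 200,000 km per second and ignoring routing, queuing and server processing delays; the site names and distances are hypothetical.

```python
# Rough illustration: distance sets a minimum round-trip time,
# before any routing, queuing or server processing is added.

FIBER_SPEED_KM_PER_MS = 200.0  # ~2/3 the speed of light in a vacuum (assumed)

def min_round_trip_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time over the given fiber distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# Hypothetical distances from a user in Sao Paulo to alternative serving sites.
sites = {
    "central data center (Virginia)": 7600,   # km, approximate
    "regional edge site (Sao Paulo)": 50,     # km, approximate
}

for name, km in sites.items():
    print(f"{name}: >= {min_round_trip_ms(km):.1f} ms round trip")
```

Even this best-case math shows tens of milliseconds saved per request when content is served nearby, and real-world routing overhead widens the gap.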

Bandwidth-intensive information, such as streaming content, medical images or complex architectural diagrams, can slow network response time and diminish productivity. Moving that data closer to the user alleviates those problems. For example, a simulation-rendering firm might locate data centers within each of its branch offices across the country to host valuable, bandwidth-hogging data instead of centralizing it at headquarters.

Go small or go home

Rack-sized micro data center units meet niche use cases: remote offices, mobile deployments, and temporary IT capacity for the military and disaster relief. Historically, many companies were wary of deploying these new and unusual IT systems in small enclosures or outdoors, preferring the typical large, indoor, tightly controlled data center. But vendors are shifting system design in some cases to attract more users.

In the data center server market, vendors now compete to go smaller and more mobile. Established vendors such as Dell, HP and IBM have developed smaller, lighter, more mobile servers. In addition, suppliers such as Elliptical Mobile Solutions, Schneider Electric and Silicon Graphics International have taken the competition to a new level with self-contained micro data centers.

You've got data centers

AOL Inc. embraces edge computing. After acquiring content-producing websites -- such as The Huffington Post, Engadget and Patch -- it needed a flexible network of data centers to spin up server capacity quickly when any particular piece of content spiked in popularity.

The company deployed rack-sized micro data centers to reshape its IT infrastructure from a large central system to small, unmanned IT facilities managed remotely. The new high-density enclosures house servers with thousands of virtual machines that are core to AOL's content distribution network and allow it to quickly roll out new IT capacity.

Deploying equipment in colocation cages is a somewhat comparable approach for traditional large enterprises chasing the same decentralized, flexible data center back end as AOL. With colocation, the vendor's staff typically oversees the infrastructure, with varying degrees of remote management and administrator control. With the micro data center approach, in contrast, the enterprise is the first line of maintenance.

How data center approaches differ

Economics affect the move to remote data centers.

Businesses that deploy hundreds to thousands of servers in one location gain economies of scale in purchasing and operational expenses. But as data centers grow, the connections among systems become more complicated. While hardware costs decrease, the ancillary areas that make a data center work become more expensive.

Larger, more complex data centers require more sophisticated, expensive management and troubleshooting tools. They also take up more physical space, which contributes to higher real estate costs. However, distributing IT resources to many locations increases management challenges; companies need to track all that infrastructure and interconnect it. As the number of remote sites grows, the task of pinpointing problems multiplies.

Downtime is another driver toward decentralized systems. Many firms cannot afford to be offline. A centralized data center is a single point of failure, even with redundant systems in the racks, redundant power feeds to the site and other location-specific safeguards. If the main system goes offline, the entire business sits idle.

Too often, enterprises have only rudimentary backup and recovery systems in place: they run a main computer system and keep a backup nearby for disaster recovery. A severe storm or similar disaster that hits the area could take out both the main site and the backup.

Distributed data center locations provide more reliability because main processing tasks can be done in several sites rather than one. Virtualization plays a key role in these edge contingency capabilities. IT organizations can move a virtualized process off a failing server to any of a series of servers in remote facilities. A glitch in one machine does not hinder the workload. Faulty systems are isolated and repaired without downtime for the whole environment.
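The article does not name a specific tool for this, but the idea can be sketched with the libvirt Python bindings, one common virtualization API. The host names, URIs and VM name below are hypothetical; treat this as a minimal illustration of live-migrating a workload off a suspect host, not a production evacuation procedure.

```python
# Minimal sketch: live-migrate a running VM off a failing edge host
# to a healthy peer, using libvirt (assumed tooling, not from the article).
import libvirt

def evacuate_vm(vm_name: str, source_uri: str, dest_uri: str) -> None:
    """Move a running VM from a suspect host to a remote healthy host."""
    src = libvirt.open(source_uri)
    dst = libvirt.open(dest_uri)
    try:
        dom = src.lookupByName(vm_name)
        if dom.isActive():
            # VIR_MIGRATE_LIVE keeps the guest running during the move,
            # so the workload sees little or no downtime while the
            # faulty host is isolated and repaired.
            dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)
    finally:
        src.close()
        dst.close()

# Example: move "web01" from one edge site to another (hypothetical hosts).
evacuate_vm("web01",
            "qemu+ssh://edge-host-a.example.com/system",
            "qemu+ssh://edge-host-b.example.com/system")
```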

The most common cause of unplanned data center outages is uninterruptible power supply equipment failure, according to Ponemon Institute LLC. As central systems become larger and more complex, the likelihood of trouble increases. Small, distributed data centers encounter such problems less frequently.

During its lifetime, a server costs a business more in energy bills than in sticker price, according to International Data Corp. Data centers, especially large centralized ones, can easily strain some local energy grids. Distributed data centers spread energy use among several locations and have less effect on any one grid.
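A quick back-of-the-envelope calculation shows how that claim can hold. The figures below are assumptions for illustration (the article cites IDC but gives no numbers): a mid-range server drawing power around the clock for several years, plus the cooling overhead a facility adds on top of the IT load.

```python
# Illustrative arithmetic only; all figures are assumed, not sourced.
purchase_price = 3000.0   # USD, assumed sticker price
avg_power_watts = 350.0   # assumed average draw, including idle time
years_in_service = 5
electricity_rate = 0.12   # USD per kWh, assumed
pue = 1.8                 # assumed power usage effectiveness (cooling, power losses)

kwh = avg_power_watts / 1000 * 24 * 365 * years_in_service * pue
energy_cost = kwh * electricity_rate

print(f"Assumed lifetime energy cost: ${energy_cost:,.0f} "
      f"vs. purchase price ${purchase_price:,.0f}")
```

Under these assumptions the energy bill comes to roughly $3,300 over five years, edging past the $3,000 purchase price; different rates or utilization change the numbers but not the general pattern.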

A large data center can also pressure the local water supply. In Prineville, Ore., Apple Inc. worked with the city to tap into an underground stream for the water needed to cool its 500,000-square-foot data center. Remote sites may also use water for cooling, but consumption per site is significantly lower.

About the author:
Paul Korzeniowski is a freelance writer who specializes in data center issues. He has been writing about technology for two decades, is based in Sudbury, MA and can be reached at paulkorzen@aol.com.

Next Steps

Check out the new generation of modular data centers and containers and look inside to see how the data center is built.

Then read about how businesses are using these designs in real-world deployments.

Finally, take this quiz to see if containers are the right fit for your expansion project.

This was last published in February 2015
