Tell me about Equinix's expansion plans.
We're going to spend more than $400 million on announced expansions. The centers we're building today can accommodate more than double the power and cooling requirements of our original centers. That's really driven by customers with more blades, more 1U boxes, stacking them up, and higher processing per watt. All of those things drive much larger power and cooling requirements, and that's what we're building to.
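To put rough numbers on that power-and-cooling point, here is a minimal back-of-the-envelope sketch. The per-server and per-chassis wattages and the rack size are illustrative assumptions, not Equinix figures.

```python
# Rough per-rack power arithmetic behind the "double the power and cooling" point.
# All wattages below are illustrative assumptions, not Equinix figures.

WATTS_PER_1U_SERVER = 350        # assumed draw of a typical 1U box
WATTS_PER_BLADE_CHASSIS = 5000   # assumed draw of a fully loaded 7U blade chassis
USABLE_U_PER_RACK = 42           # standard full-height rack

def rack_load_kw(units_used: int, watts_per_unit: int, unit_height_u: int = 1) -> float:
    """Total electrical load of a rack in kW; nearly all of it ends up as heat to cool."""
    fitted = min(units_used, USABLE_U_PER_RACK // unit_height_u)
    return fitted * watts_per_unit / 1000.0

# A rack packed with 1U servers vs. a rack of 7U blade chassis:
print(rack_load_kw(42, WATTS_PER_1U_SERVER))          # ~14.7 kW
print(rack_load_kw(6, WATTS_PER_BLADE_CHASSIS, 7))    # ~30.0 kW
```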
What is driving the demand?
Disaster recovery needs are driving it for sure. Second, based on all the new boxes they're buying, power and cooling is a big issue. As they're bringing home their new blade servers from HP, they're trying to put them in the corporate data centers and their guys are saying, 'Hey, we can't handle it. We can maybe power it, but we can't cool it.' The third thing that's driving it is IP-enabled applications. As more and more companies start to move things to IP, they want a place where they can reach multiple networks all in one place.
What does a typical Equinix data center look like?
It won't look like any data center you've been in. There are no raised floors, there are no low ceilings. There aren't these hideous, bright lights around. It's a very different design to handle the high power density needs of customers today. There is tons of security. There are five levels of biometric hand readers from the front door into somebody's cage. There are 30- to 40-foot high ceilings to handle the installations. We use overhead cooling technology, not this underfloor perforated tile stuff. It just does not work in today's environment to really cool high power density equipment.
What else do you do for security?
The whole feeling of an Equinix center is different because we have so many competitors in there, right next to each other. That's why when you go in, it's dark -- because AT&T doesn't want Sprint looking into their center and saying, 'What kind of equipment do you have? How much capacity do you have?' Once a customer walks in and puts their hand in at their cage, only the light in their cage comes on. When they leave, that light goes off.
Unless a customer wants you to know they're there, you won't know who you're walking by unless I'm telling you and they're OK with me telling you. If you went down to the New Jersey center now, there are 1,200 cameras. There's one on every cage. They're all digital and archived for customers. There are no keys, no proxy cards, no nothing. If something happens in a customer's cage, they want to know whose hand was in there [at a given] time stamp. It's not that they gave their key to somebody, and somebody who got fired yesterday walked in and walked out with $3,000 worth of servers. It's locked down pretty tight.
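As a rough illustration of the audit capability he describes (answering whose hand was in a cage at a given time stamp), here is a minimal sketch; the event format, cage IDs, and names are hypothetical, not Equinix's actual access-control system.

```python
# Minimal sketch of the audit idea described above: given a cage and a time window,
# report whose biometric hand-scan opened it. The event format and data are hypothetical.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AccessEvent:
    cage_id: str
    person: str          # identity bound to the hand geometry, not a key or card
    scanned_at: datetime

def who_was_in_cage(events: list[AccessEvent], cage_id: str,
                    start: datetime, end: datetime) -> list[AccessEvent]:
    """Return every hand-scan for the cage inside the time window."""
    return [e for e in events if e.cage_id == cage_id and start <= e.scanned_at <= end]

log = [
    AccessEvent("NJ-0417", "j.doe", datetime(2007, 3, 1, 14, 2)),
    AccessEvent("NJ-0417", "a.lee", datetime(2007, 3, 1, 22, 40)),
]
print(who_was_in_cage(log, "NJ-0417",
                      datetime(2007, 3, 1, 20, 0), datetime(2007, 3, 2, 0, 0)))
```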
What do you use for overhead cooling?
We use chilled water cooling outside of the colo floor. What you'll see when you go to a center are these big vents. You could literally drive a Volkswagen Beetle through them, that's how big they are. That's the cooling that forces the cool air across the data center floor. We do all hot aisle, cold aisle. We work with customers on the thermal dynamics of how to position their equipment. That's why we have high ceilings, so the heat rises quickly and we can get it up and out of the center.
We can provide customers with computational thermal dynamics. It's actually a weather chart of a customer's cage, so it shows where the cold spots are and where the hot spots are, so that we can adjust for it. It looks like a little weather map inside a customer's cage. So that's the stuff that we get involved with to make sure they're laying their stuff out right.
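The "weather map" idea lends itself to a small illustration. The grid below is a toy sketch: the temperature readings and hot/cold thresholds are invented for the example, and a real survey would come from sensors or computational modeling rather than hard-coded values.

```python
# A toy version of the cage "weather map": a grid of temperature readings across
# a cage, flagged as hot or cold spots. Readings and thresholds are invented.

HOT_C, COLD_C = 27.0, 20.0   # assumed comfort band for intake air, in Celsius

readings = [            # rows = front-to-back positions, cols = rack positions
    [21.5, 22.0, 26.0, 29.5],
    [20.5, 21.0, 24.5, 28.0],
    [19.0, 19.5, 22.0, 25.5],
]

def label(temp_c: float) -> str:
    if temp_c >= HOT_C:
        return "HOT "
    if temp_c <= COLD_C:
        return "COLD"
    return " ok "

for row in readings:
    print(" ".join(f"{t:5.1f} {label(t)}" for t in row))
```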
These are obviously some big, high-powered data centers. How do you get power to them?
Our site in New Jersey, for example -- the new one we're building right now -- is going to have 32 megawatts of power. It will be second only to Newark Airport in the state of New Jersey.
When we're going out and looking for a new site, the first conversation we're having is with the utility. In Chicago, for example, we worked very closely with the power companies. We paid the utility $8 million to build a new substation for us. You've got to have multiple substations coming in on multiple sides of the building. The amount of redundancy you want on the utility side is significant.
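To illustrate why redundant utility feeds matter, here is a quick availability calculation; the 99.9% per-feed figure is an assumption for the example, not a utility's published number, and it treats feed failures as independent.

```python
# Back-of-the-envelope case for multiple independent utility feeds.
# The per-feed availability is an assumption, and failures are assumed independent,
# which is what bringing feeds in from separate substations tries to achieve.

FEED_AVAILABILITY = 0.999   # assume each substation feed is up 99.9% of the time

def combined_availability(n_feeds: int, per_feed: float = FEED_AVAILABILITY) -> float:
    """Probability that at least one independent feed is up."""
    return 1 - (1 - per_feed) ** n_feeds

for n in (1, 2, 3):
    downtime_hours = (1 - combined_availability(n)) * 8760
    print(f"{n} feed(s): ~{downtime_hours:.2f} hours of utility downtime per year")
```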
You said the facilities had five levels of biometrics. What about the weatherproofing of your facility?
When a customer comes in, these are hard-core data center guys. You have to be above a 100-year flood plain, period, and you have to be able to prove it. If you're in Silicon Valley, you're built to withstand earthquakes at the same level as hospitals. That's how we build. If you ever go into a Silicon Valley center, you'll see giant, blue steel reinforcements all over the center. There is a significant amount of weatherproofing.