Despite recent news articles regarding energy waste in the data center industry, there are plenty of designs that stand out for their innovation. A new book, The Art of the Data Center, highlights these gems and provides interviews with some of the key players who produced the designs.
The Art of the Data Center is published by Prentice Hall and is available now.
I'm Erin Watkins and joining me on the phone is Douglas Alger, author of The Art of the Data Center.
So, Doug, could you tell us a little bit about your background to get started?
Doug Alger: I am an IT architect at Cisco Systems. I've been in the data center industry for about 15 years. Started off doing operations sort of work, helping people install gear into their cabinets. Then moved into doing design work in data centers, helping people create these rooms. My background, though, goes back before that. I worked for years as a newspaper reporter, so I'm always one who's interested in getting to talk to people and finding out the stories behind things, which is a big piece of what led me to actually write this book.
Would you say that's what inspired you to make the book?
Alger: What actually inspired me was going back a few years to when I was working on a different book called Grow a Greener Data Center, which looked at how people can build rooms that are energy efficient and limit their environmental impact. While I was in the midst of doing that, I came across a data center that the ISP Bahnhof had built in Stockholm, Sweden. If you looked at a couple of the articles about it, you saw pictures of artificial waterfalls, submarine engine generators, fog in their command center -- all built in an underground nuclear bunker. And I thought, "What a really cool place!"
But even more than that, what were the decisions that had to be made to come up with this? I just love the idea that somebody at some point said, "You know what this data center needs? Waterfalls." So I was really interested in it, but that wasn't the focus of the book, and I was doing research around geothermal cooling, which was what led me to some underground spaces.
I put that on the shelf and continued to work on the other book, but that was really where I started thinking about how people are making these interesting data center innovations. It's not all about having enough electrical capacity or computing power. People are doing really interesting things in these rooms.
Which of the data centers did you get to visit in person, and which ones were most interesting to you?
Alger: Sadly, I did not get to visit the majority of these data centers. Almost all of them I had to do remotely, but part of the fun of that was that I got to talk to the different people designing them and say, "Pretend you were taking me or someone else on a tour of this room. Tell me about it. Tell me what's interesting, and send me some images," and take it from there. There was also an element of "impress me" with it, to justify the fact that these are truly innovative data centers.
In terms of the one that's most interesting, all of them to a certain degree are interesting or they wouldn't have made their way into the book, but I suppose special mention needs to go to Bahnhof, because that's the first one that pulled my attention.
Another one I thought was fascinating is a supercomputing center in Barcelona, Spain, that they built in a 1920s chapel. I had actually come upon a picture of it several years ago. I confess that when I first saw it I thought someone had Photoshopped it, because it shows this gorgeous chapel and you can see the elaborate stained glass windows and a beautiful space with what looks like this big glass box with all of the data center gear in it -- all of the computing hardware in the cabinets. And I thought, "Wait, that can't be real." But, sure enough, it is!
There's another facility -- the Calcul Québec facility -- where they took this old, concrete silo that had housed a linear accelerator, and they installed their systems in there. So how do you build this sort of thing in a round, vertical building and control all the venting? Really, any of these rooms where I can imagine people getting together around a table and saying, "Here's an idea; why don't we build it this way?" -- those are really the ones that caught my attention.
In these data centers, what were some common trends you saw among them? And what about larger trends in the industry overall?
Alger: Power density is going up -- probably not a big surprise to anyone who's involved with the data center industry. The good news is that we have more powerful computing systems that we can fit into a given footprint, which then creates challenges in how to power and cool them.
Liquid cooling is coming back to data centers. Several people I spoke with talked about either installing it to begin with or having some capacity to make adjustments later on to bring in liquid cooling to be able to handle that growing density.
Raised floors seem to be alive and well. For a while the trend seemed to be moving away from those, but some people are still finding value in the flexibility that those provide.
Hardware doesn't seem to be getting coddled as much as it was in the past. People are running systems at higher and higher temperatures to reduce their energy consumption and lower those costs.
Virtualization is big -- letting people do more with fewer resources. And finally, the big bow that goes on top is that data centers can be green. This doesn't mean they won't consume anything, because they will. But people are finding ways to optimize everything from the back-of-house systems -- what's happening with power and cooling -- to the hardware that they're choosing, and so they're able to get a lot more productive work done while consuming fewer resources.
We've seen a little bit of pushback on the temperature issue at least. Do you know why people would push back?
Alger: The big driver for people wanting to bump up temperatures in the hardware is the energy savings from not trying to cool a box to 68 degrees. When you're packing more and more systems into the same footprint, it gets really challenging, and the whole assumption behind why we're trying to keep these systems so cool is a reliability issue.
If I can take my laptop and walk around with it and put it in all these really crazy conditions -- turn it on, turn it off, put it in the trunk of my car, take it to crazy places -- and that laptop can withstand that, can't an enterprise-level server?
So as you start to approach it with that different point of view and are willing to abuse the system a little bit more, you don't have to run your cooling system quite so hard.
Because people are going more with virtualization, you no longer have one server performing this critical task for a company; you've created this resource pool with dozens or hundreds of servers, and any task that's going on is running on a virtual machine. If it fails, it's not to say that you don't care, but you care about it a lot less than when it was one critical device. In this case, if you lose one virtual instance, well, you've got plenty of others that can take over and do that same task, so it opens up opportunities. We don't have to wrap the gear in cotton quite so much, and your back-of-house systems don't have to work so hard.
What do you hope readers will take away from your book?
Alger: I would like folks to have a greater appreciation of what data centers are doing, what they can do. For many years, I think they were ignored. They were completely sitting behind the curtain doing what they do -- we didn't pay much attention to them. When they would get some attention, it was either because the power bill was high or there was a problem.
It's a lot like if you go into your house and flip the light switch. If the lights come on, you don't think about it. But if you flip that switch and the lights don't come on, you want to know the problem and how soon it's going to be fixed. Data centers got a lot of that same sort of attention.
But if you take a look at what is being done in these rooms today, people are getting in there and optimizing them -- doing very creative things. It's no longer just a room that you happen to roll some gear into and throw some cooling over in the back corner of the building.
These are really showpieces for technology, and I tell people that even if they can't think of anything they do that's tied to a data center, one is probably having a significant impact on their life.
Every time you write and send an email, go online to buy something, or visit your favorite social networking site -- all of these online activities are facilitated by a data center somewhere. With that increased demand comes increased pressure on infrastructure, so people are out there doing really interesting things to help data centers be more efficient and do more.