An old college buddy called me last week. He works for a manufacturer who sells a product into data centers (among other markets) and he wanted a data center professional's insight into "this whole cloud thing."
It seems a colleague of his is trying to convince his whole company to prepare for the cloud computing paradigm shift, where "everything will exist within about ten huge data centers." I have to admit, I laughed when I heard that. My friend wanted to know what impact cloud computing will have on the future. "Is what this guy's saying really going to happen?"
In my answer, I went over how much of the thinking and hype surrounding cloud computing is built upon fallacies while ignoring the market realities. Let me outline those fallacies here.
Cloud computing fallacy #1: New technology always supersedes old technology
I wish I had a dollar for every time I heard or read something akin to "Everything will move to the cloud." The basis of this statement is a deeply held fallacy in the minds of so many people who follow technology, especially the "pundits" one reads in the trade rags and blogs. Everything? Really?
This fallacy is an extension of an old economic fallacy -- that of the limited market, in which every gain X makes means that Y and Z must lose. For some reason, perhaps people's desire to sort everyone and everything into piles labeled "winner" and "loser," they assume that in any given market there can be only one winner and everything else loses.
So all thinking becomes weighted toward the supposed winner. In reality, markets constantly shift and, for all practical purposes, have no limits. Further, if you have a product or service that satisfies a set of customer needs better than the competition, you will make sales. Not all customers have the same needs, so there is no way that any single technology, service or business model can exclude all others by its mere presence in the marketplace.
Indeed, for everything to move to the cloud, the cloud would have to become the solution to all the needs of all people. This is impossible.
For the new to replace the old seems like the natural order of things -- after all, we no longer drive horse-drawn buggies, right? But one could argue that the modern car is the natural evolution of the buggy, with the internal combustion engine merely replacing the horse itself as the prime motivator. The aircraft did not replace the car, despite the predictions of nearly all pundits in the middle of the 20th century. (Where are our flying cars, anyway?) Nor have many other older technologies vanished; trains and ships still traverse the planet despite much "better" technologies having been developed since their invention. Television hasn't fully replaced radio or movies. Those markets instead have expanded to allow all these technologies to survive in their own niches. Those niches may expand and shrink over time, but the new technology rarely, if ever, completely replaces the old.
Bringing it back to information technology, mainframes are still built and sold despite the perception that they're technological dinosaurs. Why? Because they serve a need that newer technologies cannot meet. If anything, the market for mainframes remains about what it was a decade ago. Sure, the market isn't growing like it was in the 1960s, but in terms of physical units in operation it is likely larger now than it was back then.
Cloud computing, even if it is wildly successful, will not replace the forms of computing we use today. It will only expand the markets and provide new solutions to some old -- but mostly new -- problems. Technologies really bloom and create markets when they solve new problems rather than replacing the solutions for old ones.
Cloud computing fallacy #2: Cloud computing is new technology
Cloud computing is not a new technology, just a new name. At its core it represents no new technology; it is merely a buzzword du jour applied to a collection of older technologies packaged and sold in a new way.
I've heard the term used to describe everything from Amazon's EC2 to Skype, from Gmail to Salesforce.com. I find it hard to believe that these all fall into a single definition. Their sole commonality is that they are services delivered over the Internet.
It seems I'm not the only one with this opinion.
If anything, I would argue that cloud computing isn't really about technology at all, but about a way of provisioning and selling computation.
Before it was called "cloud computing," it was called various names at various times, as the concept iterated itself through history: time sharing, client/server, network computing, thin clients, utility computing, application service provider, grid computing, Software/Platform/Infrastructure as a Service, etc. None of these terms, including "cloud computing," describes any new technology, only ways to deliver or provision existing technology. It boils down to rental rather than purchase, period. If I rent you my car, I cannot claim to have invented the automobile -- or the concept of renting it, either.
Cloud computing fallacy #3: Cloud computing will replace data centers
I've heard this fallacy from many sources. Cloud computing represents no threat to the data center whatsoever. If anything, it will just require more data centers. That answers my friend's worry -- but how did this fallacy originate?
Well, data centers are very expensive. They are very expensive to build and very expensive to operate. As power densities (that is, the number of watts available per square foot within a given data center) go up, so do construction costs. The current average cost to build a data center in the U.S. to modern standards is between $1,500 and $3,000 per square foot.
Compare this to the cost of the average office building at $150-$200 per square foot and you'll understand why CFOs tell their CIO/CTO counterparts to rent rather than buy. And that is just the building part. Once the construction is complete, you have to operate that facility, and that costs money too.
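To make the rent-versus-buy pressure concrete, here is a back-of-the-envelope sketch using the per-square-foot figures above. The 10,000 sq ft facility size is purely an illustrative assumption, not a figure from any real project:

```python
# Build-cost comparison using the per-square-foot ranges cited in the text.
DC_COST_PER_SQFT = (1500, 3000)      # data center, low/high estimate ($/sq ft)
OFFICE_COST_PER_SQFT = (150, 200)    # ordinary office building ($/sq ft)

def build_cost(sqft, per_sqft_range):
    """Return (low, high) construction-cost estimate for a facility of `sqft`."""
    low, high = per_sqft_range
    return sqft * low, sqft * high

sqft = 10_000  # illustrative facility size
dc_low, dc_high = build_cost(sqft, DC_COST_PER_SQFT)
off_low, off_high = build_cost(sqft, OFFICE_COST_PER_SQFT)

print(f"Data center:     ${dc_low:,} - ${dc_high:,}")    # $15,000,000 - $30,000,000
print(f"Office building: ${off_low:,} - ${off_high:,}")  # $1,500,000 - $2,000,000
```

At ten to fifteen times the construction cost of ordinary office space, it is easy to see why the CFO reaches for the rental agreement.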
When you boil down what a data center does, it is pretty simple. A data center is a facility that turns electricity into bits, usually on a grand scale. The byproduct of that industrial process, like so many others, is heat. Power comes in, usually in vast quantities, gets burned up by the silicon and spinning disks and transformed into bits, which exit the facility on the wires to be delivered to you, the consumer of bits. The heat has to be removed by mechanical cooling, because without cooling the computers will fail faster. This very website resides on a server in a data center. That server runs 24 hours a day, and even when nobody is reading these bits, it burns energy and makes heat all day and night. Multiply that by the millions, if not billions, of servers running in data centers around the globe, then layer on the cooling systems needed to remove all that heat, and you'll see why operating these facilities is so costly.
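The electricity-to-heat relationship above can be sketched with standard HVAC conversion figures: essentially every watt an IT load draws becomes heat the cooling plant must remove. The 500 kW room in the example is an illustrative assumption:

```python
# Rough heat-load estimate for a data center room.
BTU_PER_HR_PER_WATT = 3.412   # 1 W of electrical load ~= 3.412 BTU/hr of heat
BTU_PER_HR_PER_TON = 12_000   # 1 ton of refrigeration removes 12,000 BTU/hr

def cooling_tons(it_load_watts):
    """Tons of mechanical cooling needed to remove the heat of an IT load."""
    return it_load_watts * BTU_PER_HR_PER_WATT / BTU_PER_HR_PER_TON

# An assumed 500 kW of servers needs on the order of 142 tons of cooling,
# running 24 hours a day -- and that cooling plant draws power of its own.
print(round(cooling_tons(500_000), 1))  # 142.2
```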
Cloud computing doesn't change data centers or their economics. Cloud computing providers still have to build and run data centers or rent the space from colocation providers. That requires capital expenditure. In order to pencil out economically, the cloud provider has to either charge its customers enough to pay for the build in a reasonable amount of time and cover the monthly operating costs, or oversell its capacity and hope it doesn't bite the business in the ass.
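The "pencil out" math reduces to a simple payback calculation: monthly revenue must cover operating costs and repay the build within a tolerable window. All dollar figures below are illustrative assumptions, not real provider numbers:

```python
# Sketch of the payback arithmetic facing a standalone cloud provider.
def months_to_payback(build_cost, monthly_opex, monthly_revenue):
    """Months until cumulative margin repays the construction cost.

    Returns None if revenue never exceeds operating cost -- in which case the
    provider must raise prices or oversell capacity.
    """
    margin = monthly_revenue - monthly_opex
    if margin <= 0:
        return None
    return build_cost / margin

# e.g. an assumed $15M build, $250k/month opex, $500k/month revenue
print(months_to_payback(15_000_000, 250_000, 500_000))  # 60.0 -> five years
```

A five-year payback on a facility whose gear may be obsolete in three shows why the standalone model is such a rough road.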
This is why I've said the only reasonable current cloud provider business model is Amazon's, which is based on excess capacity. Essentially, the cloud customers contribute to Amazon's data center return on investment while they scale their own operations. It is a brilliant model. But anyone who is starting out as a standalone cloud provider faces a rough road to profitability.
All the world's computing needs cannot be collapsed into those "ten huge data centers" my buddy heard about. The reality is that as industry, business and society use more and more information technology, there will be more and more data centers. They will range in scale from re-purposed broom closets to giant campuses of warehouse-sized facilities. Many organizations have very specific needs that cloud computing may never be able to address, and for those organizations there always has to be the choice of a traditional facility.
Cloud computing fallacy #4: Cloud computing can work for any IT need
There are several IT needs that cannot be solved with cloud computing. Meeting audit requirements is one. I've written about this fallacy before, and it caused a bit of an uproar. It seemed to be the first time anyone brought this issue up, and it became a hot topic in the cloud blogosphere for a short time. I felt vindicated when a cloud provider admitted what I said was true.
The basis of cloud computing is the same as that of Web hosting -- your data on somebody else's servers. The same reasons people choose not to use a Web host apply to a cloud provider: control of assets, risks associated with overselling capacity, support concerns, interoperability concerns. There are literally hundreds, if not thousands, of reasons why IT organizations and individuals would prefer to keep their data out of a public cloud computing system. Most come down to a single word: trust. That brings us to our last fallacy.
Cloud computing fallacy #5: The cloud is secure
Cloud computing is no more secure than any other form of computing, which is to say, not very. Or perhaps more accurately, as secure as it is designed and managed to be. For an excellent analysis of data security in a cloud environment, I highly suggest reading Rich Mogull's thoughts on the subject.
Rich obsesses about all things security and does a far better job than I could in delving into the specifics. To his analysis, however, I'll add a more encompassing and less data-specific view of security that is more about trust and consequences than the integrity of the data itself.
The greatest hurdle to the widespread acceptance of cloud computing is trust. Trusting one's data to systems whose location, condition, environment and state of load are virtually unknown is a difficult thing to do. Many of these questions apply to any services (hosting, colocation, Software as a Service, etc.) purchased online, but the "cloudy" nature of cloud computing amplifies many of them beyond simple answers found in other scenarios.
- How well is that data protected?
- How stable is the company that owns the infrastructure?
- Is the data center owned and maintained by the cloud provider or is it colocated in some other company's facility?
- If the latter, is the cloud provider keeping current on all its bills, or is its installation subject to suspension by the colo provider?
- What about bandwidth?
- Is the cloud provider multi-homed?
- Does the provider have geographic redundancy?
- What happens when the power goes out?
- Has the provider tested its generator(s)? How well are its power backup, network and HVAC systems maintained?
- Is there anyone on-site if something goes wrong?
- What sort of service-level agreements does the provider have?
- What happens to our data if the cloud provider goes out of business?
- What sort of security is in place to monitor customers?
- What happens if somebody else on the system(s) we're using is a spammer?
- How is blacklisting handled?
- Could AUP violations by other customers impact our operations?
- Are assigned IPs SWIPed to customers or does everything track back to the cloud provider?
- How is DoS traffic filtered, or are we going to get billed for it?
- What happens to our data when we scale back usage or cancel our service?
I could go on and on.
Many of these issues resolve to a single question: how much can you trust your cloud provider? Trust takes a long time to build. Most IT involves fairly critical corporate data and infrastructure, so it may be some time before enough trust builds up to move much of this sort of data to cloud deployments. Trust can also evaporate almost instantly, so all it will take is a single high-profile cloud-related failure to put all cloud business at risk.
Now, it may seem that I'm somehow anti-cloud. Nothing could be further from the truth. Cloud computing is a sensible method for provisioning computing resources on demand, and it fills a very real market niche. I just do not believe that it is the answer to every IT problem, nor is it the sole future of IT -- only a small portion of it. Cloud computing will expand the market. I can envision a very near future where companies use a hybrid of traditional dedicated data center resources and cloud deployments to extend, replicate or expand as demand warrants. The cloud is indeed a new paradigm, but it lacks the underlying "shift" that alters an entire industry. The pundits should sheathe their hyperbole and focus on what cloud computing can do for people rather than what it will do to the marketplace.
ABOUT THE AUTHOR: Chuck Goolsbee is an executive at colocation provider digital.forest in Seattle, Wash. He has achieved notoriety blogging about the obsolescence of raised floor in the data center and for threatening to gas server designers from Dell with FM200. In his spare time, he enjoys wrangling geeks and tuning SU Carburettors.
What did you think of this feature? Write to SearchDataCenter.com's Matt Stansberry about your data center concerns at email@example.com.