Is energy-efficient software the next step to reduce operating costs?

When the limits of hardware innovation have been reached, energy-efficient software could be a way to cut electricity costs.

The next data center innovation must come by way of energy-efficient software.

Hardware manufacturers, data center designers, energy-efficiency standards groups and people in many other industries are all working to develop products with higher energy efficiency. Those designs may take shape in the data center sooner than we expect; before long, nearly every facility could be cooled entirely with outside air, in virtually any climate. Eliminating mechanical refrigeration yields major energy savings.

How do we further reduce consumption and improve energy efficiency, when we've gone as far as we can with computing hardware and facility infrastructure? The only answer -- outside of some unforeseen breakthrough in digital electronics -- is slimming down software for better efficiency.

The thinning of software unfortunately won't happen unless software developers recognize its necessity and importance. It's also a wake-up call to the IT community, which has to start thinking differently about how it evaluates and buys software if it's going to push developers in the right direction.

Look to the past for software innovation

We all know -- as do the program developers -- that software has become bloated. In the early days of computers, memory was very expensive and processors were slow. Code had to be written tightly to run in any reasonable length of time on mainframes with only 64 KB of memory.

Why so little? Many younger IT people today have never seen magnetic core memory and don't realize why even big mainframes had such limited amounts. Core memory used tiny magnetic donuts, about the diameter of pencil lead, strung on fine wires running in three directions -- one magnet per data bit. So 64 KB of memory required roughly 512,000 tiny magnets (64,000 bytes at 8 bits each), all strung by hand onto wires inside a cube.

I once saw one of the few 1 MB core memories in existence: a 10-foot cube surrounded by a chain-link fence that served 14 big mainframes. Back in 1985, it cost $1 million -- just think how that compares to the multi-GB memory sticks we carry in our pockets today.

In the 1980s, both operating systems and applications had to be written with a minimum number of instructions to be viable. In short, code back then was highly efficient, and both programs and machines were benchmarked for speed before purchase.

Programmers used to count the cycles each instruction used to make sure the program could run in a realistic period. One infamous government team failed to do this, and after a three-year development effort, a program meant to run daily took 28 hours to complete. As a result, the industry developed programs that looked ahead and transferred data from tape or disc exactly when it was needed. Such techniques were needed to get maximum use out of expensive memory resources. That has changed.
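That look-ahead technique survives today as prefetching, or double buffering: fetch the next block of data on a background thread while the current block is being processed, so the processor never sits idle waiting on slow storage. A minimal sketch in Python -- the chunked `read_chunk` source and its sizes are hypothetical stand-ins for a tape or disc read:

```python
import threading
from queue import Queue

def read_chunk(i):
    # Stand-in for a slow tape/disc read; returns one block of data.
    return list(range(i * 4, i * 4 + 4))

def prefetching_reader(num_chunks, depth=1):
    """Yield chunks while a background thread reads ahead."""
    q = Queue(maxsize=depth)

    def producer():
        for i in range(num_chunks):
            q.put(read_chunk(i))  # blocks once 'depth' chunks are waiting
        q.put(None)               # sentinel: no more data

    threading.Thread(target=producer, daemon=True).start()
    while (chunk := q.get()) is not None:
        yield chunk

# Process four chunks (the numbers 0..15) while reads happen in the background.
total = sum(sum(chunk) for chunk in prefetching_reader(4))
print(total)  # 120
```

The `depth` parameter bounds how far the reader runs ahead, which is exactly the old discipline of spending scarce memory only where it buys something.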

The development of cheap memory and high processor clock speeds took the handcuffs off software developers. The result has been an explosion of capabilities that weren't even dreamed of 40 years ago, and that's not necessarily a bad thing.

However, it has also created many trillions of lines of code, written by large teams of programmers and then patched again and again to fix flaws or plug security holes. The software bloat is no surprise, but it's as much our fault -- we demand ever more capabilities and faster rollout rates -- as it is that of the program writers who, for years, gave little consideration to complexity and run times. Hardware developers have enabled it by proving that Moore's Law still holds true.

Regardless of what hardware we're running, every instruction ends up inside the computer as binary numbers that use machine cycles -- and therefore energy -- to process.
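A back-of-envelope calculation makes the point concrete. The energy-per-instruction figure below is an assumption for illustration only -- on the order of a nanojoule for a conventional server processor; real values vary widely by architecture and workload:

```python
ENERGY_PER_INSTRUCTION_J = 1e-9  # assumed ~1 nJ per instruction; illustrative only

def energy_kwh(instructions):
    """Energy to execute a given instruction count, in kilowatt-hours."""
    joules = instructions * ENERGY_PER_INSTRUCTION_J
    return joules / 3.6e6  # 1 kWh = 3.6 million joules

# One day of a busy core retiring ~1 billion instructions per second:
per_day = energy_kwh(1e9 * 86_400)
print(f"{per_day:.3f} kWh")  # 0.024 kWh
```

A fraction of a kilowatt-hour per core sounds small, but multiplied across thousands of servers -- plus the cooling overhead for every watt dissipated -- every instruction trimmed from a hot code path is energy saved.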

Bloated software has to go

So why is software "the final frontier" in gaining energy efficiency? When we reach the limit of what hardware and infrastructure can do, the only thing likely to achieve further improvement is streamlined software.

Enormous progress continues to be made in reducing hardware energy consumption and increasing power and cooling efficiency, but it can only go so far. A new standard for minimum data center energy efficiency is in development, and guidelines have been published for hardware that can run continuously in virtually any climate without mechanical refrigeration -- using just outside air for cooling. Once those benefits are tapped, we have to look to energy-efficient software.

Tiny signs of change are already here. For example, newer versions of a few well-known programs have touted reduced storage requirements and faster processing alongside the usual list of version enhancements, but that's a small start.

There was a time when an important consideration in selecting application software was how few keystrokes or mouse clicks an operation required. Now, every program has to have more features and be "all things to all people" -- even though the vast majority of users exercise only a small fraction of the capabilities. Most programs like this are written in modules.

If only we could easily turn off or remove the modules we have no use for. But even configuring basic features on most software today requires drilling down through multiple layers of menus. One company even turned the energy-saving features on its servers into a special activation because they were otherwise so hard to find.
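The module switch-off described above can be approximated in software today with lazy loading: a feature's code is imported -- and its memory and initialization cost paid -- only if the feature is actually enabled and actually used. A minimal sketch, where the feature-flag dictionary is a hypothetical settings file and standard-library module names stand in for a product's optional feature modules:

```python
import importlib

# Hypothetical feature flags; stdlib module names stand in for
# the product's optional feature modules.
ENABLED_FEATURES = {"json": True, "xmlrpc.server": False}

_loaded = {}

def feature(name):
    """Return a feature module, importing it only on first use, or
    None if the feature is switched off (its code is never loaded)."""
    if not ENABLED_FEATURES.get(name, False):
        return None
    if name not in _loaded:
        _loaded[name] = importlib.import_module(name)
    return _loaded[name]

print(feature("json").dumps({"ok": True}))  # enabled: imported on demand
print(feature("xmlrpc.server"))             # disabled: None, never imported
```

Disabled modules are never imported at all, so their startup work and memory footprint simply never happen -- the software equivalent of powering down an unused rack.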

As we've seen with legacy COBOL programs, you can't replace millions of lines of code overnight -- or even, sometimes, over decades. So, just as the hardware side of the business has been working diligently to improve energy efficiency, the software side must learn to recognize the effect its products have on data center energy usage and mount a more aggressive attack on bloated, inefficient code. It will take a long time for such an effort to have a significant effect, but if new programs start to treat this as important, the results will follow in time.

Some software needs to be robust, but it would be surprising if anything -- even robust programs -- written today could not be streamlined while retaining speed and functionality. Like it or not, every machine cycle takes energy, even on highly efficient processors. Memory draws power as well -- especially spinning disc storage. So if we truly want to maximize the energy efficiency of our data centers -- and reduce consumption by our millions of personal computers as well -- the industry will need to address how it produces software and return to the mindset of the days when both memory and processing speed were precious commodities, not to be wasted.

About the author:
Robert McFarlane is a principal in charge of data center design for the international consulting firm Shen Milsom and Wilke LLC. McFarlane has spent more than 35 years in communications consulting, has experience in every segment of the data center industry and was a pioneer in developing the field of building cable design. McFarlane also teaches the data center facilities course in the Marist College Institute for Data Center Professionals program, is a data center power and cooling expert, is widely published, speaks at many industry seminars and is a corresponding member of ASHRAE TC9.9, which publishes a wide range of industry guidelines.

This was last published in April 2013


Join the conversation





How do you think the market for energy-efficient software will evolve?
I have been writing code since 1962. There is no reason why modern compilers and interpreters cannot generate more efficient code, but I think that any significant efficiency gains are still in the hands of the programmers, not the tools.
Software efficiency vital for reducing GHG emission in future
Small-scale operations probably won't mind spending a few more watts to avoid software complexity.
The greatest opportunity resides in increasing hardware utilization, so the benefit software can bring to the table is to structure the code to enable virtualization.
I believe it has a chance to develop from a green certification standpoint, if an organization that can develop an industry certification is formed and once a few major players get on board. The first to focus on the issue will be those for whom it is a direct operational cost issue -- the huge operations for whom a little incremental change in a few key places has measurable impact due to scale. That means players like Google, Apple, Facebook and LinkedIn. Once the gauntlet is laid down, it becomes a marketing differentiator, and others will follow suit.
Hi Robert, I agree with you completely. Software is the next frontier, and a very large one, in energy efficiency. When I think of all the needless repeat data transfers in multi-tiered applications, and all the entropy-producing, unnecessary and redundant read/write operations that occur on everyone's desktop, not to mention in the data centers, I have to believe there is avoidable ecological impact happening here. It all comes down to entropy.
Thanks for writing this prescient article.