
LinuxWorld: Kernel changes spice up next-gen technologies

An IBM executive looks into his crystal ball and shares his predictions on how the 2.6.0 kernel, grid computing and virtualization will affect next-generation technologies.

SAN FRANCISCO -- "Faster than a speeding bullet!" That's how IBM's vice president of grid computing strategy, Dan Powers, described advances in Linux technologies. "Many different technologies will be arriving that are changing the landscape for Linux users and businesses," said Powers, who is a speaker at LinuxWorld Conference & Expo this week. At IBM, Powers and his associates examine and make projections about IT changes over the coming years and decades. "We look at different technology paths, trying to predict as far out as we can," he said. In this interview, he shares some of his thoughts and predictions about future technologies.

Let's start with the technology advancement that's on people's minds right now. Where do you think the new 2.6 kernel is going to take Linux?

Powers: It will help Linux take on more and more workloads in the commercial space: database-type workloads, transaction-type workloads, etc. It will be better from a multi-processing performance perspective, and there are applications in the commercial space that can take advantage of that. It will help Linux, from a feature-function perspective, become more enterprise-ready, particularly with the recovery services in the new kernel. Each new kernel release just continues to improve its readiness for business usage.

Isn't the pace of technology change too fast for most businesses to digest?

Powers: That's a topic I cover in most next-generation computing presentations. It's called 'You can't handle the technology!' The barrier to adopting new technologies quickly is 'How do we integrate our processes and procedures?' How does a business deal with the fact that it has built up a lot of stovepipe IT infrastructure over the last 10 to 15 years? Businesses are trying to figure out how to break down these process and procedure barriers, reduce the complexity of adopting new technologies and -- at the same time -- reduce total cost of ownership. This is where grid computing and autonomic technologies are starting to come in and help.

Virtualization technologies are supposed to help reduce the complexity of new technology adoption, as well as daily IT operations. Are people getting on board for it already?

Powers: Absolutely. They're really on board for server virtualization, where -- for example -- you use an IBM mainframe with Linux to run hundreds or thousands of virtual servers in that environment. Businesses and IT shops love it because there's a dramatic time-saving in setting up a server in the virtual world compared with setting up a server in the physical world. Our mainframe customers can set up servers in the virtual world in minutes versus weeks in the physical world.

Virtualization directly relates to a challenge businesses face every day: the need to respond to business demands in an on-demand fashion. Traditionally, they buy and set up new servers to handle growing loads, and that takes a lot of time. So it's hard to respond to your business needs dynamically. You respond, but in a time-consuming fashion.

What are some of the barriers to corporate adoption of virtualization?

Powers: The technology is relatively new, so the biggest barrier is education. Businesses have to be shown what the technology can do for them. It's a different way of doing things. Like I said, when businesses have needed more processing power, they've gone out and bought five new servers. It's a change of mindset to think, 'Well, wait a minute. We have extra capacity on these particular servers, so why don't we use virtual server technology to set them up quickly?' We saw this switch happen in networking 10 years ago, when vendors [started] to come out [with] virtual networking products to replace all the routers, switches and hardwiring in use then. It's just a change of mindset for companies to realize: 'Oh, wow! We could do it this different way.'

Grid computing is a big technology initiative for IBM. Which businesses will be the early adopters?

Powers: IBM is focusing on grid technology in the areas of research and development, enterprise optimization, government, business analytics, and engineering and design. That gives grid an industry flavor. For example, the financial industry is interested in business analytics. Research and development is the focus in the biotechnology, pharmaceutical and other technology industries. Engineering and design you'll find in the computer, digital media and industrial sectors. Enterprise optimization applies to almost any industry.

Those have been the hot areas. As more companies get experience with grids, they'll start employing them on a much grander scale across entire enterprises. That's when enterprise optimization will take off.

What should IT managers do to find out whether grid computing is right for their enterprises?

Powers: Look across the enterprise for applications that are parallel in nature or that run for a certain amount of time and then return a result. Then get a pilot going. Most customers who have done proof-of-concept pilots have realized very quickly the tremendous benefits grid offers across the entire enterprise. So a lot of these initial projects explode into many more projects across the enterprise.
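A quick test of whether a workload is "parallel in nature," as Powers puts it, is whether it splits into independent work units whose results can be combined afterward. The sketch below is purely illustrative (it is not IBM's grid tooling; the `analyze` task and chunking scheme are invented for this example) -- it uses Python's `multiprocessing` pool to stand in for farming work units out to grid nodes:

```python
# Illustrative sketch of an "embarrassingly parallel" workload of the kind
# that suits a grid pilot: independent work units, results combined at the end.
from multiprocessing import Pool


def analyze(chunk):
    # Stand-in for a compute-heavy task (a risk model, gene match, render job).
    return sum(x * x for x in chunk)


def split(data, n):
    # Carve the dataset into n independent work units.
    size = (len(data) + n - 1) // n
    return [data[i:i + size] for i in range(0, len(data), size)]


if __name__ == "__main__":
    data = list(range(1_000_000))
    with Pool(4) as pool:                 # 4 workers stand in for grid nodes
        partials = pool.map(analyze, split(data, 4))
    total = sum(partials)                 # combine the independent results
    print(total)
```

If a job decomposes this cleanly, with no communication between work units, it is a natural pilot candidate; jobs whose steps depend on each other's intermediate results are a much poorer fit for a grid.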

What's going on with technologies that cut down the time spent on IT maintenance and management?

Powers: Management is one of the biggest costs of ownership today. 'Cut down the people cost,' most CIOs tell us. Eighty percent of [the IT shops' time] is spent trying to keep the business systems up and running. Only about 20% of their time can be spent on working on new projects that could provide value to the business. CIOs want to get that to a 50-50 mix.

Autonomic computing certainly can help. Autonomic computing can help build systems that are self-managing, self-optimizing, self-healing and self-protecting. If systems do a better job of managing themselves, people could work on other projects that are much more valuable to the business. Autonomic computing could help businesses keep up with rapid changes in technology by freeing IT staff to test those technologies.

How are new Web technologies changing the face of the enterprise?

Powers: The PC is not the center of the Internet-access universe anymore. As Internet access is expanded to more and more devices, more corporations will be expected to be online all the time. The expansion of Internet access is going to change the nature of content. So, it's important for businesses to be aware of what's going on in, for instance, the video gaming market and some of the fun Web-based consumer technologies.

Just take a gander at the Internet today and see the things you can do that you couldn't do seven years ago. It's quite stunning. In the next seven years, Web technology costs will continue to drop, and what you'll be able to do with the technology from a storage, CPU and bandwidth perspective will continue to skyrocket.

Are some of these new technologies going to increase efficiencies in networking?

Powers: Bandwidth costs continue to drop. The amount of data we can pipe over fiber gets greater and greater. These changes will foster enhancements to the user interface. So more multimedia technologies -- real voice, real video without skips -- are all possible now. These things start to come together when you have greater network bandwidth. Broadband has opened up a whole new world.

Wireless is changing networking. Network access wasn't available to a lot of people at home or while traveling, but -- as wireless costs have continued to drop -- it has become very easy to have wireless in coffee shops, in airports and so on. We've gotten to a point where you can use wireless technology easily; it's as easy as picking up the phone and calling somebody. So wireless just becomes part of your every day, which is really exciting.

Doesn't wireless, particularly network access by wireless devices, complicate system management and integration in the business world?

Powers: Oh no, not anymore. Managing wireless devices and systems is very similar to managing the network infrastructures in place today. The cost to deploy wireless is dramatically less than the cost of wiring buildings, and management costs, from a network perspective, are still about the same as for wired networks.


