IBM is restructuring the way it licenses software to accommodate the cornucopia of processors in the market today and the spread of virtualization in the data center.
Rather than basing software licensing on physical attributes, such as the number of chips or cores, IBM plans to test new processors as they hit the market and assign each a rating, called processor value units. The company says this will be a better way for customers to get the appropriate number of licenses for the hardware they're running.
The first chip it plans to test is a quad-core processor from Intel Corp. expected out by the end of the year. For chips already out, IBM plans on taking the number of existing software licenses and multiplying by 100 to determine the new measurement. For example, a dual-core Xeon chip, which previously would have one license, would have 100 processor value units, with each core of the chip grabbing 50 units.
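As a rough sketch of the conversion described above, the arithmetic looks like this (the ×100 multiplier and the even split across cores come from IBM's stated plan; the function name and example values here are illustrative, not part of IBM's tooling):

```python
# Sketch of IBM's stated conversion for existing chips:
# new PVU total = (old per-chip license count) * 100,
# divided evenly across the chip's cores.

def pvu_for_chip(old_licenses: int, cores: int) -> tuple[int, int]:
    """Return (total PVUs for the chip, PVUs per core)."""
    total = old_licenses * 100
    return total, total // cores

# A dual-core Xeon that previously needed one license:
total, per_core = pvu_for_chip(old_licenses=1, cores=2)
print(total, per_core)  # 100 PVUs for the chip, 50 per core
```

This matches the article's example: one legacy license on a dual-core chip becomes 100 processor value units, or 50 per core.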
IBM says the benchmark rating for a chip should be available the same day the chip is released, so that data centers will have a good idea of how much their software will cost if they want to upgrade to the new chip or buy new systems.
"We have been in this situation for a number of years now that a processor is a processor is a processor, that for the most part, everything is kind of close enough that we can kind of treat everything the same," said Gordon Haff, an analyst with Nashua, N.H.-based Illuminata. "With all the different flavors of multicore that exist and are coming to market, you can't really do that anymore."
Haff said it remains to be seen what will come of this new structure, in terms of how specific and varied IBM will get on the performance benchmarks. It could keep the measurements somewhat generalized or get them as specific as they used to be in the old minicomputer days.
For the time being, IBM has put a calculator online so data centers can plug in their processor type and get a result in processor value units. Once new chips are out, the company could use multiple benchmarking standards to arrive at a rating: SPEC, TPC-C or vendor-specific measurements.
Jeff Tieszen, an IBM spokesman, said that there will probably be confusion among some customers at the beginning, but the company is providing training to business partners and making educational material available.
"Once the customers get used to this processor value unit model, the whole idea is giving customers the idea of what value they're getting out of their processors," Tieszen said.
Haff said that IBM's software licensing structure may have to move beyond measuring processor performance. With the influx of server virtualization, it may become more important to measure how an entire network performs, rather than physical chips.
"I think we're going to have to move away from having physical-based licensing at all," he said. "If you move to a virtualized environment, one minute you might be running software on two processors and one system, and another minute you might be running on five processors on one system. This still presupposes that you're running software on a fixed set of hardware resources. It still assumes that things are static, at least up to a certain point. With virtualization, you don't have that."
Let us know what you think about the story; e-mail: Mark Fontecchio, News Writer