Five years ago, at the SC2004 supercomputing conference in Pittsburgh, several supercomputing experts tossed around predictions that it wouldn't be long before large retail businesses started to adopt such machines for use inside their data centers.
The idea was intriguing and made good business sense -- for years, large retailers have gathered vast amounts of data about their customers' buying habits, spending patterns and wants, and ever more computing power is required to crunch that data and figure out what it means for retailers. Certainly, supercomputers, with their incredible speed and power, can help make that easier, especially since supercomputer prices have come down and their form factors have become more like smaller, traditional servers.
High-performance computing in the retail space
Five years after that conference, have the supercomputers that make up the high-performance computing (HPC) marketplace really become more useful and affordable for businesses that hadn't embraced them in the past because of their complexity and high costs?
Some supercomputing and HPC adoption by nontraditional customers has happened and more is likely to come, driven by increased need for business intelligence culled from all the data that retailers and other large businesses have collected, according to several industry analysts.
The key, says Dan Olds, principal of Gabriel Consulting Group Inc. in Beaverton, Ore., will increasingly be the use of predictive analytics combined with collected customer data to give companies an edge in a more competitive global economy.
"What you will be doing is sophisticated statistical techniques to tease out data relationships that you didn't know about before," Olds said. "What I think is cranking this up to the next level is that everybody's doing business intelligence. What you need to know if you are Sears or JCPenney is not what the customer has already bought but what will they buy next and, 'Will they buy it from me?' That's where this all fits together."
By having super-detailed data analyzed using powerful HPC equipment, companies will be better able to make accurate predictions on future sales, revenue and buying patterns, Olds said. "Using these kinds of techniques, it's better than human hunches or gut feelings. This is where I believe a hell of a lot of computing power is going to be sold. It's essentially the same kind of thing that's being done in scientific computing -- building [computer] models" to predict behaviors and trends.
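The kind of model-building Olds describes can be illustrated with a minimal sketch: fit a trend line to a customer segment's past spending and project the next period. The data, function names and dollar figures below are hypothetical, and real retail analytics would use far richer models and far more data -- this only shows the basic idea of replacing a gut feeling with a fitted statistical model.

```python
# A minimal, hypothetical sketch of predictive analytics on customer data:
# fit an ordinary least-squares trend to a segment's monthly spend and
# project one month ahead. All figures here are invented for illustration.

def fit_trend(values):
    """Least-squares fit of y = a + b*x for x = 0, 1, ..., n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var          # slope: average change in spend per month
    a = mean_y - b * mean_x  # intercept
    return a, b

def predict_next(values):
    """Project the fitted trend one step past the observed data."""
    a, b = fit_trend(values)
    return a + b * len(values)

# Hypothetical monthly spend (in dollars) for one customer segment.
monthly_spend = [120.0, 125.0, 131.0, 138.0, 142.0, 150.0]
print(round(predict_next(monthly_spend), 2))  # → 155.13
```

A production system would swap this toy trend line for regression or machine-learning models running across the kind of clustered HPC hardware discussed below, but the principle is the same: the model, not a hunch, produces the forecast.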
Financial services companies on Wall Street have been doing this kind of analysis for years using HPC, Olds said.
Much of this kind of analysis, however, is likely being done under the radar of competitors, he said. "The companies that are early on this and are doing the best job of it will never talk about it," Olds said. "It is going to become the crown jewels."
Using powerful machines such as supercomputers will give companies extra options, he said. "What I see in the future is that companies are going to have analytic models of their businesses so they can test out strategies without having to actually do them … to get a feel for future business."
The possibilities of supercomputing
What will actually be used, though, depends upon your definition of a supercomputer, he said. For some, supercomputers might be powerful servers that run large numbers of virtual machines, while for others, supercomputer power might be provided by large clusters of blade servers that equal the processing power of traditional supercomputers.
The possibilities are endless and very intriguing for businesses that want to harness and take advantage of this kind of power and analysis to delve deeply into their data, he said.
"In my mind, after we get out of this virtualization/consolidation thing that we're in right now, I think it's the next big thing," Olds said. "We have all this data out there. It's now time to analyze it and to figure out what it really means and how to use it as a weapon" against your competition.
Charles King, principal analyst for Hayward, Calif.-based Pund-IT Inc., said in an e-mail reply that he also sees that "supercomputing solutions are definitely headed down-market, though it's also fair to say that we're still in the early days of this development." One factor that will aid this transition, he said, is a "fundamental shift in supercomputing and HPC triggered by clustered industry standard (x86/64) technologies at every level of the market" replacing specialized and more costly supercomputing architectures.
On the other hand, King said, businesses turning to supercomputing will need adequate in-house Linux expertise, since Linux is often the operating system used on HPC machines.
That is changing a bit since the introduction of Microsoft's Windows HPC Server operating system last year, King said. "So far, most of the successes I've heard about relate to product design and development -- basically using Windows HPC to create virtual models and simulations of complex products, and thus lowering the cost of those processes. But over time, I expect these solutions to become popular in others, including financial simulations."
Another analyst, Gordon Haff of Nashua, N.H.-based Illuminata Inc., said that while there is evidence that some large retailers have been doing these kinds of deeper data analyses, he doesn't necessarily think that they'll require true supercomputers.
"At one level, with a few exceptions, there's really no such thing as a supercomputer anymore," Haff said, because whatever is needed can be built from collections of commodity servers to assemble whatever computing power is required.
"We conveniently call these collections of blade servers or rack-mounted servers," he said. "One of the things that we're seeing is it's mostly the same kinds of technology that a business would buy to do its regular computing on. You don't need to buy a $1 million supercomputer that's only good for supercomputing."
One large retailer that does heavily use supercomputing is Amazon.com, Haff said. The difference is that the huge e-commerce company does "massive analytics," he said. "Amazon does more analysis than more traditional retailers. One reason is because Amazon has all the data" from customer transactions, including data on which items customers search for on Amazon's website.
This more-detailed analysis in the retail segment has been promised for years, but hasn't fully occurred everywhere, he said. "I think the whole data warehousing and data analytics trend in retail is something that has been a long time coming."
For some companies, it's not even needed, he said. "The old thought with it was that Home Depot would then know which week people will buy snow shovels," Haff said. "But it turns out that stores like Home Depot don't need a computer model or analysis to tell them that -- they just need to watch the weather forecasts to see when it will snow," he said. "The local Home Depot is going to know and they don't really need a computer to tell them that."
ABOUT THE AUTHOR: Todd R. Weiss is an award-winning technology journalist and freelance writer who worked as a staff reporter for Computerworld.com from 2000 to 2008. He spends his spare time working on a book about an unheralded member of the 1957 Milwaukee Braves and watching classic Humphrey Bogart movies. Follow him on Twitter @TechManTalking.