

GPU-accelerated computing makes its way into the data center

Once viewed narrowly as a gaming technology, GPUs have made their way into enterprise data centers, fueling initiatives around machine learning, artificial intelligence and more.


IT pros struggling with a sluggish data analytics project might find inspiration from an unlikely source: video games.

Born to satisfy the demands of gamers, graphics processing units (GPUs) are finding a new niche in enterprise data centers. Their parallel architecture makes them well-suited to some traditional workloads that can benefit from GPU-accelerated computing, said Jason Stamper, an analyst at 451 Research.

"Because a GPU has thousands of cores and was designed to handle split-second movement of graphics on a large screen, if you point it at rows and columns of data, it's incredibly fast at doing analytics processing," Stamper said.

When CPU clock speeds doubled every two years, there was little demand for the type of speed GPU-accelerated computing enables. But as improvements in clock speed tapered, users with high compute demands began exploring GPUs as an alternative, he said. GPUs offer slower clock speeds than CPUs, but they can process thousands of threads simultaneously. Early adopters included scientific computing and real-world modeling applications, which must simultaneously process the influence of multiple variables.

"If you saw your weather forecast today, that's computed on the backend by a supercomputer processing thousands of parallel factors to calculate a model," said Roy Kim, director of data center GPU computing at Nvidia, a pioneer in graphics processing based in Santa Clara, Calif.

While supercomputer architects and scientific researchers were among the first to use GPUs for general-purpose processing, the volume of data that businesses collect and analyze is outpacing annual CPU improvements.
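To see what that parallelism looks like in practice, consider the hypothetical CUDA kernel below: it assigns one GPU thread to each element of an array, so a million elements are updated in a single launch. The names and sizes here are illustrative assumptions, not drawn from any product in this article.

// Minimal sketch of the data-parallel model: one GPU thread per array
// element, with thousands of threads running concurrently.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread scales a single element: uniform, independent work that
// maps naturally onto a GPU's many cores.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;                      // one million elements
    float *d;
    cudaMallocManaged(&d, n * sizeof(float));   // unified memory, for brevity
    for (int i = 0; i < n; ++i) d[i] = 1.0f;

    // Launch enough 256-thread blocks to cover all n elements at once.
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaDeviceSynchronize();

    printf("d[0] = %f\n", d[0]);                // 1.0 scaled to 2.0
    cudaFree(d);
    return 0;
}

A CPU would walk those million elements in a loop, a few at a time; the GPU dispatches them across its cores in one launch, which is the kind of throughput Stamper describes.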

Sequel to CPUs for SQL

As a Middle Eastern studies graduate student at Harvard, Todd Mostak collected hundreds of millions of tweets for his thesis, which examined the Arab Spring through the lens of Twitter. Mostak's goal was to analyze the tweets and create geospatial visualizations, but he ran into trouble using conventional CPU-based systems.

"I found that some of these processes would take overnight," Mostak said. "I'd wake up in the morning, realize I'd done it wrong and have to start my analysis all over."

Frustrated, and without access to a large server cluster capable of handling the load, Mostak created his own database system powered by off-the-shelf gaming GPUs.

Mostak is now CEO of MapD Technologies, a San Francisco-based company that built a SQL-compliant database software platform spawned from his grad school prototype. In a far cry from its gaming heritage, the GPU is ideally suited to accelerating the comparatively mundane task of processing SQL queries, Mostak said.

"Ultimately, SQL is a set language and you're applying the same operation to every row in the set," he said. "That set could be billions of rows. The GPU can basically parallelize this. You can assign each row of data to a core."

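To make that concrete, here is a hedged sketch in CUDA of the one-row-per-thread idea: each thread evaluates a SQL-style WHERE predicate against a single row, and the matches are tallied with an atomic counter. It is an illustrative toy, not MapD's actual engine.

// Toy equivalent of: SELECT COUNT(*) FROM sales WHERE amount > threshold;
#include <cstdio>
#include <cuda_runtime.h>

__global__ void count_where(const float *amount, int n, float threshold,
                            unsigned long long *count) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per row
    if (i < n && amount[i] > threshold)
        atomicAdd(count, 1ULL);                      // aggregate the matches
}

int main() {
    const int n = 1 << 22;                           // about 4 million rows
    float *amount;
    unsigned long long *count;
    cudaMallocManaged(&amount, n * sizeof(float));
    cudaMallocManaged(&count, sizeof(unsigned long long));
    for (int i = 0; i < n; ++i) amount[i] = (float)(i % 100);
    *count = 0;

    count_where<<<(n + 255) / 256, 256>>>(amount, n, 50.0f, count);
    cudaDeviceSynchronize();

    printf("rows matching predicate: %llu\n", *count);
    cudaFree(amount);
    cudaFree(count);
    return 0;
}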
GPUs are not a silver bullet for all workloads. An application must be refactored or specifically written to run on a GPU. Some processes are obviously parallel and benefit the most from GPUs. Other processes are highly sequential and run best on CPUs.

[Chart: GPUs contain many more processing cores than CPUs]

"Then you have the broader class of problems that are somewhere in between," Mostak said. "They're not trivial to parallelize, but, if you structure the code in the right way, you can still reap huge benefits from a GPU. It's about how you map the computation to the data."

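That distinction shows up clearly in code. In the hypothetical sketch below, the kernel's iterations are independent and parallelize trivially, while the host-side loop carries a dependency from each step to the next and is better left on the CPU.

// Hypothetical contrast between parallel-friendly and sequential work.
#include <cstdio>
#include <cuda_runtime.h>

// Embarrassingly parallel: every element is independent, so each GPU
// thread handles one element.
__global__ void square_all(float *x, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] = x[i] * x[i];
}

// Inherently sequential: each step depends on the previous result, so a
// naive one-thread-per-iteration mapping cannot work. This stays on the CPU.
float iterate(float x, int steps) {
    for (int k = 0; k < steps; ++k)
        x = 0.5f * (x + 2.0f / x);   // Newton's iteration toward sqrt(2)
    return x;
}

int main() {
    const int n = 1 << 10;
    float *x;
    cudaMallocManaged(&x, n * sizeof(float));
    for (int i = 0; i < n; ++i) x[i] = 3.0f;

    square_all<<<(n + 255) / 256, 256>>>(x, n);   // parallel part on the GPU
    cudaDeviceSynchronize();

    printf("x[0] = %f, sqrt(2) is about %f\n", x[0], iterate(1.0f, 8));
    cudaFree(x);
    return 0;
}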
This approach, used by MapD and others, is called GPU-accelerated computing. It aims to split the work between a CPU and GPU. Even if the majority of an application's code requires sequential processing, it is sometimes possible to offload certain compute-intensive processes of that code to a GPU.
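A minimal sketch of that offload pattern, assuming a generic CUDA setup: the CPU handles the sequential setup and consumes the results, while a compute-intensive per-element loop runs on the GPU.

// The CPU runs the sequential parts and hands the hot loop to the GPU.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void heavy_step(float *x, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        // Stand-in for an expensive, independent per-element computation.
        float v = x[i];
        for (int k = 0; k < 100; ++k) v = v * 0.999f + 0.001f;
        x[i] = v;
    }
}

int main() {
    const int n = 1 << 20;
    float *host = (float *)malloc(n * sizeof(float));
    for (int i = 0; i < n; ++i) host[i] = (float)i;   // sequential setup: CPU

    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    heavy_step<<<(n + 255) / 256, 256>>>(dev, n);     // offloaded hot loop

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("host[1] = %f\n", host[1]);                // results back on the CPU

    cudaFree(dev);
    free(host);
    return 0;
}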

As the volume of data that organizations collect has increased, so too has the business demand to derive value from that data. That's why GPU-accelerated databases are just starting to heat up, 451's Stamper said. "We're seeing very strong growth in this space," he said.

Nike, for example, uses GPU-enabled servers and MapD's database software to analyze historical sales data and predict demand in particular regions. Another MapD customer, Verizon, uses GPU-powered systems to analyze logs from servers tracking mobile phones.

MapD isn't alone in this emerging market. London-based Brytlyt and New York-based SQream Technologies offer competing approaches to GPU-accelerated analytics.

"It's not only about a tsunami of data; it's also a tsunami of completely new businesses with new demands," said SQream CEO Ami Gal. "Uber is a data company disrupting the taxi industry, and Airbnb is a data company disrupting the rental system."


Many major cloud providers now offer instances for GPU-accelerated computing, which can lower the cost of entry for a company experimenting with an analytics or visualization project. However, a number of factors could slow the growth of GPU-powered analytics, Stamper said. With systems as large, complex and expensive as enterprise database platforms, there's an understandable hesitance to rip and replace. And most of the companies offering GPU-accelerated database systems are small startups that don't have the name recognition, sales force or support infrastructure of traditional database giants.

"Although there's frustration in the enterprise because they're not getting the information as fast as they'd like, there's pressure on the IT department not to spend too much money on new technologies and not to add to complexity," Stamper said.

Even so, for companies looking to quickly analyze large data sets, the raw performance gains will be impossible to ignore. GPU-accelerated database systems are capable of replacing entire server clusters for some workloads, he said.

A smarter approach

The ability of a single GPU to process simultaneous threads faster than a cluster of CPU-based servers makes GPUs ideal for another type of emerging workload.

"In the last few years, we've seen this big bang of artificial intelligence and machine learning, which just happens to be the ideal parallel workload," Nvidia's Kim said.

There are several approaches to artificial intelligence, but one that is growing rapidly -- thanks to the emergence of GPUs for general-purpose computing -- is deep learning.

Deep learning aims to emulate the human brain's array of neurons with a virtual neural network of computing nodes. The network is organized into layers, with each node performing a specific function; each node's output is then weighted to deduce a probable result. The concept is decades old, but, until recently, only very large clusters and supercomputers were capable of creating a robust neural net.

Nuance Communications, the Burlington, Mass., company behind the popular Dragon NaturallySpeaking speech-recognition software, relies on deep learning neural networks powered by GPUs to develop its language-learning models.

"We train a system to understand language by showing it thousands of spoken utterances as a sample. That's a lot of computation," said Nils Lenke, research director at Nuance. "In these networks, the nodes perform simple calculations. And because we have lots of nodes that need to do the same thing at the same time, GPUs are ideally suited to help."

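As a hedged sketch of why those node calculations parallelize so well, the toy CUDA kernel below computes one fully connected layer, with each thread evaluating a single neuron's weighted sum and sigmoid activation. The layer sizes and names are assumptions for illustration, not Nuance's models.

// Toy forward pass for one dense layer:
// out[j] = sigmoid(sum_i in[i] * w[j][i] + b[j])
#include <cstdio>
#include <math.h>
#include <cuda_runtime.h>

__global__ void dense_forward(const float *in, const float *w, const float *b,
                              float *out, int n_in, int n_out) {
    int j = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per neuron
    if (j < n_out) {
        float sum = b[j];
        for (int i = 0; i < n_in; ++i)
            sum += in[i] * w[j * n_in + i];          // weighted inputs
        out[j] = 1.0f / (1.0f + expf(-sum));         // sigmoid activation
    }
}

int main() {
    const int n_in = 512, n_out = 1024;
    float *in, *w, *b, *out;
    cudaMallocManaged(&in,  n_in * sizeof(float));
    cudaMallocManaged(&w,   n_out * n_in * sizeof(float));
    cudaMallocManaged(&b,   n_out * sizeof(float));
    cudaMallocManaged(&out, n_out * sizeof(float));
    for (int i = 0; i < n_in; ++i) in[i] = 0.01f;
    for (int i = 0; i < n_out * n_in; ++i) w[i] = 0.001f;
    for (int j = 0; j < n_out; ++j) b[j] = 0.0f;

    dense_forward<<<(n_out + 255) / 256, 256>>>(in, w, b, out, n_in, n_out);
    cudaDeviceSynchronize();

    printf("out[0] = %f\n", out[0]);   // every neuron computed in parallel
    cudaFree(in); cudaFree(w); cudaFree(b); cudaFree(out);
    return 0;
}

Every node runs the same simple calculation at the same time, which is exactly the shape of work Lenke describes.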
[Chart: Volume of digital data]

A growing number of applications support GPU-accelerated computing, including many of the top artificial intelligence frameworks.

While the horizon is promising for the GPU, that doesn't necessarily mean it will be the only technology to supplant CPUs. Intel is betting big -- specifically the $16.7 billion it spent to acquire specialized chipmaker Altera -- not on GPUs, but on field-programmable gate arrays (FPGAs). Like GPUs, FPGAs can perform certain calculations faster than CPUs. Rather than serve as general-purpose processors, FPGAs can be programmed to more efficiently execute a specific set of instructions. 

Instead of using GPUs, Maryland-based Ryft offers an FPGA-accelerated database analytics platform. Also, Microsoft recently revealed that its Azure-based artificial intelligence services are powered by Altera FPGAs.

"Whether it be GPUs or FPGAs, it's clear that accelerators are the way forward," Nvidia's Kim said. 
