
Data center GPU use on the rise thanks to AI, big data

GPU vendors have added new devices and cards for data center servers as data-demanding workloads infiltrate the data center and overwhelm traditional CPUs.

The recent push toward big data, AI and machine learning is creating a ripple effect among enterprise servers.

As traditional microprocessors struggle to effectively process information from these demanding workloads, data center GPUs move in to fill the void.

Graphics processing units, which have been around since the '70s, were initially used to offload video and graphics-heavy processing tasks from central processors. GPUs are built on a different foundation than typical CPUs. CPUs were designed to maximize throughput on a single-stream, high-speed pipeline and to support rapid handoffs, moving information quickly from place to place, such as from main memory to a storage system. GPUs instead rely on parallel processing: they support multiple high-speed connections and provide many data paths to churn through large volumes of data, a structure that fits graphics applications well.
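
To make that contrast concrete, consider a minimal CUDA sketch of the parallel model: rather than a CPU loop that walks an array one element at a time, the GPU launches thousands of threads that each handle a single element. The kernel, array size and values here are illustrative assumptions, not drawn from any particular product.

    // Minimal CUDA sketch (illustrative): each GPU thread handles one array
    // element in parallel, in contrast to a CPU loop that walks the data
    // one element at a time.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // unique index per thread
        if (i < n)
            c[i] = a[i] + b[i];                         // one element per thread
    }

    int main() {
        const int n = 1 << 20;             // 1M elements (assumed size)
        size_t bytes = n * sizeof(float);
        float *a, *b, *c;
        cudaMallocManaged(&a, bytes);      // unified memory, visible to CPU and GPU
        cudaMallocManaged(&b, bytes);
        cudaMallocManaged(&c, bytes);
        for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        vectorAdd<<<blocks, threads>>>(a, b, c, n);  // thousands of threads at once
        cudaDeviceSynchronize();           // wait for the GPU to finish

        printf("c[0] = %f\n", c[0]);       // expect 3.0
        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }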

Extending the reach of data center GPU use

GPUs have done a fine job at a narrow set of tasks, but gradually, their reach has expanded. Nvidia used GPUs to differentiate itself from other semiconductor suppliers and pushed to find new uses for the technology.

These products first wormed their way into the high-performance computing arena, but more recently, GPU vendors have designed devices and cards specifically for data center servers. The server-optimized GPUs use high-bandwidth memory and are offered either as modules for integration into a dedicated server design or as Peripheral Component Interconnect Express (PCIe) add-in cards. Unlike gaming cards, however, these cards provide no display outputs.

Server vendors couple GPUs with CPUs so that each processor plays to its strengths: offloading data-intensive tasks to the GPU frees the CPU, whose performance improves when it isn't bogged down in that work.

Big data, machine learning and AI applications have high processing needs and work with massive amounts of information and different data types. These characteristics mesh well with GPU design.

AI and machine learning vendors use GPUs to support the processing of the vast amounts of data necessary to train neural networks. In this market, the availability of PCs with GPUs lets software developers build and test their algorithms on desktops before transferring the programs to higher-performance server-based GPUs, according to Alan Priestley, analyst at Gartner.
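
One reason that desktop-to-server workflow is practical: the same CUDA host code can discover and adapt to whichever GPU is present, so a program built against a desktop card can run unchanged against a server part. A minimal sketch, using the standard CUDA runtime device query:

    // Sketch: the same binary adapts to whatever GPU is installed, which is
    // what makes developing on a desktop and deploying to a server practical.
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        for (int d = 0; d < count; d++) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, d);  // standard CUDA runtime query
            printf("GPU %d: %s, %d SMs, %.1f GB memory\n",
                   d, prop.name, prop.multiProcessorCount,
                   prop.totalGlobalMem / 1e9);
        }
        return 0;
    }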

GPUs arrive in the data center

Data center GPU use will likely increase. GPUs are becoming important infrastructure components for mission-critical workloads. IT organizations can procure GPUs off the shelf and use standard libraries that they can easily incorporate into applications, Priestley said.
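
Nvidia's cuBLAS is one example of the kind of off-the-shelf library Priestley describes. In the minimal sketch below, a dense matrix multiply is offloaded to the GPU through a single library call; the matrix size and values are illustrative assumptions.

    // Sketch using cuBLAS, one of Nvidia's standard GPU libraries: a dense
    // matrix multiply is offloaded to the GPU with a single library call.
    // Build (illustrative): nvcc gemm.cu -lcublas
    #include <cstdio>
    #include <cuda_runtime.h>
    #include <cublas_v2.h>

    int main() {
        const int n = 512;                  // assumed matrix dimension
        size_t bytes = n * n * sizeof(float);
        float *A, *B, *C;
        cudaMallocManaged(&A, bytes);
        cudaMallocManaged(&B, bytes);
        cudaMallocManaged(&C, bytes);
        for (int i = 0; i < n * n; i++) { A[i] = 1.0f; B[i] = 2.0f; }

        cublasHandle_t handle;
        cublasCreate(&handle);
        const float alpha = 1.0f, beta = 0.0f;
        // C = alpha * A * B + beta * C, computed on the GPU
        cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                    n, n, n, &alpha, A, n, B, n, &beta, C, n);
        cudaDeviceSynchronize();

        printf("C[0] = %f\n", C[0]);        // expect 512 * 1.0 * 2.0 = 1024.0
        cublasDestroy(handle);
        cudaFree(A); cudaFree(B); cudaFree(C);
        return 0;
    }

Because the library hides kernel launches and device-specific tuning, application code like this stays portable across GPU generations.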

As a result, server vendors offer either dedicated servers that integrate GPU modules or products that support GPU add-in cards. Server-optimized GPU cards and modules using the highest-performing processors typically cost between $1,000 and $5,000, according to Gartner.

Established vendors are beginning to incorporate these add-ons into their product lines.

Dell supports the FirePro series of GPUs from Advanced Micro Devices, as well as GPUs from Nvidia; these devices are designed for virtual desktop infrastructure and compute applications, with up to 1,792 GPU cores. Hewlett Packard Enterprise's (HPE) ProLiant systems work with Nvidia Tesla, Nvidia GRID and Nvidia Quadro GPUs. The HPE Insight Cluster Management Utility installs and provisions the GPU drivers and monitors GPU health metrics, such as temperature.
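
Administrators can also query the same kind of health metrics directly through Nvidia's NVML library. The following C sketch is an illustrative example, not HPE's actual mechanism; it assumes the NVML headers are installed and links with -lnvidia-ml.

    // Sketch: reading GPU temperature through Nvidia's NVML library, the same
    // kind of health metric that cluster-management tools surface.
    // Build (illustrative): gcc gputemp.c -o gputemp -lnvidia-ml
    #include <stdio.h>
    #include <nvml.h>

    int main(void) {
        if (nvmlInit() != NVML_SUCCESS) {
            fprintf(stderr, "NVML init failed\n");
            return 1;
        }
        unsigned int count = 0;
        nvmlDeviceGetCount(&count);
        for (unsigned int i = 0; i < count; i++) {
            nvmlDevice_t dev;
            char name[NVML_DEVICE_NAME_BUFFER_SIZE];
            unsigned int temp = 0;
            nvmlDeviceGetHandleByIndex(i, &dev);
            nvmlDeviceGetName(dev, name, sizeof(name));
            nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp);
            printf("GPU %u (%s): %u C\n", i, name, temp);
        }
        nvmlShutdown();
        return 0;
    }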

To prepare for further data center GPU use, administrators need to gain expertise in managing these processors. Finding staff familiar with the technology isn't easy: GPUs differ from traditional microprocessor designs, and formal training is scarce, although Nvidia offers some training materials.

Next Steps

Data center opens up for GPU-accelerated computing

Who's winning in the GPU vs. CPU battle?

Get more efficiency and innovation with virtualized GPUs

This was last published in November 2017
