
What does a DPU do?

Learn the basics of data processing units: What they are, what they do, who's making them and who's adopting them. Then decide whether your data center might use one someday.

A data processing unit is a hardware accelerator specifically geared toward data processing and data-centric computing. It differs from other processors, such as the CPU and GPU, in that it offers a higher degree of parallelism and a MIMD (multiple instruction, multiple data) architecture.

Many organizations use the DPU for supercomputing tasks such as AI and big data. To decide whether your organization requires a DPU in its data center, understand its use cases and drawbacks.

What does a DPU do?

A DPU offloads networking and communication workloads from the CPU, which frees the CPU to tackle application support tasks instead. It focuses on data-centric workloads such as data transfer, data reduction, data security and analytics. The chip features a specialized design that combines processor cores with hardware accelerator blocks, which makes the DPU a more versatile, general-purpose chip than the GPU. The DPU runs its own dedicated OS, so you can pool its resources with those of your primary OS, and it can perform functions such as encryption, erasure coding and compression or decompression.
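To make the offload pattern concrete, here is a minimal, purely illustrative Python sketch. It simulates a DPU with a worker thread that handles data-path work -- compression and checksumming -- while the main thread (standing in for the host CPU) stays free for application tasks. All names here are hypothetical; a real DPU performs this work in dedicated hardware, not in software threads.

```python
# Illustrative sketch only: a worker thread stands in for the DPU, so
# the "host CPU" (main thread) can keep serving the application while
# data-path work runs elsewhere.
import zlib
import hashlib
from concurrent.futures import ThreadPoolExecutor

def dpu_style_offload(payload: bytes) -> tuple[bytes, str]:
    """Compress and checksum a payload, as a DPU might on the data path."""
    compressed = zlib.compress(payload)
    digest = hashlib.sha256(compressed).hexdigest()
    return compressed, digest

payload = b"application data" * 1000

# The "host CPU" hands data-path work to the "DPU" and continues
# application work while the offload runs in the background.
with ThreadPoolExecutor(max_workers=1) as dpu:
    future = dpu.submit(dpu_style_offload, payload)
    # ... host CPU continues application work here ...
    compressed, digest = future.result()

# The offloaded result round-trips back to the original data.
assert zlib.decompress(compressed) == payload
```

The point of the pattern, in hardware as in this sketch, is that the expensive byte-level work never touches the host's main execution path.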

Cloud and hyperscale providers have been the earliest adopters of this technology. However, vendors like VMware have started to add support for DPUs into their offerings, which gives them a broader appeal for other organizations.

Support storage with a DPU

Because of the DPU's versatility as a processing unit, you can use it to support storage in your data center. For example, you can accelerate access to NVMe storage devices by connecting them to the DPU's PCIe bus.

A DPU also gives you better access to remote storage devices that rely on NVMe-oF. The DPU presents these remote storage devices to the system as standard NVMe devices, which streamlines connectivity because the host no longer requires special drivers to reach the remote storage.
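The following shell fragment illustrates the difference on a Linux host. It is a sketch, not a definitive procedure: the transport, address and NQN are placeholders, and it assumes the `nvme-cli` tooling and appropriate kernel modules are available.

```shell
# Without a DPU, the host itself must load an NVMe-oF initiator and
# connect to the remote target (address and NQN are placeholders):
modprobe nvme-tcp
nvme connect -t tcp -a 192.0.2.10 -s 4420 -n nqn.2024-01.example:remote-ns

# With a DPU terminating NVMe-oF, the remote namespace simply appears
# to the host as a standard local NVMe device, with no fabric setup:
nvme list
```

In the DPU case, the fabric configuration lives on the DPU rather than the host, which is what eliminates the need for special host-side drivers.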

DPUs and data-centric architecture

The DPU represents just one part of a data-centric architecture. This paradigm requires you to build infrastructure around data requirements, as opposed to forcing data to fit infrastructure. It makes data the primary consideration for application development, business decisions and infrastructure deployment. A data-centric organization treats data as its central asset, eliminates silos and mitigates sprawl by implementing a single data strategy for multiple applications.

Data-centric hardware -- such as the DPU -- eases the movement and delivery of data. It should deliver high availability and reliability, and should enable the entire organization to access that shared data in real time. Its performance, capacity, scalability and security should change to meet new workload requirements and adapt to new technologies.

Within the context of a data-centric architecture, the DPU addresses two weaknesses of conventional server nodes: inefficient data-centric computation, and slow or inefficient transfer and sharing of data between nodes.

The increased popularity of the DPU

In 2020, the startup Fungible released the first version of the DPU. It created two separate versions of the processing unit: one for storage and one for networking. Both versions of the Fungible DPU included memory and on-chip processing intended for tasks such as storage, security, networking and virtualization. Fungible designed them to confer the benefits of hyper-converged infrastructure but with greater sharing of storage and networking resources.

Since the release of Fungible's DPU, vendors such as Nvidia and Intel have released their own versions of this technology. In June 2021, Intel released its infrastructure processing unit chip, which does the same job as a DPU. On Intel's heels, in July 2021, Nvidia unveiled its own DPU. These processing units -- as well as those from additional competitors such as Marvell and AWS -- all offload tasks from the host processor to accelerate and streamline data computing workloads. Nvidia expects telecommunications companies and cloud providers to adopt its technology first, but the boom in DPU offerings from major vendors means you might see it in other data centers soon as well.
