DPU | Data Processing Unit
What is it?
Taken as a whole, the evolution of server technology and data center applications has revolved around finding new helpers for the central processing unit (CPU) to make computing faster. The most obvious example is the GPU, which was created to help render computer graphics, and by extension the GPGPU, which came into existence when people realized the GPU could do more than render graphics. Through the synergies of heterogeneous computing and parallel computing, GPGPUs are now widely used to boost the CPU's processing capabilities into the realms of high performance computing (HPC) and supercomputing.
The DPU, or data processing unit, is a more recent milestone in the actualization of this philosophy. Envisioned as the third pillar of the data center in addition to the CPU and the GPU, the DPU further helps out the CPU by taking over its networking and communication workloads. It uses hardware acceleration technology as well as high-performance network interfaces to excel at handling data transfers, data compression, data storage, data security, and data analytics. While these tasks have traditionally been carried out by the CPU, in a large-scale server farm or server room, delegating the tasks to DPUs can free up the CPUs for other workloads. This can make a huge difference in performance when working with data-intensive tasks, such as big data, artificial intelligence (AI), machine learning, and deep learning.
Free Download:《How to Build Your Data Center with GIGABYTE?》
Why do you need it?
Introducing DPUs into your computing cluster can upgrade your performance, especially if you are working on large-scale projects that push the envelope of data technology. A data center that uses DPUs to move data between processors can expect faster computing, higher availability, better security, and greater shareability. Proponents are confident that DPUs will be the linchpin of future cloud computing data centers, as more and more of the world's data is poured into the cloud.
How is GIGABYTE helpful?
One of the foremost DPU products on the market is NVIDIA's BlueField®-2 DPU, which is designed to offload critical networking, storage, and security tasks from the CPUs. This enables organizations to transform their IT infrastructure into state-of-the-art data centers that are accelerated, fully programmable, and armed with "zero-trust" security features to prevent data breaches and cyberattacks. The GIGABYTE G242-P32, a G-Series GPU Server, is part of the NVIDIA Arm HPC Developer Kit, an integrated hardware and software platform for HPC, AI, and scientific computing applications; the kit is outfitted with two NVIDIA BlueField®-2 DPUs. The Arm HPC Developer Kit has already been put to good use by customers across many different sectors, including the Graduate Institute of Networking and Multimedia at National Taiwan University (NTU), which has developed an intelligent transportation system, called a "high-precision traffic flow model", to test autonomous vehicles and identify accident-prone road sections for immediate redress.