Joseph White, Dell; John Kim, NVIDIA; Mario Baldi, Pensando; Yadong Li, Intel; David McIntyre, Samsung
About this talk
In our first webcast, “SmartNICs and xPUs: Why is the Use of Accelerators Accelerating,” we discussed the trend of deploying dedicated accelerator chips to assist or offload the main CPU. These new accelerators (xPUs) go by multiple names, such as SmartNIC, DPU, IPU, APU, and NAPU.
This second webcast in the series takes a deeper dive into the accelerator offload functions of the xPU. We’ll discuss the problems xPUs are designed to solve, where they sit in the system, and the functions they implement, focusing on:
Network Offloads
• Virtual switching and NPU
• P4 pipelines (see the match-action sketch after this list)
• QoS and policy enforcement
• NIC functions
• Gateway functions (tunnel termination, load balancing, etc.)
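P4 expresses packet processing as a pipeline of match-action tables that an xPU can evaluate at line rate. As a rough illustration of that model only (this is not P4 itself), here is a minimal Python sketch; the table contents, field names, and actions are invented for illustration.

```python
# Minimal sketch of the match-action model behind P4 pipelines.
# Illustration only: table entries, fields, and actions are invented;
# a real xPU evaluates such tables in hardware at line rate.

def drop(pkt):
    return None

def forward(pkt, port):
    pkt["egress_port"] = port
    return pkt

# A "table" maps a lookup key (here, destination IP) to an action plus arguments.
ipv4_table = {
    "10.0.0.1": (forward, {"port": 1}),
    "10.0.0.2": (forward, {"port": 2}),
}

def ipv4_pipeline(pkt):
    """Apply the table to one parsed packet header; no match means drop."""
    action, args = ipv4_table.get(pkt["dst_ip"], (drop, {}))
    return action(pkt, **args)

print(ipv4_pipeline({"dst_ip": "10.0.0.2"}))   # forwarded out port 2
print(ipv4_pipeline({"dst_ip": "192.0.2.9"}))  # None (no match -> drop)
```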
Security
• Encryption
• Policy enforcement
• Key management and crypto
• Regular expression matching (see the signature-matching sketch after this list)
• Firewall
• Deep Packet Inspection (DPI)
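Regular expression matching and DPI amount to scanning packet payloads against sets of signature patterns, which is why they are natural candidates for offload. A minimal Python sketch of the idea follows; the signatures and payloads are invented for illustration, and an xPU would run equivalent matching in a hardware regex engine.

```python
# Minimal sketch of the signature scanning that DPI/regex offloads accelerate.
# Signatures and payloads are invented for illustration only.
import re

SIGNATURES = {
    "sql_injection": re.compile(rb"union\s+select", re.IGNORECASE),
    "plain_http_auth": re.compile(rb"Authorization:\s*Basic\s+\S+"),
}

def inspect(payload: bytes) -> list[str]:
    """Return the names of all signatures that match this payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern.search(payload)]

print(inspect(b"GET /?q=1 UNION SELECT password FROM users HTTP/1.1"))
print(inspect(b"GET /index.html HTTP/1.1"))
```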
Compute
• AI calculations, model resolution
• General purpose processing (via local cores)
• Emerging use of P4 for general-purpose processing
Storage
• Compression and data-at-rest encryption (see the sketch after this list)
• NVMe-oF offload
• Regular expression matching
• Storage stack offloads
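Compression and data-at-rest encryption are typically chained per block before data reaches media. Below is a minimal Python sketch of that compress-then-encrypt ordering, assuming zlib as a stand-in for the compression engine and a placeholder cipher in place of the hardware AES engine an xPU would actually use.

```python
# Minimal sketch of a compress-then-encrypt path for data at rest.
# zlib stands in for the compression engine; xor_keystream is only a
# placeholder for the hardware block-cipher engine an xPU would use.
import zlib

def xor_keystream(data: bytes, key: bytes) -> bytes:
    """Placeholder cipher for illustration only -- NOT real encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def write_block(plaintext: bytes, key: bytes) -> bytes:
    compressed = zlib.compress(plaintext)   # reduce bytes written to media
    return xor_keystream(compressed, key)   # then protect the data at rest

def read_block(stored: bytes, key: bytes) -> bytes:
    return zlib.decompress(xor_keystream(stored, key))

key = b"demo-key"
block = b"hello " * 100
stored = write_block(block, key)
assert read_block(stored, key) == block
print(f"{len(block)} bytes logical -> {len(stored)} bytes stored")
```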
SNIA is a not-for-profit global organization made up of corporations, universities, startups, and individuals. The members collaborate to develop and promote vendor-neutral architectures, standards, and education for management, movement, and security for technologies related to handling and optimizing data. SNIA focuses on the transport, storage, acceleration, format, protection, and optimization of infrastructure for data.