F5 and NVIDIA Collaborate to Accelerate Edge AI for Service Providers


F5 announces the deployment of BIG-IP Next Cloud-Native Network Functions on NVIDIA BlueField-3 DPUs, enhancing edge AI capabilities for service providers and supporting the future of AI-RAN.


F5 and NVIDIA Join Forces to Enhance Edge AI Capabilities

F5, a leading multicloud application security and delivery company, has announced a significant collaboration with NVIDIA to accelerate AI at the edge for service providers. This partnership involves the deployment of F5's BIG-IP Next Cloud-Native Network Functions (CNFs) on NVIDIA BlueField-3 Data Processing Units (DPUs), marking a substantial advancement in edge AI technology.

Revolutionizing Edge Infrastructure

The collaboration aims to address the growing demand for AI inferencing at the edge, a critical area of interest for telecom providers. By leveraging F5's proven network infrastructure capabilities and NVIDIA's advanced DPU technology, the solution offers optimized performance in Kubernetes environments and supports emerging edge AI use cases.

Key features of this collaboration include:

  1. Enhanced edge firewall, DNS, and DDoS protection
  2. Optimized computing resources and reduced power consumption
  3. Improved data sovereignty and user experience
  4. Support for low-latency AI applications

Advancing AI-RAN Technology

A significant focus of this partnership is the advancement of AI-RAN (AI Radio Access Network) technology. AI-RAN seeks to transform mobile networks into multi-purpose infrastructures that maximize resource utilization and create new revenue streams through hosted AI services.

The integration of BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs is expected to:

  1. Accelerate AI-RAN deployments
  2. Streamline traffic management for both AI and RAN workloads
  3. Provide enhanced firewall and DDoS protections
  4. Enable multi-tenancy and tenant isolation for essential workloads

Implications for Service Providers

This collaboration opens up key opportunities for service providers, including:

  1. Distributed N6-LAN capabilities for User Plane Functions (UPFs)
  2. Edge security services to support Distributed Access Architecture (DAA) and Private 5G
  3. Ability to leverage RAN compute infrastructure for AI offerings alongside existing services
  4. Significant cost savings and revenue potential through enhanced user offerings

Ahmed Guetari, VP and GM of Service Provider at F5, emphasized the importance of this collaboration, stating, "Customers are seeking cost-effective ways to bring the benefits of unified application delivery and security to emerging AI infrastructures."

Technical Details and Availability

F5 continues to leverage the NVIDIA DOCA software framework for seamless integration with NVIDIA BlueField DPUs. This integration provides F5 with a robust set of APIs, libraries, and tools to harness the hardware acceleration capabilities of NVIDIA BlueField DPUs.

The accelerated F5 CNFs on NVIDIA BlueField-3 DPUs are designed to free up CPU resources, allowing other applications to run on the same hardware. General availability for F5 BIG-IP Next Cloud-Native Network Functions deployed on NVIDIA BlueField-3 DPUs is anticipated for June 2025.

As the demand for edge AI continues to grow, this collaboration between F5 and NVIDIA represents a significant step forward in building AI-ready distributed infrastructure, empowering businesses to leverage AI and maintain a competitive edge in our increasingly connected world.

TheOutpost.ai
