Curated by THEOUTPOST
On Wed, 5 Mar, 4:04 PM UTC
2 Sources
[1]
F5 Leverages NVIDIA BlueField-3 DPUs to Accelerate Edge AI for Service Providers
Edge deployments open up key opportunities for service providers, including distributed N6-LAN capabilities for UPFs and edge security services to support Distributed Access Architecture (DAA) and Private 5G. In addition, AI-RAN is gaining momentum, with SoftBank recently showcasing its production environment with NVIDIA.

Unlocking the potential of AI-RAN with NVIDIA and F5

AI-RAN seeks to transform mobile networks into multi-purpose infrastructures that maximize resource utilization, create new revenue streams through hosted AI services, and improve cost efficiency. By enabling mobile providers to support distributed AI computing with reliable, secure, and optimized connectivity, AI-RAN strengthens edge infrastructure capabilities by taking advantage of otherwise dormant processing power. Together, BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs will accelerate AI-RAN deployments with streamlined traffic management for both AI and RAN workloads, as well as enhanced firewall and DDoS protections. Multi-tenancy and tenant isolation for workloads tied to essential capabilities will be natively integrated into the solution. With F5 and NVIDIA, mobile providers can leverage the same RAN compute infrastructure to power AI offerings alongside existing RAN services, driving significant cost savings and revenue potential through enhanced user offerings.
[2]
F5 Accelerates AI at the Edge for Service Providers with NVIDIA BlueField-3 DPUs
F5 BIG-IP Next Cloud-Native Network Functions deployed on NVIDIA BlueField-3 DPUs turbocharge data management and security, unlocking new edge AI innovations and driving the future of AI-RAN

F5 (NASDAQ: FFIV) today announced BIG-IP Next Cloud-Native Network Functions (CNFs) deployed on NVIDIA BlueField-3 DPUs, deepening the companies' technology collaboration. This solution offers F5's proven network infrastructure capabilities, such as edge firewall, DNS, and DDoS protection, as lightweight cloud-native functions accelerated with NVIDIA BlueField-3 DPUs to deliver optimized performance in Kubernetes environments and support emerging edge AI use cases. The F5 Application Delivery and Security Platform powers a majority of the world's Tier-1 5G, mobile, and fixed-line telco networks.

Service providers recognize the challenges of scaling AI applications across distributed environments, particularly as legacy infrastructures in the network core often lack the processing power required to make AI inferencing practical. F5 CNFs running on NVIDIA DPUs can now be embedded in edge and far-edge infrastructures to optimize computing resources, dramatically reduce power consumption per Gbps, and limit overall operating expenses. Further, utilizing edge environments to add functionality and AI capabilities to subscriber services comes with added security requirements, which F5 and NVIDIA BlueField technologies address alongside advanced traffic management while minimizing latency.

Deploying CNFs at the edge puts applications closer to users and their data, promoting data sovereignty, improving user experience, and reducing costs related to power, space, and cooling. Low latency remains essential for AI applications and capabilities such as:
- Immediate decision making, supporting autonomous vehicles and fraud detection.
- Real-time user interaction, including NLP tools and AR/VR experiences.
- Continuous monitoring and response, required for healthcare devices and manufacturing robotics.

Bringing CNFs to BlueField-3 DPUs expands on F5's previously introduced BIG-IP Next for Kubernetes deployed on NVIDIA DPUs. F5 continues to leverage the NVIDIA DOCA software framework to seamlessly integrate its solutions with NVIDIA BlueField DPUs. This comprehensive development framework provides F5 with a robust set of APIs, libraries, and tools to harness the hardware acceleration capabilities of NVIDIA BlueField DPUs. By utilizing DOCA, F5 achieves rapid integration and high performance across various networking and security offloads while maintaining forward and backward compatibility across generations of BlueField DPUs. Further, accelerating F5 CNFs with NVIDIA BlueField-3 frees up CPU resources that can be used to run other applications.

Edge deployments open up key opportunities for service providers, including distributed N6-LAN capabilities for UPFs and edge security services to support Distributed Access Architecture (DAA) and Private 5G. In addition, AI-RAN is gaining momentum, with SoftBank recently showcasing its production environment with NVIDIA.

Unlocking the potential of AI-RAN with NVIDIA and F5

AI-RAN seeks to transform mobile networks into multi-purpose infrastructures that maximize resource utilization, create new revenue streams through hosted AI services, and improve cost efficiency.
By enabling mobile providers to support distributed AI computing with reliable, secure, and optimized connectivity, AI-RAN strengthens edge infrastructure capabilities by taking advantage of otherwise dormant processing power. Together, BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs will accelerate AI-RAN deployments with streamlined traffic management for both AI and RAN workloads, as well as enhanced firewall and DDoS protections. Multi-tenancy and tenant isolation for workloads tied to essential capabilities will be natively integrated into the solution. With F5 and NVIDIA, mobile providers can leverage the same RAN compute infrastructure to power AI offerings alongside existing RAN services, driving significant cost savings and revenue potential through enhanced user offerings.

Supporting Quotes

"Customers are seeking cost-effective ways to bring the benefits of unified application delivery and security to emerging AI infrastructures, driving continued collaboration between F5 and NVIDIA," said Ahmed Guetari, VP and GM, Service Provider at F5. "In particular, service providers see the edge as an area of rising interest, in that data ingest and inferencing no longer must take place at a centralized location or cloud environment, opening up myriad options to add intelligence and automation capabilities to networks while enhancing performance for users."

"As demand for AI inferencing at the edge takes center stage, building an AI-ready distributed infrastructure is a key opportunity for telecom providers to create value for their customers," said Ash Bhalgat, Senior Director of AI Networking and Security Solutions, Ecosystem and Marketing at NVIDIA. "F5's cloud-native functions, accelerated with NVIDIA's BlueField-3 DPUs, create a powerful solution for bringing AI closer to users while offering unparalleled performance, security, and efficiency for service providers. We're not just meeting edge AI demands; we're empowering businesses to leverage AI to maintain a competitive edge in our connected world."

Availability

General availability for F5 BIG-IP Next Cloud-Native Network Functions deployed on NVIDIA BlueField-3 DPUs is anticipated for June 2025.

About F5

F5 is a multicloud application security and delivery company committed to bringing a better digital world to life. F5 partners with the world's largest, most advanced organizations to secure every app -- on premises, in the cloud, or at the edge. F5 enables businesses to continuously stay ahead of threats while delivering exceptional, secure digital experiences for their customers. For more information, go to f5.com. (NASDAQ: FFIV)
F5 announces the deployment of BIG-IP Next Cloud-Native Network Functions on NVIDIA BlueField-3 DPUs, enhancing edge AI capabilities for service providers and supporting the future of AI-RAN.
F5, a leading multicloud application security and delivery company, has announced a significant collaboration with NVIDIA to accelerate AI at the edge for service providers. This partnership involves the deployment of F5's BIG-IP Next Cloud-Native Network Functions (CNFs) on NVIDIA BlueField-3 Data Processing Units (DPUs), marking a substantial advancement in edge AI technology [1][2].
The collaboration aims to address the growing demand for AI inferencing at the edge, a critical area of interest for telecom providers. By leveraging F5's proven network infrastructure capabilities and NVIDIA's advanced DPU technology, the solution offers optimized performance in Kubernetes environments and supports emerging edge AI use cases [2].
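Neither announcement includes deployment details, but as a rough sketch of how a cloud-native network function might be scheduled onto Kubernetes nodes that expose BlueField-3 DPUs through a device plugin, the example below uses the official Python Kubernetes client. The container image, the extended resource name ("nvidia.com/bf3-dpu"), and the node label are hypothetical placeholders, not F5 or NVIDIA identifiers.

```python
# Illustrative sketch only: how a CNF Pod *might* request a DPU resource
# exposed through a Kubernetes device plugin. The image, resource name
# ("nvidia.com/bf3-dpu"), and node label are hypothetical placeholders,
# not actual F5 or NVIDIA identifiers.
from kubernetes import client


def build_cnf_pod() -> client.V1Pod:
    container = client.V1Container(
        name="edge-firewall-cnf",  # hypothetical CNF container name
        image="registry.example.com/cnf/edge-firewall:1.0",  # placeholder image
        resources=client.V1ResourceRequirements(
            # Hypothetical extended resource advertised by a DPU device plugin.
            limits={"nvidia.com/bf3-dpu": "1", "cpu": "2", "memory": "4Gi"},
        ),
    )
    return client.V1Pod(
        api_version="v1",
        kind="Pod",
        metadata=client.V1ObjectMeta(name="edge-firewall-cnf"),
        spec=client.V1PodSpec(
            containers=[container],
            # Hypothetical label marking worker nodes equipped with BlueField-3 DPUs.
            node_selector={"example.com/dpu": "bluefield-3"},
        ),
    )


if __name__ == "__main__":
    # Render the manifest as a plain dict; applying it would require cluster access.
    print(client.ApiClient().sanitize_for_serialization(build_cnf_pod()))
```

In a real deployment, the resource and label names would come from whatever device plugin and node-labeling scheme the operator uses, and a CNF would typically be managed by a Deployment or an operator rather than a bare Pod.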
Key features of this collaboration include edge firewall, DNS, and DDoS protection delivered as lightweight cloud-native functions, hardware acceleration on BlueField-3 DPUs, reduced power consumption per Gbps, and lower overall operating expenses for edge and far-edge deployments [2].
A significant focus of this partnership is the advancement of AI-RAN (AI-powered Radio Access Network) technology. AI-RAN seeks to transform mobile networks into multi-purpose infrastructures that maximize resource utilization and create new revenue streams through hosted AI services [1][2].
The integration of BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs is expected to streamline traffic management for both AI and RAN workloads, provide enhanced firewall and DDoS protections, and natively integrate multi-tenancy and tenant isolation for workloads tied to essential capabilities [1][2].
This collaboration opens up key opportunities for service providers, including distributed N6-LAN capabilities for UPFs, edge security services to support Distributed Access Architecture (DAA) and Private 5G, and emerging AI-RAN deployments [1][2].
Ahmed Guetari, VP and GM of Service Provider at F5, emphasized the importance of this collaboration, stating, "Customers are seeking cost-effective ways to bring the benefits of unified application delivery and security to emerging AI infrastructures" [2].
F5 continues to leverage the NVIDIA DOCA software framework for seamless integration with NVIDIA BlueField DPUs. The framework provides F5 with a robust set of APIs, libraries, and tools to harness the hardware acceleration capabilities of BlueField DPUs [2].
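DOCA itself is a C-based SDK and neither source shows its APIs; purely as a conceptual sketch of the match-action offload pattern such frameworks expose (rules programmed into DPU hardware for the fast path, with everything else punted to software), consider the toy model below. The FlowKey and DpuFlowTable names and methods are invented for illustration and do not correspond to DOCA functions.

```python
# Toy model of a match-action offload table. This is NOT the DOCA API;
# class and method names here are invented purely for illustration.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class FlowKey:
    dst_ip: str
    dst_port: int
    proto: str  # e.g. "tcp" or "udp"


@dataclass
class DpuFlowTable:
    """Hypothetical table of rules 'installed' in DPU hardware."""
    rules: dict = field(default_factory=dict)  # FlowKey -> action string

    def offload(self, key: FlowKey, action: str) -> None:
        # A real DPU framework would program hardware match-action entries
        # here; this sketch just records the rule.
        self.rules[key] = action

    def lookup(self, key: FlowKey) -> str:
        # Traffic matching an offloaded rule stays on the hardware fast path;
        # everything else is punted to host software for inspection.
        return self.rules.get(key, "punt-to-host")


table = DpuFlowTable()
table.offload(FlowKey("10.0.0.5", 53, "udp"), "allow")        # DNS fast path
table.offload(FlowKey("10.0.0.9", 443, "tcp"), "rate-limit")  # DDoS mitigation
print(table.lookup(FlowKey("10.0.0.5", 53, "udp")))  # -> allow (hardware path)
print(table.lookup(FlowKey("10.0.0.7", 22, "tcp")))  # -> punt-to-host (software path)
```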
The accelerated F5 CNFs on NVIDIA BlueField-3 DPUs are designed to free up CPU resources, allowing for the execution of other applications. General availability for F5 BIG-IP Next Cloud-Native Network Functions deployed on NVIDIA BlueField-3 DPUs is anticipated for June 2025 [2].
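Neither company publishes throughput or wattage figures in these announcements; as a back-of-the-envelope illustration of the "power per Gbps" metric the press release cites, the numbers below are invented placeholders, not measured F5 or NVIDIA results.

```python
# Back-of-the-envelope "power per Gbps" comparison. All numbers are
# invented placeholders, not measured F5/NVIDIA figures.
def watts_per_gbps(total_watts: float, throughput_gbps: float) -> float:
    return total_watts / throughput_gbps


# Hypothetical host-CPU-only data path: many cores busy with packet processing.
cpu_only = watts_per_gbps(total_watts=400.0, throughput_gbps=80.0)

# Hypothetical DPU-offloaded data path: same traffic, packet processing handled
# on the BlueField-3, freeing host cores for other applications.
dpu_offload = watts_per_gbps(total_watts=220.0, throughput_gbps=80.0)

print(f"CPU-only:    {cpu_only:.2f} W/Gbps")
print(f"DPU-offload: {dpu_offload:.2f} W/Gbps")
print(f"Reduction:   {100 * (1 - dpu_offload / cpu_only):.0f}%")
```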
As the demand for edge AI continues to grow, this collaboration between F5 and NVIDIA represents a significant step forward in building AI-ready distributed infrastructure, empowering businesses to leverage AI and maintain a competitive edge in our increasingly connected world.