STOCK TITAN

F5 Accelerates AI at the Edge for Service Providers with NVIDIA BlueField-3 DPUs

Rhea-AI Impact: Low
Rhea-AI Sentiment: Positive
Tags: AI

F5 (NASDAQ: FFIV) has announced the deployment of BIG-IP Next Cloud-Native Network Functions (CNFs) on NVIDIA BlueField-3 DPUs, deepening its technology partnership with NVIDIA. The solution delivers edge firewall, DNS, and DDoS protection as lightweight cloud-native functions optimized for Kubernetes environments.

The collaboration aims to address AI application scaling challenges in distributed environments, particularly at network edges. Key benefits include:

  • Optimized computing resources
  • Reduced power consumption per Gbps
  • Lower operating expenses
  • Enhanced security with minimal latency

The solution supports critical AI applications requiring low latency, such as autonomous vehicles, fraud detection, NLP tools, and AR/VR experiences. It expands on F5's previously introduced BIG-IP Next for Kubernetes on NVIDIA DPUs and uses the NVIDIA DOCA software framework for integration with BlueField DPUs.

General availability is expected in June 2025.

Positive
  • Partnership with NVIDIA enhances edge AI capabilities
  • Solution reduces power consumption and operating expenses
  • Enables new revenue streams through hosted AI services
  • Optimizes existing infrastructure for dual AI and RAN workloads
Negative
  • Product not available until June 2025

Insights

F5's partnership with NVIDIA to deploy BIG-IP Next Cloud-Native Network Functions on BlueField-3 DPUs represents a calculated strategic move to capture the emerging edge AI infrastructure market. This collaboration positions F5 to extend its networking and security expertise into a high-growth segment where processing needs are shifting toward distributed architectures.

The technical implementation leverages F5's dominant position in telecom networks, where its platform already powers most Tier-1 providers, while addressing critical pain points for service providers: resource optimization, power efficiency, and security at the edge. By embedding these functions directly on NVIDIA's specialized DPUs, F5 enables telecom operators to transform passive infrastructure into revenue-generating AI compute platforms.

What's technically significant is how this solution tackles the fundamental architectural challenge of bringing AI capabilities closer to data sources without compromising on security or performance. The AI-RAN initiative particularly demonstrates how F5 is helping convert traditional radio access networks into multi-functional compute environments that can simultaneously support connectivity and AI processing workloads.

While the June 2025 availability timeline indicates this won't impact immediate quarterly results, the announcement signals F5's strategic pivot toward higher-value, software-defined networking solutions that align with the distributed infrastructure requirements of AI deployments. This represents F5's evolution from traditional networking vendor to a strategic enabler of next-generation AI infrastructure.

This partnership with NVIDIA significantly strengthens F5's competitive positioning in the rapidly evolving edge AI infrastructure market. For F5 investors, the collaboration represents potential new revenue streams beyond the company's traditional application delivery and security business, expanding its addressable market into AI-enhanced networking infrastructure.

The timing is strategic—while general availability isn't until June 2025, F5 is establishing early positioning in the edge AI market, particularly with telecom service providers. These customers typically have lengthy procurement and implementation cycles, making early market positioning important despite the extended revenue timeline.

The solution directly addresses mounting cost pressures facing telecom operators by optimizing resource utilization across distributed environments. By enabling operators to monetize existing infrastructure for AI applications, F5 creates a compelling value proposition that could drive adoption among its established customer base, where it already holds significant market share among Tier-1 providers.

From a business model perspective, this cloud-native approach aligns with F5's ongoing transition from hardware-centric sales to more predictable software and subscription revenue. The deployment model on NVIDIA hardware suggests a potentially favorable margin profile compared to F5's traditional appliance business, while strengthening their recurring revenue streams.

This partnership demonstrates F5's strategic evolution toward higher-growth AI infrastructure markets while leveraging its core competencies in networking and security, a prudent approach that should resonate with the company's enterprise and service provider customer base.

F5 BIG-IP Next Cloud-Native Network Functions deployed on NVIDIA BlueField-3 DPUs turbocharge data management and security, unlocking new edge AI innovations and driving the future of AI-RAN

BARCELONA, Spain--(BUSINESS WIRE)-- MOBILE WORLD CONGRESS--F5 (NASDAQ: FFIV) today announced BIG-IP Next Cloud-Native Network Functions (CNFs) deployed on NVIDIA BlueField-3 DPUs, deepening the companies’ technology collaboration. This solution offers F5’s proven network infrastructure capabilities, such as edge firewall, DNS, and DDoS protection, as lightweight cloud-native functions accelerated with NVIDIA BlueField-3 DPUs to deliver optimized performance in Kubernetes environments and support emerging edge AI use cases.

The F5 Application Delivery and Security Platform powers a majority of the world’s Tier-1 5G, mobile, and fixed line telco networks. Service providers recognize the challenges of scaling AI applications across distributed environments, particularly as legacy infrastructures in the network core often lack the processing power required to make AI inferencing practical.

F5 CNFs running on NVIDIA DPUs can now be embedded in edge and far edge infrastructures to optimize computing resources, dramatically reduce power consumption per Gbps, and limit overall operating expenses. Using edge environments to add functionality and AI capabilities to subscriber services also brings added security requirements, which F5 and NVIDIA BlueField technologies address alongside advanced traffic management while minimizing latency.
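
The announcement does not publish deployment artifacts, so the following is only a minimal sketch of how a cloud-native network function could be pinned to DPU-equipped edge nodes in a Kubernetes cluster using the official Python client. The node label, the extended resource name "nvidia.com/bf3", the image, and the namespace are hypothetical placeholders, not F5 or NVIDIA specifics.

```python
# Minimal illustrative sketch, not F5's actual deployment artifacts.
# Assumptions (not from the announcement): the node label
# "feature.example.com/bluefield3", the extended resource name "nvidia.com/bf3"
# (whatever the operator's DPU device plugin actually advertises), the image
# name, and the namespace are hypothetical placeholders.
from kubernetes import client, config


def deploy_edge_cnf(namespace: str = "cnf-edge") -> None:
    """Create a DaemonSet that lands one CNF pod on every DPU-equipped edge node."""
    config.load_kube_config()  # use load_incluster_config() when running in-cluster
    apps = client.AppsV1Api()

    cnf_container = client.V1Container(
        name="bigip-next-cnf",
        image="registry.example.com/f5/bigip-next-cnf:edge",  # placeholder image
        resources=client.V1ResourceRequirements(
            # Request one DPU per pod via a hypothetical extended resource.
            limits={"nvidia.com/bf3": "1"},
        ),
    )

    daemon_set = client.V1DaemonSet(
        metadata=client.V1ObjectMeta(name="bigip-next-cnf", namespace=namespace),
        spec=client.V1DaemonSetSpec(
            selector=client.V1LabelSelector(match_labels={"app": "bigip-next-cnf"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "bigip-next-cnf"}),
                spec=client.V1PodSpec(
                    # Only schedule onto edge nodes labeled as carrying a BlueField-3 DPU.
                    node_selector={"feature.example.com/bluefield3": "true"},
                    containers=[cnf_container],
                ),
            ),
        ),
    )

    apps.create_namespaced_daemon_set(namespace=namespace, body=daemon_set)


if __name__ == "__main__":
    deploy_edge_cnf()
```

A DaemonSet is used here simply because edge network functions are typically run once per qualifying node; the actual packaging and scheduling model for BIG-IP Next CNFs is not described in the release.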

Deploying CNFs at the edge puts applications closer to users and their data, promoting data sovereignty, improving user experience, and reducing costs related to power, space, and cooling. Low latency remains essential for AI applications and capabilities such as:

  • Immediate decision making, supporting autonomous vehicles and fraud detection.
  • Real-time user interaction, including NLP tools and AR/VR experiences.
  • Continuous monitoring and response, required for healthcare devices and manufacturing robotics.

Including CNFs on BlueField-3 DPUs expands on F5’s previously introduced BIG-IP Next for Kubernetes deployed on NVIDIA DPUs. F5 continues to leverage the NVIDIA DOCA software framework to seamlessly integrate its solutions with NVIDIA BlueField DPUs. This comprehensive development framework provides F5 with a robust set of APIs, libraries, and tools to harness the hardware acceleration capabilities of NVIDIA BlueField DPUs. By utilizing DOCA, F5 achieves rapid integration and high performance across various networking and security offloads while maintaining forward and backward compatibility across generations of BlueField DPUs. Further, accelerating F5 CNFs with NVIDIA BlueField-3 frees up CPU resources which can be used to run other applications.

Edge deployments open up key opportunities for service providers, including distributed N6-LAN capabilities for UPFs, and edge security services to support Distributed Access Architecture (DAA) and Private 5G. In addition, AI-RAN is gaining momentum, with SoftBank recently showcasing their production environment with NVIDIA.

Unlocking the potential of AI-RAN with NVIDIA and F5

AI-RAN seeks to transform mobile networks into multi-purpose infrastructures that maximize resource utilization, create new revenue streams through hosted AI services, and improve cost efficiency. Enabling mobile providers to support distributed AI computing with reliable, secure, and optimized connectivity, AI-RAN strengthens edge infrastructure capabilities by taking advantage of otherwise dormant processing power. Together, BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs will accelerate AI-RAN deployments with streamlined traffic management for both AI and RAN workloads, as well as provide enhanced firewall and DDoS protections. Multi-tenancy and tenant isolation for workloads tied to essential capabilities will be natively integrated into the solution. With F5 and NVIDIA, mobile providers can intelligently leverage the same RAN compute infrastructure to power AI offerings alongside existing RAN services, driving significant cost savings and revenue potential through enhanced user offerings.
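
The release states that multi-tenancy and tenant isolation will be natively integrated but does not describe the mechanism. As an illustration of the general concept only, the sketch below shows one generic Kubernetes pattern, a per-tenant default-deny ingress NetworkPolicy created with the official Python client; the policy name and reliance on the standard namespace label are assumptions, not part of the announced solution.

```python
# Generic Kubernetes tenant-isolation pattern, shown only as an illustration;
# the F5/NVIDIA solution's actual isolation mechanism is not documented here.
from kubernetes import client, config


def isolate_tenant_namespace(namespace: str) -> None:
    """Restrict ingress so pods in a tenant namespace only accept same-namespace traffic."""
    config.load_kube_config()
    net = client.NetworkingV1Api()

    policy = client.V1NetworkPolicy(
        metadata=client.V1ObjectMeta(
            name="deny-cross-tenant-ingress", namespace=namespace
        ),
        spec=client.V1NetworkPolicySpec(
            pod_selector=client.V1LabelSelector(),  # empty selector: all pods in the namespace
            policy_types=["Ingress"],
            ingress=[
                client.V1NetworkPolicyIngressRule(
                    _from=[
                        client.V1NetworkPolicyPeer(
                            # kubernetes.io/metadata.name is set automatically on namespaces
                            # (Kubernetes 1.21+), so this matches only the tenant's own namespace.
                            namespace_selector=client.V1LabelSelector(
                                match_labels={"kubernetes.io/metadata.name": namespace}
                            )
                        )
                    ]
                )
            ],
        ),
    )

    net.create_namespaced_network_policy(namespace=namespace, body=policy)


if __name__ == "__main__":
    isolate_tenant_namespace("tenant-a")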

Supporting Quotes

“Customers are seeking cost-effective ways to bring the benefits of unified application delivery and security to emerging AI infrastructures, driving continued collaboration between F5 and NVIDIA,” said Ahmed Guetari, VP and GM, Service Provider at F5. “In particular, service providers see the edge as an area of rising interest, in that data ingest and inferencing no longer must take place at a centralized location or cloud environment, opening up myriad options to add intelligence and automation capabilities to networks while enhancing performance for users.”

“As demand for AI inferencing at the edge takes center stage, building an AI-ready distributed infrastructure is a key opportunity for telecom providers to create value for their customers,” said Ash Bhalgat, Senior Director of AI Networking and Security Solutions, Ecosystem and Marketing at NVIDIA. “F5’s cloud-native functions, accelerated with NVIDIA’s BlueField-3 DPUs, create a powerful solution for bringing AI closer to users while offering unparalleled performance, security, and efficiency for service providers. We're not just meeting edge AI demands; we're empowering businesses to leverage AI to maintain a competitive edge in our connected world.”

Availability

General availability for F5 BIG-IP Next Cloud-Native Network Functions deployed on NVIDIA BlueField-3 DPUs is anticipated for June 2025. For additional information, please visit F5 at the NVIDIA GTC event taking place March 17–21 in San Jose, California, read the companion blog, and contact F5.

About F5
F5 is a multicloud application security and delivery company committed to bringing a better digital world to life. F5 partners with the world’s largest, most advanced organizations to secure every app—on premises, in the cloud, or at the edge. F5 enables businesses to continuously stay ahead of threats while delivering exceptional, secure digital experiences for their customers. For more information, go to f5.com. (NASDAQ: FFIV)

You can also follow @F5 on X or visit us on LinkedIn and Facebook to learn about F5, its partners, and technologies. F5, BIG-IP, and BIG-IP Next are trademarks, service marks, or tradenames of F5, Inc., in the U.S. and other countries. All other product and company names herein may be trademarks of their respective owners. The use of the terms “partner,” “partners,” “partnership,” or “partnering” in this press release does not imply that a joint venture exists between F5 and any other company.

This press release contains forward-looking statements including, among other things, statements regarding potential benefits and availability of F5 BIG-IP Next Cloud-Native Network Functions running on NVIDIA BlueField-3 DPUs. Each customer’s unique environment, objectives, and constraints could impact potential benefits, while surrounding factors could affect availability timing.

Jenna Becker

F5

(415) 857-2864

j.becker@f5.com

Holly Lancaster

WE Communications

(415) 547-7054

hluka@we-worldwide.com

Source: F5, Inc.

FAQ

When will F5's BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs be available?

General availability is anticipated for June 2025.

What are the main benefits of F5's (FFIV) new edge AI solution with NVIDIA?

The solution optimizes computing resources, reduces power consumption per Gbps, lowers operating expenses, and provides enhanced security with minimal latency.

How does F5 (FFIV) integrate with NVIDIA BlueField DPUs?

F5 uses NVIDIA's DOCA software framework, providing APIs, libraries, and tools to integrate with BlueField DPUs for hardware acceleration.

What key AI applications will F5's (FFIV) edge solution support?

The solution supports autonomous vehicles, fraud detection, NLP tools, AR/VR experiences, healthcare devices, and manufacturing robotics.

What security features does F5's (FFIV) new edge solution provide?

The solution includes edge firewall, DNS, and DDoS protection, along with multi-tenancy and tenant isolation capabilities.
