Aria Networks raises $125M to build AI-native networking for the AI factory era


Aria Networks, a Palo Alto startup, has secured $125 million in its first funding round to develop AI-native networking infrastructure. The company unveiled Deep Networking, featuring 800G and 1.6T Ethernet switches designed to optimize token efficiency in AI data centers. With customer orders already in hand, Aria aims to make networking a performance differentiator rather than background plumbing.

Aria Networks Secures Major Funding for AI Infrastructure

Aria Networks has raised $125 million in its first funding round, marking a significant entry into the competitive AI networking space [1]. The Palo Alto-based startup, founded in 2025, secured backing from Sutter Hill Ventures, Atreides Management, Valor Equity Partners, and Eclipse Ventures [2]. Gavin Baker, managing partner at Atreides Management, has joined Aria's board alongside Stefan Dyckerhoff of Sutter Hill Ventures and the company's founders [2].

Source: ET


AI-Native Networking Platform Addresses Critical Infrastructure Needs

Concurrent with the funding announcement, Aria Networks unveiled what it calls the world's first AI-native networking platform, built specifically for the AI factory era [3]. CEO and co-founder Mansour Karam, who previously founded intent-based networking provider Apstra before its acquisition by Juniper Networks in 2021, emphasized that networking is no longer background plumbing but a critical performance differentiator for AI infrastructure at scale [3]. The platform centers on Deep Networking, a collection of hardware and software components designed to optimize token efficiency, which measures how much text an AI model can process during a single interaction and directly impacts data center costs.

Advanced Switches for Artificial Intelligence Clusters

The Deep Networking portfolio features switches based on 800G and 1.6T Ethernet protocols, the latest standards underpinning most AI clusters [1]. Aria offers a 64-port 1.6T switch in both air-cooled and liquid-cooled editions, with the 1.6T version providing twice the throughput of 800G through enhanced SerDes components that consolidate parallel data streams [1]. The company's 800G Ethernet lineup includes 64-port and 128-port devices capable of managing up to 51.2 terabits and 102.4 terabits of traffic per second, respectively [1]. These switches incorporate packet buffers to temporarily store delayed data traffic during port congestion, eliminating the need to discard traffic and avoiding the associated inefficiencies [1].
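The capacity figures above follow directly from ports times per-port line rate. A minimal sketch of that arithmetic (the function name is illustrative, not part of any Aria product):

```python
# Aggregate capacity of a non-blocking Ethernet switch is simply
# the port count multiplied by the per-port line rate. The figures
# below reproduce the article's numbers; aggregate_tbps is an
# illustrative helper, not vendor API.

def aggregate_tbps(ports: int, port_gbps: int) -> float:
    """Total switching capacity in terabits per second."""
    return ports * port_gbps / 1000

print(aggregate_tbps(64, 800))    # 51.2 Tbps (64-port 800G)
print(aggregate_tbps(128, 800))   # 102.4 Tbps (128-port 800G)
print(aggregate_tbps(64, 1600))   # 102.4 Tbps (64-port 1.6T)
```

This also shows why a 64-port 1.6T switch matches the aggregate capacity of a 128-port 800G device while halving the port count.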

Source: SiliconANGLE


Deep Telemetry and AI-Driven Network Optimization

Aria's switches run on a customized version of SONiC, an open-source, Linux-based network operating system whose containerized design allows administrators to update malfunctioning modules without taking others offline [1]. The platform continuously collects fine-grained, end-to-end telemetry data that is 10 to 10,000 times more detailed than traditional tools, gathering information across switches, transceivers, and hosts [3]. Unlike previous AIOps models that dumped telemetry into data lakes for manual interpretation, Aria's system automatically extracts actionable insights and takes real-time intelligent action to optimize AI cluster performance [3]. AI agents automate critical tasks including load balancing and congestion management, while administrators can investigate malfunctions through natural language queries [1].
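The closed loop described above turns raw counters into actions rather than parking them in a data lake. A hedged sketch of the first step of such a loop, deriving a congestion signal from per-port utilization (the threshold, field names, and schema are illustrative assumptions, not Aria's actual telemetry format):

```python
# Hypothetical first stage of a telemetry-driven control loop:
# flag ports whose transmit utilization nears line rate, so an
# agent can rebalance flows before the packet buffers overflow.
# All field names and the threshold are illustrative assumptions.

CONGESTION_THRESHOLD = 0.9  # fraction of line rate

def congested_ports(samples: list[dict]) -> list[str]:
    """Return the names of ports exceeding the utilization threshold."""
    return [
        s["port"]
        for s in samples
        if s["tx_bps"] / s["line_rate_bps"] > CONGESTION_THRESHOLD
    ]

telemetry = [
    {"port": "eth1", "tx_bps": 790e9, "line_rate_bps": 800e9},
    {"port": "eth2", "tx_bps": 300e9, "line_rate_bps": 800e9},
]
print(congested_ports(telemetry))  # ['eth1']
```

A production system would of course operate on streaming counters and feed the signal to a rebalancing agent rather than printing it.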

Hardware Flexibility and Market Strategy

Aria's network architecture works with any AI chip on the market, including those from Nvidia and Google, giving companies the flexibility to upgrade or switch hardware without overhauling their entire network infrastructure [2]. The company is collaborating with AI-focused system integrators, neocloud builders, and managed AI infrastructure providers to deliver complete AI data centers, treating networking as a first-class design element rather than an afterthought [3]. Karam indicated that white-glove deployment will be central to the partner model, with Aria embedding field deployment engineers directly into customer environments, an opportunity partners can eventually adopt as a value-added service [3]. The startup has already secured customer orders and is actively deploying its solutions [2].

Source: CRN


TheOutpost.ai