3 Sources
[1]
Data center switch maker Aria Networks raises $125M - SiliconANGLE
Aria Networks Inc., a startup that makes switches for artificial intelligence clusters, has closed a $125 million funding round. The capital was provided by Sutter Hill Ventures, Atreides Management, Valor Equity Partners and Eclipse Ventures. Aria disclosed the raise today in conjunction with the debut of its product portfolio. Deep Networking, as it's called, is a collection of hardware and software components for building data center networks.

The centerpiece of Deep Networking is a suite of switches based on Ethernet, the data transfer protocol that underpins most AI clusters. The switches use the latest 800G and 1.6T versions of the protocol. The former edition can process 800 gigabits of data traffic per second per port, while the latter can manage up to 1.6 terabits. The primary difference between the two Ethernet implementations has to do with a component called the SerDes. Processors represent information as multiple parallel streams of data. Before a chip can transmit information, it has to consolidate it into a single data stream. That task is the responsibility of the SerDes. The reason 1.6T Ethernet is faster than the 800G edition is that its SerDes provide twice the throughput.

Aria sells a 64-port 1.6T switch that comes in air- and liquid-cooled editions. When ports are congested, a component called a packet buffer can temporarily store the delayed data traffic that is waiting to be processed. That removes the need to discard the traffic and thereby avoids the associated inefficiencies. Aria also offers switches based on 800G Ethernet: 64- and 128-port devices that can manage up to 51.2 terabits and 102.4 terabits of traffic per second, respectively.

The company's switches are powered by a custom version of SONiC, an open-source network operating system based on Linux. It has a containerized design that enables administrators to update a malfunctioning module without taking the others offline.
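As a sanity check on those figures, aggregate switch capacity is simply the per-port line rate multiplied by the port count. A minimal sketch (plain arithmetic, not an Aria tool):

```python
def aggregate_tbps(ports: int, gbps_per_port: int) -> float:
    """Aggregate switch capacity in terabits per second."""
    return ports * gbps_per_port / 1000

# Figures quoted in the article:
print(aggregate_tbps(64, 800))    # 51.2  -> the 64-port 800G switch
print(aggregate_tbps(128, 800))   # 102.4 -> the 128-port 800G switch
print(aggregate_tbps(64, 1600))   # 102.4 -> the 64-port 1.6T switch
```

Note that the 64-port 1.6T switch matches the 128-port 800G switch in total capacity, consistent with the SerDes doubling described above.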
There's also support for a technology called RoCE, which enables data traffic in AI clusters to bypass certain hardware components on its way to its destination, boosting performance. Aria has equipped its software stack with a telemetry engine that collects technical information about the underlying hardware. According to the company, the data is up to 10,000 times more detailed than the telemetry gathered by other tools. AI agents use the telemetry to optimize customer networks. According to Aria, the agents automate tasks such as load balancing and congestion management. When the software spots a potential technical issue, it alerts a human. Administrators can investigate malfunctions by entering natural language questions into a chat box.
[2]
AI networking firm Aria Networks raises $125 million in funding
Aria Networks said on Tuesday it has raised $125 million in its first funding round, as the startup seeks to develop its networking infrastructure for artificial intelligence. Aria's network is designed to work with any AI chip on the market, including those from Nvidia and Google, giving companies the flexibility to upgrade or switch hardware over time without overhauling their entire network infrastructure. Founded in 2025, Aria Networks is backed by Sutter Hill Ventures, Atreides Management, Valor Equity Partners, and Eclipse Ventures. Gavin Baker, managing partner at Atreides, has joined Aria's board, along with Stefan Dyckerhoff of Sutter Hill and Aria's founders. The company also announced the availability of what it calls the world's first AI-native network, built to help AI data centers run more efficiently and at lower cost. Its core focus is "token efficiency," a measure of data center costs. Tokens are a measure of how much text an AI model can process or remember during a single interaction; a token roughly corresponds to a short piece of text, such as part of a word. Palo Alto-based Aria said it already has customer orders in hand and is actively deploying.
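To make "token efficiency" concrete, here is a hypothetical cost-per-token calculation. All figures are invented for illustration; this is not Aria's formula, only the general relationship between infrastructure cost and tokens served:

```python
def cost_per_million_tokens(cluster_cost_per_hour: float,
                            tokens_per_second: float) -> float:
    """Infrastructure cost attributed to each million tokens served."""
    tokens_per_hour = tokens_per_second * 3600
    return cluster_cost_per_hour / tokens_per_hour * 1_000_000

# Invented example: a $500/hour cluster serving 200,000 tokens/s.
# Better network utilization raises tokens/s and so lowers cost per token.
print(round(cost_per_million_tokens(500, 200_000), 4))  # 0.6944
```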
[3]
Upstart Aria Networks Unveils AI‑Native Networking Platform For The AI Factory Era
Networking is no longer a background utility in the AI factory. It's a critical differentiator that can make or break AI infrastructure performance at scale, CEO and Co-Founder Mansour Karam tells CRN.

Aria Networks, a Palo Alto, Calif.-based startup that sits at the intersection of networking, distributed systems and AI, has announced the general availability of its AI-native networking offering, which the company said has been designed for the AI factory era. In the AI factory, networking is no longer "plumbing" in the background. It is a performance differentiator and an important component of delivering competitive AI infrastructure at scale, Karam told CRN.

Karam was the founder and CEO of intent-based networking provider Apstra before it was acquired by Juniper Networks in 2021. During his time at Juniper Networks, he got back into owning the entire stack -- both hardware and software. That's when he saw the "explosion of new demand" for data centers from customers because of AI. "We really needed to start the company from scratch, and that's where I left and started Aria," he said.

The one-year-old company, founded by technology veterans from the likes of Arista Networks, Cisco, Google, Meta and Pure Storage, got its start in 2025 delivering on its promise of creating "networks that think," Karam said. Aria's platform has been built to optimize token efficiency, which the company calls a core metric for AI factories that ties network performance to revenue, cost per token, and model FLOPs utilization (MFU). At the center of the platform is Deep Networking, which Karam called "a fundamentally different approach" to how AI networks operate. Aria's platform continuously collects fine-grained, end-to-end telemetry -- 10 to 10,000 times finer resolution than traditional tools -- across switches, transceivers, and hosts, the company said.
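Model FLOPs utilization, one of the metrics named above, is conventionally defined as achieved useful model FLOP/s divided by the cluster's theoretical peak. A small sketch with invented numbers:

```python
def mfu(achieved_flops_per_s: float, n_chips: int,
        peak_flops_per_chip: float) -> float:
    """Model FLOPs utilization: achieved throughput over theoretical peak."""
    return achieved_flops_per_s / (n_chips * peak_flops_per_chip)

# Invented example: 4.0e18 useful FLOP/s on 8,192 chips rated 1e15 FLOP/s each.
# Network stalls lower the achieved numerator, which is why the article
# ties network performance to MFU.
print(f"{mfu(4.0e18, 8192, 1e15):.1%}")  # 48.8%
```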
Karam said that prior AIOps models have failed because they dumped telemetry into data lakes and left customers to interpret and act on it manually, which he called an "unworkable" approach at AI scale. Instead, Aria's platform has been designed to close that loop by automatically extracting what matters from telemetry and taking real‑time, intelligent action to optimize AI cluster performance. Aria Networks is working with AI‑focused system integrators, neocloud builders, and managed AI infrastructure providers to deliver full AI factories, with networking treated as a first‑class design element rather than a bolt‑on, Karam said. "We're working with the right type of partners for this market. In some cases, it could be someone that is delivering the entire factory -- they're delivering the compute, the storage, and our job is to make sure we deliver the network. Then, there's the service integrators [whose] purpose is really to deliver on this AI opportunity," he said. White glove deployment will be a core part of the partner model, with Aria embedding field deployment engineers directly into customer environments. It's an opportunity the company says partners can eventually take on themselves as a value‑added service. Karam said that partners don't need decades of networking expertise to win AI factory deals, but they do need to recognize that network performance underpins AI performance. As enterprises move toward private AI data centers, the expectation is that partners who can architect, integrate, and optimize AI‑native networks will have a competitive advantage, he said. Aria Networks said that it has customer orders in hand and is actively deploying its offering today.
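The "closed loop" contrast Karam draws can be sketched as follows. Everything here (port names, thresholds, actions) is invented for illustration; the sketch only shows the shape of the idea, extracting a signal from raw telemetry and acting on it immediately rather than archiving it for manual review:

```python
def congestion_score(samples: list[float]) -> float:
    """Fraction of telemetry samples where buffer occupancy exceeded 80%."""
    return sum(s > 0.80 for s in samples) / len(samples)

def act(port: str, samples: list[float]) -> str:
    """Map a port's recent telemetry directly to an action."""
    score = congestion_score(samples)
    if score > 0.5:
        return f"rebalance:{port}"  # hot port: steer flows elsewhere
    if score > 0.1:
        return f"alert:{port}"      # borderline: flag a human, per the article
    return f"ok:{port}"

print(act("eth12", [0.95, 0.90, 0.85, 0.70]))  # rebalance:eth12
```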
Aria Networks, a Palo Alto startup, has secured $125 million in its first funding round to develop AI-native networking infrastructure. The company unveiled Deep Networking, featuring 800G and 1.6T Ethernet switches designed to optimize token efficiency in AI data centers. With customer orders already in hand, Aria aims to make networking a performance differentiator rather than background plumbing.
Aria Networks has raised $125 million in its first funding round, marking a significant entry into the competitive AI networking space [1]. The Palo Alto-based startup, founded in 2025, secured backing from Sutter Hill Ventures, Atreides Management, Valor Equity Partners, and Eclipse Ventures [2]. Gavin Baker, managing partner at Atreides Management, has joined Aria's board alongside Stefan Dyckerhoff of Sutter Hill Ventures and the company's founders [2].
Source: ET
Concurrent with the funding announcement, Aria Networks unveiled what it calls the world's first AI-native networking platform, built specifically for the AI factory era [3]. CEO and co-founder Mansour Karam, who previously founded intent-based networking provider Apstra before its acquisition by Juniper Networks in 2021, emphasized that networking is no longer background plumbing but a critical performance differentiator for AI infrastructure at scale [3]. The platform centers on Deep Networking, a collection of hardware and software components designed to optimize token efficiency, which measures how much text an AI model can process during a single interaction and directly impacts data center costs.

The Deep Networking portfolio features switches based on 800G and 1.6T Ethernet, the latest versions of the protocol underpinning most AI clusters [1]. Aria offers a 64-port 1.6T switch available in both air- and liquid-cooled editions, with the 1.6T version providing twice the throughput of 800G through faster SerDes components that consolidate parallel data streams into a single stream [1]. The company's 800G Ethernet lineup includes 64-port and 128-port devices capable of managing up to 51.2 terabits and 102.4 terabits of traffic per second, respectively [1]. These switches incorporate packet buffers that temporarily store delayed traffic during port congestion, eliminating the need to discard it and avoiding the associated inefficiencies [1].
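The packet-buffer behavior described above can be illustrated with a toy burst model (not Aria's implementation; the numbers are arbitrary):

```python
def burst(packets: int, line_rate: int, buffer_slots: int):
    """One traffic burst through a port: (forwarded, buffered, dropped)."""
    forwarded = min(packets, line_rate)    # what the port sends immediately
    backlog = packets - forwarded          # excess caused by congestion
    buffered = min(backlog, buffer_slots)  # held for later instead of lost
    dropped = backlog - buffered           # only buffer overflow is dropped
    return forwarded, buffered, dropped

print(burst(10, 6, 8))  # (6, 4, 0): the buffer absorbs the excess
print(burst(10, 6, 2))  # (6, 2, 2): too little buffer forces drops
```

Dropped packets must be retransmitted by the sender, which is the inefficiency the buffer avoids.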
Source: SiliconANGLE
Aria's switches run on a customized version of SONiC, an open-source network operating system based on Linux whose containerized design allows administrators to update a malfunctioning module without taking the others offline [1]. The platform continuously collects fine-grained, end-to-end telemetry across switches, transceivers, and hosts that is 10 to 10,000 times more detailed than that of traditional tools [3]. Unlike previous AIOps models that dumped telemetry into data lakes for manual interpretation, Aria's system automatically extracts actionable insights and takes real-time, intelligent action to optimize AI cluster performance [3]. AI agents automate critical tasks including load balancing and congestion management, while administrators can investigate malfunctions through natural language queries [1].

Aria's network architecture works with any AI chip on the market, including those from Nvidia and Google, giving companies flexibility to upgrade or switch hardware without overhauling their entire network infrastructure [2]. The company is collaborating with AI-focused system integrators, neocloud builders, and managed AI infrastructure providers to deliver complete AI data centers, treating networking as a first-class design element rather than an afterthought [3]. Karam indicated that white-glove deployment will be central to the partner model, with Aria embedding field deployment engineers directly into customer environments, an opportunity partners can eventually adopt as a value-added service [3]. The startup has already secured customer orders and is actively deploying its solutions [2].
Source: CRN
Summarized by
Navi
10 Mar 2026•Startups
