2 Sources
[1]
Lightspeed, Andreessen Back $4.2 Billion AI Data Center Supplier
Nexthop plans to use its new capital to keep hiring and investing in technology, including manufacturing and supply chains for its switches, as demand for AI computing currently far outpaces supply. Nexthop AI, an artificial intelligence infrastructure startup, has raised $500 million in a deal led by Lightspeed Venture Partners, part of a surge of investments in players building out AI capacity. Nexthop, which builds networking hardware and software for AI data centers, hit a $4.2 billion valuation with the new funding round, it plans to announce Tuesday. Andreessen Horowitz joined the funding, alongside all of the company's existing investors, including Altimeter Capital and Kleiner Perkins. Huge amounts of capital are slated to flow into data center construction in the coming years. Tech's biggest data center operators -- Alphabet Inc., Amazon.com Inc., Meta Platforms Inc. and Microsoft Corp. -- are expected to spend about $650 billion in 2026 alone on AI data centers and related projects. Nexthop, founded in 2024 by former Arista Networks Inc. Chief Operating Officer Anshul Sadana, is building systems to facilitate connections within and between data centers. Its tech is designed to require less power while managing more traffic at lower latencies. It's largely codeveloping those networking systems alongside large data center providers to fit their needs, Sadana said. The startup is aiming to compete with companies like Cisco Systems Inc., Arista Networks and Hewlett Packard Enterprise Co. "There is competition, and there will be even more," said Lightspeed investor Guru Chahal. "But the number of teams that over the last 15 years have been able to earn the trust of these hyperscalers comes down to an extraordinarily small number." 
Alongside the funding, Nexthop is announcing three new switches -- the specialized devices that connect the servers inside data centers and link facilities to one another, enabling the high-speed communication necessary for large chip clusters to train and run AI models.

The endeavor is an expensive one. Nexthop has more than 300 employees, almost all engineers, and plans to use its new capital to keep hiring, Sadana said. The startup also manages manufacturing and supply chains for its switches and will keep investing in that technology.

At the moment, demand for AI computing is far outpacing supply. But some in the industry worry that if supply eventually meets or outpaces demand, or if data center build-outs are constrained by other factors like power or memory, it could mean widespread trouble for the industry.

Sadana knows tech's AI spending can't accelerate at the same rate indefinitely, but he expects the AI infrastructure market to continue to grow significantly. "There will be corrections, which every company should be prepared for, and you shouldn't be reckless with how you think about money in the longer term," he said. "But I'm confident that in a decade from now, the market will be much bigger than it is today."
[2]
AI networking startup Nexthop AI raises $500M, launches new switches - SiliconANGLE
Nexthop AI Inc., a startup that develops network equipment for artificial intelligence data centers, has closed a $500 million Series B round. The company stated in its funding announcement today that Lightspeed Venture Partners was the lead investor. Andreessen Horowitz, Altimeter and Nexthop AI's existing backers participated as well. The company is now valued at $4.2 billion.

Nexthop AI announced the investment in conjunction with three new additions to its hardware portfolio. The NH-4010, NH-4220 and NH-5010 are switches optimized to process data traffic in AI clusters. All three devices are based on Broadcom Inc. networking chips.

The NH-4010 can process 51.2 terabits of traffic per second, while the NH-4220 has twice the capacity. Nexthop AI says that the former switch is up to 20% more power-efficient than comparable products from rivals. That translates into energy savings of up to several dozen megawatts per device.

The two switches support a networking technology called RoCEv2. When one graphics card sends data to another, the traffic usually has to pass through a central processing unit. RoCEv2 skips that step to speed up connections. Additionally, it powers a second technology called DCQCN that automatically detects and fixes congested network links.

The third switch that Nexthop AI debuted today, the NH-5010, is designed to power AI clusters with a so-called disaggregated spine architecture. That's an alternative to the standard spine-and-leaf design on which many data center networks are based. In a standard data center, servers are attached to switches referred to as leaves. The leaves, in turn, connect to a second set of switches collectively known as the spine. Nexthop AI's disaggregated spine architecture splits this second collection of switches into two "functional tiers."
One tier is responsible for processing traffic inside the data center, while the other orchestrates packets flowing to and from other data centers.

Nexthop AI offers its switch lineup alongside other network devices. The company makes linear LPOs and LROs, modules that lower the cost of optical networks by reducing the need for digital signal processors. A digital signal processor is a chip that performs tasks such as removing noise from fiber-optic links.

Customers can manage Nexthop AI's hardware using an operating system called Nexthop NOS. It's based on SONiC, an open-source network orchestration platform originally developed by Microsoft Corp. Nexthop NOS adds cybersecurity optimizations and an update service that speeds up the process of installing patches.

"AI clusters are pushing data center networks to their limits, and networking is now central to overall system performance," said Andreessen Horowitz general partner Raghu Raghuram. "Nexthop is purpose-built for this shift."
Nexthop AI has secured $500 million in a funding round led by Lightspeed Venture Partners, with participation from Andreessen Horowitz, valuing the AI networking startup at $4.2 billion. The company, founded by former Arista Networks COO Anshul Sadana, is building power-efficient networking hardware and software for AI data centers as tech giants plan to spend $650 billion on AI infrastructure in 2026 alone.
Nexthop AI has raised $500 million in a Series B round led by Lightspeed Venture Partners, propelling the AI networking startup to a $4.2 billion valuation [1]. Andreessen Horowitz joined the funding round alongside existing investors including Altimeter Capital and Kleiner Perkins [1]. The investment arrives as tech's biggest hyperscalers (Alphabet Inc., Amazon.com Inc., Meta Platforms Inc., and Microsoft Corp.) are expected to spend approximately $650 billion in 2026 alone on AI data centers and related projects [1].

Founded in 2024 by Anshul Sadana, former Chief Operating Officer at Arista Networks Inc., Nexthop AI positions itself as an AI data center supplier building systems to facilitate connections within and between data centers [1]. The startup now employs more than 300 people, almost all engineers, and plans to use its new capital to continue hiring and investing in technology, including manufacturing and supply chains for its network switches [1].
Source: Bloomberg
Alongside the funding announcement, Nexthop AI unveiled three new network switches designed specifically for AI clusters: the NH-4010, NH-4220, and NH-5010 [2]. All three devices are based on Broadcom Inc. networking chips and engineered to handle the intensive data traffic requirements of modern AI computing infrastructure [2].
Source: SiliconANGLE
The NH-4010 can process 51.2 terabits of traffic per second, while the NH-4220 offers double that capacity [2]. What sets these systems apart is their energy consumption: the NH-4010 is up to 20% more power-efficient than comparable products from rivals, translating into energy savings of up to several dozen megawatts per device [2]. Both switches support RoCEv2, a networking technology that bypasses the central processing unit when graphics cards exchange data, significantly speeding up connections; RoCEv2 also powers DCQCN, a technology that automatically detects and fixes congested network links [2].
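The congestion-control idea behind DCQCN can be illustrated with a toy simulation. This is a minimal sketch of the general pattern (cut the sending rate sharply when congestion feedback arrives, recover gradually when it doesn't); the function name, constants, and control rule here are illustrative assumptions, not Nexthop's or the DCQCN specification's actual parameters.

```python
# Toy model of DCQCN-style rate control. Real DCQCN runs in RoCEv2 NICs,
# driven by ECN marks from switches and CNP feedback from receivers;
# the constants below are assumptions for illustration only.

def dcqcn_step(rate_gbps, target_gbps, congested,
               alpha=0.5, increase_gbps=5.0):
    """One control interval: cut rate multiplicatively on congestion
    feedback, otherwise recover additively toward the target rate."""
    if congested:  # congestion notification arrived this interval
        return max(rate_gbps * (1 - alpha / 2), 1.0)
    return min(rate_gbps + increase_gbps, target_gbps)

rate = 100.0  # sender starts at an assumed 100 Gbps line rate
for congested in [True, True, False, False, False]:
    rate = dcqcn_step(rate, 100.0, congested)
print(round(rate, 2))  # rate dips under congestion, then climbs back
```

The design point is that senders back off before queues overflow and packets drop, which matters for RoCEv2 because RDMA traffic tolerates loss poorly.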
The third switch, the NH-5010, is designed for AI clusters with a disaggregated spine architecture, an alternative to standard spine-and-leaf designs [2]. This architecture splits the spine into two functional tiers: one handling traffic inside the data center and another orchestrating packets flowing to and from other facilities [2].
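The tier split described above can be sketched as a small routing model. The tier names, switch identifiers, and load-spreading rule are hypothetical; this only illustrates the idea of steering intra-data-center and inter-data-center flows through separate spine tiers.

```python
# Minimal sketch of a disaggregated spine: the spine layer is split into
# an intra-DC tier and an inter-DC tier. All names are illustrative.

SPINE_TIERS = {
    "intra_dc": ["spine-i1", "spine-i2"],   # traffic inside one data center
    "inter_dc": ["spine-x1", "spine-x2"],   # traffic between data centers
}

def pick_spine(src_dc: str, dst_dc: str) -> str:
    """Route a flow through the tier matching its scope."""
    tier = "intra_dc" if src_dc == dst_dc else "inter_dc"
    switches = SPINE_TIERS[tier]
    # naive load spreading: hash the flow endpoints across the tier
    return switches[hash((src_dc, dst_dc)) % len(switches)]

print(pick_spine("dc-east", "dc-east") in SPINE_TIERS["intra_dc"])  # True
print(pick_spine("dc-east", "dc-west") in SPINE_TIERS["inter_dc"])  # True
```

Separating the tiers lets each be sized and optimized for its own traffic pattern instead of one spine handling both.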
The startup is positioning itself to compete with established players like Cisco Systems Inc., Arista Networks, and Hewlett Packard Enterprise Co. [1]. Lightspeed investor Guru Chahal acknowledged the competitive landscape but emphasized that "the number of teams that over the last 15 years have been able to earn the trust of these hyperscalers comes down to an extraordinarily small number" [1].
Nexthop AI's networking hardware and software portfolio extends beyond switches. The company also makes linear LPOs and LROs, modules that reduce the cost of optical networks by minimizing the need for digital signal processors [2]. Customers can manage all Nexthop AI hardware using Nexthop NOS, an operating system based on SONiC, an open-source network orchestration platform originally developed by Microsoft Corp. [2]. The company's version adds cybersecurity optimizations and an update service that accelerates patch installation [2].
At present, demand for AI computing far outpaces supply, creating opportunities for companies like Nexthop AI [1]. The company is largely codeveloping its networking systems alongside large data center providers to fit their specific needs, according to Sadana [1]. "AI clusters are pushing data center networks to their limits, and networking is now central to overall system performance," said Andreessen Horowitz general partner Raghu Raghuram [2].
However, some industry observers worry that if supply eventually meets or outpaces demand, or if data center build-outs face constraints from factors like power or memory, it could spell trouble for the sector [1]. Sadana acknowledges these concerns but remains optimistic about long-term growth. "There will be corrections, which every company should be prepared for, and you shouldn't be reckless with how you think about money in the longer term," he said. "But I'm confident that in a decade from now, the market will be much bigger than it is today" [1].