Curated by THEOUTPOST
On Wed, 12 Mar, 12:08 AM UTC
5 Sources
[1]
Chip Startup Celestial AI Hits $2.5B Valuation
The fresh cash comes almost exactly a year after the company locked up a $175 million Series C led by Thomas Tull's US Innovative Technology Fund. The Santa Clara, California-based startup's photonic fabric platform helps separate compute and memory, making large-scale AI processing faster and more energy-efficient. The new round included participation from new investors including funds and accounts managed by BlackRock, Maverick Silicon, Tiger Global Management and Lip-Bu Tan, as well as participation from existing investors including AMD Ventures, Koch Disruptive Technologies, Temasek Holdings, Temasek's Xora Innovation fund, Porsche Automobil Holding and Engine Ventures. Founded in 2020, Celestial AI has now raised more than $515 million, per the company. The company's photonic fabric helps solve one of generative AI's biggest issues -- untangling compute power and memory to make the technology more efficient. "With the emergence of complex reasoning models and agentic AI, the requirements on AI infrastructure are compounding," said CEO David Lazovsky in a statement. "Cluster sizes must scale from a few AI processors in a server to tens of processors in a single rack and thousands of processors across multiple racks, all while relying on high-bandwidth, low-latency network connectivity to handle massive data transfers between processors." Thanks to AI, chips seem to be the talk of tech right now. Just last month, EnCharge AI -- a startup developing analog in-memory-computing AI chips -- raised a Series B of more than $100 million led by Tiger Global.
[2]
Celestial AI Raises $250 Million as It Looks to Speed up Links Between AI Chips
SANTA CLARA, California (Reuters) - Celestial AI, one of several Silicon Valley chip startups aiming to crack a key speed constraint in artificial intelligence, said on Tuesday it has raised an additional $250 million in venture capital, bringing its total raised to date to $515 million. Celestial AI is tapping photonics - a technology that uses light, rather than electrical signals - to create speedy links between AI computing chips and memory chips. The speed of that connection, a measure called memory bandwidth, has become so central to advancing AI systems that it is one of the factors that determine whether a chip is subject to U.S. government export controls such as those designed to limit China's AI advances. At present, Nvidia reigns supreme in memory bandwidth with proprietary technologies called NVLink and NVSwitch. That has set off a technology race among startups and a flurry of funding to find alternatives that can be used by other chip firms. Celestial AI rivals Lightmatter and Ayar Labs have raised $850 million and $370 million, respectively. Celestial AI, which is backed by the venture arm of Nvidia rival Advanced Micro Devices, is developing a technology that can sit like a bridge between two or more chips and uses a different kind of photonics technology than its rivals. The goal of this "photonic fabric," Celestial AI CEO Dave Lazovsky told Reuters, is to provide speed while saving space and power, the two things that are at a premium in every chip's design. "There are no good answers right now outside of Nvidia," Lazovsky said in an interview at the firm's Santa Clara, California headquarters. "What we had created with the photonic fabric does the same thing, but at a different level of energy efficiency and of latency."
Celestial AI said the new round of funding was led by Fidelity Management & Research and joined by BlackRock, Maverick Capital, Tiger Global Management and Lip-Bu Tan, the former CEO of chip design software firm Cadence Design Systems. Also joining were existing investors AMD Ventures, Koch Disruptive Technologies, Singapore state investor Temasek, Temasek's wholly-owned subsidiary Xora Innovation, Porsche Automobil Holding and The Engine Ventures. (Reporting by Stephen Nellis in Santa Clara, California; Editing by Jamie Freed)
[4]
Celestial AI raises $250M for its optical interconnect technology - SiliconANGLE
Celestial AI raises $250M for its optical interconnect technology Celestial AI Inc., a startup that develops optical technology for linking chips, has raised $250 million in funding at a $2.5 billion valuation. The Series-C1 investment comes a year after the company's previous raise. Celestial disclosed in its announcement of the new funding round today that Fidelity Management & Research was the lead investor. It was joined by more than a half dozen other backers including prominent chip industry executive Lip-Bu Tan and AMD Ventures. Many modern processors comprise not one but multiple chiplets, or compute modules, that are linked together using a technology called an interconnect. This interconnect is a miniature network that moves data between the chiplets to coordinate their work. Data travels between chiplets in the form of electrical signals. Santa Clara, California-based Celestial AI develops an interconnect that transmits information in the form of light rather than electricity. Photons travel over fiber-optic links faster than electrons over copper, which allows data to move more quickly between a processor's chiplets. The result is an increase in processing speeds. Celestial AI's optical interconnect is known as the Photonic Fabric. According to the company, chipmakers can incorporate the technology into their processors in the form of an interposer. An interposer is a base layer on which a processor's chiplets can be placed. Celestial AI says that Photonic Fabric lends itself particularly well to powering artificial intelligence chips. Such chips often generate more heat than other types of processors, which can cause technical issues in the interposer that functions as their base layer. According to Celestial AI, Photonic Fabric mitigates the challenge because it can operate at "much higher" temperatures than other optical interconnects. The company is also promising a second benefit: reduced hardware costs.
Large language models store much of their data in a type of high-speed RAM called HBM, or high-bandwidth memory. This memory is usually integrated into graphics cards. As a result, a company that wishes to add more HBM capacity to its AI cluster must buy more graphics cards even if it doesn't need the extra processing capacity. According to Celestial AI, Photonic Fabric can be used to add HBM to an AI cluster without having to add graphics cards. The interconnect makes it possible to expand an AI processor's memory pool by connecting it to a remote appliance loaded with HBM. In theory, this approach is more cost-efficient than buying additional graphics cards. Currently, connecting AI chips to remote HBM appliances is impractical because of range limitations: HBM only works effectively if it's placed immediately next to an AI processor's logic circuits. Celestial AI says that its optical interconnect removes this range limitation: it can transfer data between chips located more than 160 feet apart. The company is promising die-to-die bandwidth of up to 14.4 terabits per second. Celestial AI also offers its technology in another form factor: a network switch with a system-in-a-package, or SiP, design. A SiP is a processor that combines multiple integrated circuits in a single package. The switch allows multiple processors to exchange data with one another and function as one large chip. "With the emergence of complex reasoning models and agentic AI, the requirements on AI infrastructure are compounding," said Celestial AI Chief Executive Officer David Lazovsky. "Cluster sizes must scale from a few AI processors in a server to tens of processors in a single rack and thousands of processors across multiple racks, all while relying on high-bandwidth, low-latency network connectivity to handle massive data transfers between processors." Bloomberg reported that the company will use its latest funding round to commercialize its interconnect technology.
Celestial AI plans to start mass producing Photonic Fabric in 2027.
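The figures quoted above (14.4 terabits per second of die-to-die bandwidth, reach of more than 160 feet) can be put in perspective with a back-of-envelope sketch. The payload size and the fiber refractive index below are illustrative assumptions, not company figures:

```python
# Back-of-envelope sketch of the quoted claims: how long a bulk transfer
# takes at 14.4 Tb/s, and the one-way light-propagation delay over ~160 ft.
# Payload size and fiber index are illustrative assumptions.

BANDWIDTH_BITS_PER_S = 14.4e12   # 14.4 terabits/s, per the company
REACH_M = 160 * 0.3048           # 160 feet expressed in meters
SPEED_OF_LIGHT = 3.0e8           # m/s in vacuum
FIBER_INDEX = 1.47               # typical silica fiber (assumption)

def transfer_time_s(payload_bytes: float) -> float:
    """Serialization time to move a payload at the quoted link bandwidth."""
    return payload_bytes * 8 / BANDWIDTH_BITS_PER_S

def propagation_delay_s(distance_m: float) -> float:
    """One-way propagation delay of light through fiber."""
    return distance_m / (SPEED_OF_LIGHT / FIBER_INDEX)

if __name__ == "__main__":
    payload = 80e9  # 80 GB, roughly one accelerator's worth of HBM (assumption)
    print(f"transfer:    {transfer_time_s(payload) * 1e3:.1f} ms")   # ~44.4 ms
    print(f"propagation: {propagation_delay_s(REACH_M) * 1e9:.0f} ns")  # ~239 ns
```

On these assumptions, streaming 80 GB takes about 44 ms and the fiber adds only a few hundred nanoseconds of flight time, which is why the claimed reach matters more for topology than for latency.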
Celestial AI, a Silicon Valley startup, has raised $250 million in a Series-C1 funding round, bringing its total funding to $515 million. The company is developing photonic fabric technology to enhance AI chip connectivity and efficiency.
Celestial AI, a Santa Clara-based startup, has successfully raised $250 million in a Series-C1 funding round, elevating its total funding to $515 million and achieving a valuation of $2.5 billion [1][4]. This latest investment, led by Fidelity Management & Research, comes just a year after the company's previous $175 million Series C round [1].
At the heart of Celestial AI's offering is its proprietary "photonic fabric" technology, which aims to revolutionize the connectivity between AI computing chips and memory chips [2]. This technology uses light instead of electrical signals to create high-speed links, addressing a critical constraint in artificial intelligence processing [2].
The photonic fabric serves as an interposer, a base layer on which processor chiplets can be placed, offering several advantages: faster data movement between chiplets, tolerance of the high operating temperatures generated by AI chips, and the ability to expand a processor's memory pool without adding compute [4].
Celestial AI's technology targets a crucial aspect of AI development - memory bandwidth. This factor has become so vital that it's even considered in U.S. government export controls related to AI advancements [2]. The company's CEO, David Lazovsky, emphasized the growing demands on AI infrastructure:
"With the emergence of complex reasoning models and agentic AI, the requirements on AI infrastructure are compounding. Cluster sizes must scale from a few AI processors in a server to tens of processors in a single rack and thousands of processors across multiple racks, all while relying on high-bandwidth, low-latency network connectivity to handle massive data transfers between processors." [1][4]
Celestial AI enters a competitive field dominated by Nvidia's proprietary NVLink and NVSwitch technologies [2]. However, the startup's distinct photonics approach and backing from AMD Ventures position it as a potential alternative for other chip firms [2]. Competitors in this space include Lightmatter and Ayar Labs, which have raised $850 million and $370 million respectively [2].
The funding round attracted a diverse group of investors, including lead investor Fidelity Management & Research; new backers BlackRock, Maverick Capital, Tiger Global Management and former Cadence Design Systems CEO Lip-Bu Tan; and existing investors AMD Ventures, Koch Disruptive Technologies, Temasek, Xora Innovation, Porsche Automobil Holding and The Engine Ventures [2].
Celestial AI plans to use the new funding to commercialize its interconnect technology, with mass production of the Photonic Fabric scheduled to begin in 2027 [4].
One of the promising aspects of Celestial AI's technology is its potential to reduce hardware costs for AI systems. By enabling the connection of AI processors to remote high-bandwidth memory (HBM) appliances, the company suggests that its solution could offer a more cost-efficient alternative to purchasing additional graphics cards for memory expansion [4].
As the AI industry continues to grow and evolve, Celestial AI's photonic fabric technology represents a significant development in addressing the challenges of speed, efficiency, and scalability in AI computing infrastructure.
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved