Cerebras Systems raises IPO price range to $160 as AI chip demand draws 20x oversubscription

4 Sources


AI chipmaker Cerebras Systems has increased its IPO price range to $150-$160 per share, up from $115-$125, as investor demand surges. The company could raise up to $4.8 billion and achieve a $48.8 billion valuation. With OpenAI and Amazon Web Services as major customers, Cerebras is positioning its wafer-scale chips as faster alternatives to Nvidia's GPUs for AI inference workloads.

Cerebras IPO Price Range Surges Amid Overwhelming Demand

AI chipmaker Cerebras Systems has increased the estimated price range for its initial public offering to $150-$160 per share, according to a Monday filing [1]. This marks a significant jump from the $115-$125 range disclosed just last week. The company is also considering raising the number of shares marketed from 28 million to 30 million [2]. At the high end of the new range, Cerebras could raise up to $4.8 billion in proceeds, up from the original target of $3.5 billion [4]. Surging investor demand has produced orders for more than 20 times the number of shares available, according to sources familiar with the matter. The Cerebras IPO is expected to price on May 13, with shares potentially trading on Nasdaq under the ticker CBRS as early as May 14 [3].
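The proceeds figure in the filing checks out directly: 30 million shares at the top of the revised range yields the reported $4.8 billion maximum. A quick sketch, assuming the larger 30-million-share count under consideration:

```python
# Back-of-the-envelope check on the reported IPO figures.
shares_marketed = 30_000_000   # upper share count under consideration
price_top = 160                # top of the revised $150-$160 range, USD
proceeds = shares_marketed * price_top
print(f"Max gross proceeds: ${proceeds / 1e9:.1f}B")  # → $4.8B
```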

Source: Benzinga


Valuation Doubles Since February Funding Round

The increased price range could value Cerebras Systems at up to $48.8 billion on a fully diluted basis [1]. This represents more than double the $23 billion valuation the company announced in February as part of a funding round. The dramatic increase reflects the broader surge in AI adoption, which has driven sharp demand for high-performance chips and turned semiconductors into a key bottleneck in the technology supply chain [4]. The listing would mark the biggest IPO globally so far this year, according to Dealogic [2]. The offering is being led by Morgan Stanley, Citigroup, Barclays and UBS Group AG [4].
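The reported valuation implies the size of the fully diluted share base, a figure the article does not state. Dividing $48.8 billion by the $160 top-of-range price gives roughly 305 million fully diluted shares (a derived estimate, not a number from the filing):

```python
# Implied fully diluted share count from the reported valuation.
valuation = 48_800_000_000   # fully diluted valuation at the top of the range, USD
price_top = 160              # top of the revised price range, USD
implied_shares = valuation / price_top
print(f"Implied fully diluted shares: {implied_shares / 1e6:.0f}M")  # → 305M
```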

Wafer-Scale AI Chips Challenge Nvidia's Dominance

Cerebras has positioned itself as a challenger in an AI chip market dominated by Nvidia, alongside AMD and Broadcom [3]. The Sunnyvale, California-based company sells a wafer-scale AI chip called the WSE-3 that is several times larger than Nvidia's Blackwell B200 graphics processing unit [2]. One of the chip's main selling points is its 44-gigabyte pool of SRAM, a memory type that packs significantly more transistors per square millimeter than standard server DRAM and is considerably faster [2]. Cerebras says the WSE-3's 900,000 cores can access the onboard SRAM pool with a latency of one clock cycle, significantly faster than what a standard graphics card offers [2]. The company ships the WSE-3 as part of a 1.8-ton appliance called the CS-3, which customers can link together into clusters.

Processors for AI Model Inference Drive Revenue Growth

Cerebras is seeing surging demand for its processors as AI labs shift from training generative AI models to deploying them in production [4]. The company's chips are better suited for inference, the computations that allow AI models to respond to user queries, than the GPUs the industry has long relied on for model training [4]. When it filed its IPO paperwork last week, Cerebras revealed that its revenue had jumped 76% in 2025 to $290.3 million [2]. The company is already profitable, recording net income of $87.9 million that year, compared with a loss of $485 million the year prior [2]. Rather than focusing chiefly on selling hardware, Cerebras has been filling data centers with its own chips and providing cloud services to clients [1].
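The reported 76% growth rate implies prior-year revenue of roughly $165 million, and the move from a $485 million loss to $87.9 million in net income is a swing of about $573 million. A quick sketch of these derived figures (computed from the reported numbers, not stated in the filing):

```python
# Implied prior-year revenue from the reported 76% growth rate,
# and the year-over-year swing in net income.
revenue_2025 = 290.3e6   # reported 2025 revenue, USD
growth = 0.76            # reported year-over-year growth rate
prior_revenue = revenue_2025 / (1 + growth)
net_income = 87.9e6      # reported net income, USD
prior_loss = -485e6      # reported prior-year loss, USD
print(f"Implied prior-year revenue: ${prior_revenue / 1e6:.0f}M")    # → $165M
print(f"Net income swing: ${(net_income - prior_loss) / 1e6:.1f}M")  # → $572.9M
```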

OpenAI and Amazon Web Services Provide Customer Validation

Cerebras has secured marquee customers that validate its technology. One report describes a commitment from OpenAI worth more than $20 billion; OpenAI relies on Cerebras for a model that writes code [1]. Another puts the signed OpenAI deal at more than $10 billion, tied to 750 megawatts of low-latency AI compute capacity coming online in phases through 2028 [3]. In March, top cloud provider Amazon Web Services announced a deal to bring Cerebras chips into its data centers [1]. Cerebras drew additional attention in the trial for Elon Musk's lawsuit against OpenAI CEO Sam Altman, where OpenAI co-founder and president Greg Brockman testified that Cerebras' planned chips represented "the compute we thought we were going to need" [1]. OpenAI even discussed merging with Cerebras, and Musk was open to a deal, Brockman said [1].

Customer Concentration and Geopolitical Factors Remain Key Risks

This marks Cerebras' second attempt to go public. The company first filed for an IPO in 2024 but pulled that plan last year amid controversy over its partnership with United Arab Emirates-based firm G42, which accounted for more than 80% of its revenue in the first half of 2024 [2]. The Committee on Foreign Investment in the United States (CFIUS) subjected that partnership to a national security review but ultimately cleared both firms [2]. The IPO is more than just a chip story; it also involves customer concentration, export-control and geopolitical factors that investors will need to monitor [3]. While Cerebras has since diversified its customer base with OpenAI and Amazon Web Services, the question remains how much growth is already priced in at the elevated valuation [3]. If the offering prices strongly, CBRS could quickly become the next ticker traders use to express views on the AI chip trade, joining Nvidia, AMD and Broadcom as public-market options for investors seeking exposure to AI infrastructure [3].

© 2026 TheOutpost.AI All rights reserved