5 Sources
[1]
Nvidia admits the $100bn 'biggest AI infrastructure project in history' OpenAI deal still isn't finalized
Long lead times and regular refresh cycles pose risks to long-term deals

The $100 billion collaboration between Nvidia and OpenAI has received its fair share of publicity, but the deal has yet to be finalized and, to this day, remains little more than a letter of intent. Nvidia CFO Colette Kress confirmed at the UBS Global Technology and AI Conference that no definitive agreement has been completed yet. Originally, the plan called for deploying millions of Nvidia GPUs over several years, resulting in 10GW of data center capacity. "We still haven't completed a definitive agreement, but we're working with them," Kress noted (via Fortune). "There is no assurance that we will enter into definitive agreements with respect to the OpenAI opportunity or other potential investments, or that any investment will be completed on expected terms, if at all," Nvidia wrote in a 10-Q filing. The company also highlighted risk factors associated with long-term partnerships, such as long-lead, non-cancellable orders that could leave it with excess inventory if customers scale back. Nvidia's annual architecture releases could also make forecasting demand difficult. Kress said that the roughly $500 billion of Blackwell and Vera Rubin system demand for 2025-2026 does not include any OpenAI demand relating to the potential deal. Although Nvidia shares rose around 2.6% following the CFO's remarks, investor concerns about an 'AI bubble' remain: circular deals, in which Nvidia invests in startups that then purchase Nvidia chips, leave the companies involved dependent on one another. This has prompted large institutions like the Bank of England to warn of 'sharp market corrections' should the bubble burst, underscoring the potential risks for tech companies tied into long-term agreements. Speaking about competition, Kress also stressed that Nvidia is "absolutely not" worried about companies like Google making their own TPUs. "Everybody is on our platform," she noted, suggesting it would take a long time for the industry to shift away from Nvidia hardware.
[2]
Nvidia "absolutely not" losing AI lead, CFO says
Why it matters: Nvidia briefly passed the $5 trillion market cap threshold in October, becoming the first company to do so, but the stock is down more than 11% over the last month amid concern about competition.

The big picture: Nvidia CFO Colette Kress -- the No. 2 executive at the chip giant -- said the company is "absolutely not" losing its lead, adding that "everybody is on our platform."
* In an interview with an analyst at the UBS Global Technology and AI Conference, she added that the economy is "in the early parts" of transitioning to the data center infrastructure needed to power AI.
* Kress reiterated that Nvidia envisions $3 trillion to $4 trillion in global AI investments by the end of the decade.

Context: Nvidia shares slipped last week on reports that Meta was in talks with Google about using Google's custom chips, called TPUs, in its data centers.
* That pushed Nvidia to social media to defend the wide applications of its own GPUs, saying on X that its popular chips were "a generation ahead of the industry."

The intrigue: Kress said that Nvidia's recently announced plan to invest up to $100 billion in ChatGPT creator OpenAI isn't a done deal.
[3]
Nvidia's CFO admits the $100 billion OpenAI megadeal 'still' isn't 'definitive' -- two months after it helped fuel an AI rally | Fortune
Two months after Nvidia and OpenAI unveiled their eye-popping plan to deploy at least 10 gigawatts of Nvidia systems -- and up to $100 billion in investments -- the chipmaker now admits the deal isn't actually final. Speaking Tuesday at UBS's Global Technology and AI Conference in Scottsdale, Ariz., Nvidia EVP and CFO Colette Kress told investors that the much-hyped OpenAI partnership is still at the letter-of-intent stage. "We still haven't completed a definitive agreement," Kress said when asked how much of the 10-gigawatt commitment is actually locked in. That's a striking clarification for a deal that Nvidia CEO Jensen Huang once called "the biggest AI infrastructure project in history." Analysts had estimated that the deal could generate as much as $500 billion in revenue for the AI chipmaker. When the companies announced the partnership in September, they outlined a plan to deploy millions of Nvidia GPUs over several years, backed by up to 10 gigawatts of data center capacity. Nvidia pledged to invest up to $100 billion in OpenAI as each tranche comes online. The news helped fuel an AI-infrastructure rally, sending Nvidia shares up 4% and reinforcing the narrative that the two companies are joined at the hip. Kress's comments suggest something more tentative, even months after the framework was released. It's unclear why the deal hasn't been executed, but Nvidia's latest 10-Q offers clues. The filing states plainly that "there is no assurance that any investment will be completed on expected terms, if at all," referring not only to the OpenAI arrangement but also to Nvidia's planned $10 billion investment in Anthropic and its $5 billion commitment to Intel. In a lengthy "Risk Factors" section, Nvidia spells out the fragile architecture underpinning megadeals like this one. The company stresses that the story is only as real as the world's ability to build and power the data centers required to run its systems. Nvidia must order GPUs, HBM memory, networking gear, and other components more than a year in advance, often via non-cancelable, prepaid contracts. If customers scale back, delay financing, or change direction, Nvidia warns it may end up with "excess inventory," "cancellation penalties," or "inventory provisions or impairments." Past mismatches between supply and demand have "significantly harmed our financial results," the filing notes. The biggest swing factor seems to be the physical world: Nvidia says the availability of "data center capacity, energy, and capital" is critical for customers to deploy the AI systems they've verbally committed to. Power buildout is described as a "multi-year process" that faces "regulatory, technical, and construction challenges." If customers can't secure enough electricity or financing, Nvidia warns, it could "delay customer deployments or reduce the scale" of AI adoption. Nvidia also admits that its own pace of innovation makes planning harder. It has moved to an annual cadence of new architectures -- Hopper, Blackwell, Vera Rubin -- while still supporting prior generations. It notes that a faster architecture pace "may magnify the challenges" of predicting demand and can lead to "reduced demand for current generation" products. These admissions nod to the warnings of AI bears like Michael Burry, the investor of "The Big Short" fame, who has alleged that Nvidia and other chipmakers are overextending the useful lives of their chips and that the chips' eventual depreciation will cause breakdowns in the investment cycle.
However, Huang has said that chips from six years ago are still running at full pace. The company also nodded explicitly to past boom-bust cycles tied to "trendy" use cases like crypto mining, warning that new AI workloads could create similar spikes and crashes that are hard to forecast and can flood the gray market with secondhand GPUs. Despite the lack of a deal, Kress stressed that Nvidia's relationship with OpenAI remains "a very strong partnership," more than a decade old. OpenAI, she said, considers Nvidia its "preferred partner" for compute. But she added that Nvidia's current sales outlook does not rely on the new megadeal. The roughly $500 billion of Blackwell and Vera Rubin system demand Nvidia has guided for 2025-2026 "doesn't include any of the work we're doing right now on the next part of the agreement with OpenAI," she said. For now, OpenAI's purchases flow indirectly through cloud partners like Microsoft and Oracle rather than through the new direct arrangement laid out in the LOI. OpenAI "does want to go direct," Kress said. "But again, we're still working on a definitive agreement." On competitive dynamics, Kress was unequivocal. Markets have lately been cheering Google's TPU, which covers a narrower set of use cases than a GPU but requires less power, as a potential competitor to Nvidia's GPUs. Asked whether those types of chips, called ASICs, are narrowing Nvidia's lead, she responded: "Absolutely not." "Our focus right now is helping all different model builders, but also helping so many enterprises with a full stack," she said. Nvidia's defensive moat, she argued, isn't any individual chip but the entire platform: hardware, CUDA, and a constantly expanding library of industry-specific software. That stack, she said, is why older architectures remain heavily used even as Blackwell becomes the new standard. "Everybody is on our platform," Kress said. "All models are on our platform, both in the cloud as well as on-prem."
[4]
Nvidia CFO says chipmaker yet to finalise $100 billion OpenAI deal
Nvidia's agreement with ChatGPT parent OpenAI to invest up to $100 billion in the startup is still not finalised, the chipmaker's chief financial officer Colette Kress said on Tuesday at the UBS Global Technology and AI Conference in Arizona. Kress' comments add to intensifying discussion around a partnership that ties two of the most significant players in the artificial intelligence race and is at the center of rising concerns around circular deals in the AI ecosystem. The world's most valuable company in September unveiled a letter of intent to invest in OpenAI, under an agreement that would involve deploying at least 10 gigawatts of Nvidia systems for the startup, enough capacity to power more than 8 million U.S. homes. "We still haven't completed a definitive agreement, but we're working with them," Kress said, addressing questions about the framework of Nvidia's agreement with OpenAI. OpenAI, the startup at the heart of the generative AI boom that kicked off with the launch of ChatGPT in late 2022, is a major customer for Nvidia's chips, alongside large cloud providers which make up a large portion of the chipmaker's sales. Nvidia CEO Jensen Huang has said the company has $500 billion in bookings for its advanced chips through 2026. The chips Nvidia could provide to OpenAI after its agreement is finalised are not included in these bookings and would add to the number, Kress said on Tuesday. "That half a trillion doesn't include any of the work that we're doing right now on the next part of the agreement with OpenAI," she said. Nvidia shares were up 2.6%. Over the past year, Nvidia has struck a series of deals with AI startups and invested in firms that are also major customers, stoking Wall Street concerns about an AI bubble and so-called circular deals. Nvidia last month announced plans to commit up to $10 billion to OpenAI rival Anthropic. Kress said the Anthropic deal could also add to Nvidia's $500 billion in chip bookings.
[5]
Nvidia CFO says chipmaker yet to finalize $100 billion OpenAI deal
Dec 2 (Reuters) - Nvidia's agreement with ChatGPT parent OpenAI to invest up to $100 billion in the startup is still not finalized, the chipmaker's chief financial officer Colette Kress said on Tuesday at the UBS Global Technology and AI Conference in Arizona. Kress' comments add to intensifying discussion around a partnership that ties two of the most significant players in the artificial intelligence race and is at the center of rising concerns around circular deals in the AI ecosystem. The world's most valuable company in September unveiled a letter of intent to invest in OpenAI, under an agreement that would involve deploying at least 10 gigawatts of Nvidia systems for the startup, enough capacity to power more than 8 million U.S. homes. "We still haven't completed a definitive agreement, but we're working with them," Kress said, addressing questions about the framework of Nvidia's agreement with OpenAI. OpenAI, the startup at the heart of the generative AI boom that kicked off with the launch of ChatGPT in late 2022, is a major customer for Nvidia's chips, alongside large cloud providers which make up a large portion of the chipmaker's sales. Nvidia CEO Jensen Huang has said the company has $500 billion in bookings for its advanced chips through 2026. The chips Nvidia could provide to OpenAI after its agreement is finalized are not included in these bookings and would add to the number, Kress said on Tuesday. "That half a trillion doesn't include any of the work that we're doing right now on the next part of the agreement with OpenAI," she said. Nvidia shares were up 2.6%. Over the past year, Nvidia has struck a series of deals with AI startups and invested in firms that are also major customers, stoking Wall Street concerns about an AI bubble and so-called circular deals. Nvidia last month announced plans to commit up to $10 billion to OpenAI rival Anthropic. Kress said the Anthropic deal could also add to Nvidia's $500 billion in chip bookings. (Reporting by Arsheeya Bajwa in Bengaluru; Editing by Tasim Zahid)
Two months after Nvidia unveiled what CEO Jensen Huang called 'the biggest AI infrastructure project in history,' CFO Colette Kress confirms the $100 billion OpenAI partnership is still just a letter of intent. The admission raises questions about forecasting demand and highlights investor concerns about circular deals in the AI ecosystem.
The Nvidia OpenAI deal, initially announced in September as a landmark partnership worth up to $100 billion, has not progressed beyond a letter of intent, Nvidia CFO Colette Kress confirmed at the UBS Global Technology and AI Conference in Arizona [1][2]. "We still haven't completed a definitive agreement, but we're working with them," Kress stated, addressing investor questions about the framework of what CEO Jensen Huang once described as "the biggest AI infrastructure project in history" [3]. The revelation is notable given that the announcement helped fuel an AI infrastructure rally in September, sending Nvidia shares up 4% at the time [3].
The original plan outlined deploying millions of Nvidia GPUs over several years, backed by at least 10 gigawatts of data center capacity, enough to power more than 8 million U.S. homes [4]. Nvidia pledged to invest up to $100 billion in OpenAI as each tranche comes online, with analysts estimating the deal could generate as much as $500 billion in revenue for the chipmaker [3]. Despite the lack of a signed agreement, Kress emphasized that Nvidia's relationship with OpenAI remains "a very strong partnership" spanning more than a decade, with OpenAI considering Nvidia its "preferred partner" for compute [3].

Nvidia's latest 10-Q filing suggests why the $100 billion OpenAI deal has not yet been finalized, highlighting significant risk factors associated with long-term partnerships [1]. The company states plainly that "there is no assurance that we will enter into definitive agreements with respect to the OpenAI opportunity or other potential investments, or that any investment will be completed on expected terms, if at all" [1]. The filing emphasizes that AI data center infrastructure deployment depends heavily on the availability of data center capacity, energy, and capital, all of which face regulatory, technical, and construction challenges [3].

Forecasting demand has become increasingly complex as Nvidia moves to an annual cadence of architecture releases, including Hopper, Blackwell, and Vera Rubin systems [3]. The company must order GPUs, HBM memory, networking gear, and other components more than a year in advance, often via non-cancelable, prepaid contracts [3]. If customers scale back, delay financing, or change direction, Nvidia warns it may end up with excess inventory, cancellation penalties, or inventory provisions [3]. Kress noted that the roughly $500 billion of Blackwell and Vera Rubin system demand for 2025-2026 does not include any OpenAI demand relating to the potential deal [1].
The admission that the Nvidia OpenAI deal is not yet finalized adds to intensifying discussion around investor concerns about an AI bubble and circular deals in the AI ecosystem [4][5]. Over the past year, Nvidia has struck a series of deals with AI startups and invested in firms that are also major customers, creating a landscape where companies become dependent on each other [1]. Nvidia recently announced plans to commit up to $10 billion to OpenAI rival Anthropic, with Kress stating this deal could also add to Nvidia's $500 billion in chip bookings [4].

Large institutions like the Bank of England have warned of "sharp market corrections" if the bubble bursts, underscoring the potential risks for tech companies tied into long-term partnerships [1]. Despite these concerns, Nvidia shares rose around 2.6% following Colette Kress's remarks at the conference [1][4]. However, the stock remains down more than 11% over the last month amid concern about competition [2].
Addressing competition concerns, Colette Kress was emphatic that Nvidia is "absolutely not" worried about companies like Google making their own TPUs [1][2]. "Everybody is on our platform," she noted, suggesting it would take a long time for the industry to shift away from Nvidia hardware [1]. Nvidia shares slipped last week on reports that Meta was in talks with Google about using Google's custom chips, called TPUs, in its data centers [2]. In response, Nvidia took to social media to defend its GPUs, saying on X that its popular chips were "a generation ahead of the industry" [2].

Kress reiterated that Nvidia envisions $3 trillion to $4 trillion in global AI investments by the end of the decade, emphasizing that the economy is "in the early parts" of transitioning to the data center infrastructure needed to power AI applications [2]. OpenAI currently purchases through cloud partners like Microsoft and Oracle rather than through the new direct arrangement laid out in the letter of intent, though Kress noted that OpenAI "does want to go direct" once a definitive agreement is reached [3].