AMD's Lisa Su dismisses AI bubble fears as chip demand surges with OpenAI deal

Reviewed by Nidhi Govil

AMD CEO Lisa Su has emphatically rejected concerns about an AI bubble, calling such fears "somewhat overstated" at WIRED's Big Interview conference. Her comments come as AMD prepares for a massive 6-gigawatt Instinct GPU deployment with OpenAI and navigates a 15% export tax on MI308 chips bound for China.

AMD CEO Pushes Back Against AI Bubble Concerns

Lisa Su, CEO of AMD, has taken a firm stance against growing speculation that the artificial intelligence sector is overheating. Speaking at WIRED's Big Interview conference in San Francisco, Su responded "emphatically" when asked whether the tech industry is experiencing an AI bubble, stating that from her perspective the answer is "no" [1]. The AMD chief argued that concerns about an AI bubble are "somewhat overstated" and insisted that AI remains in its infancy, requiring continued investment in compute power and infrastructure [2].

Source: Tom's Hardware

Su's confidence reflects AMD's position in the AI chip market, where the company has grown from a $2 billion market cap when she became CEO in 2014 to approximately $353 billion today [1]. While still dwarfed by Nvidia's $4.4 trillion valuation, AMD is gaining momentum through strategic partnerships and massive investments in AI data centers. Su emphasized that what keeps her awake at night is not competition from Nvidia or hyperscalers like Google and Amazon, but rather "how do we move faster when it comes to innovation" [1].

Major OpenAI Partnership Signals Long-Term Growth Trajectory

AMD's commitment to the AI sector is exemplified by its landmark deal with OpenAI, announced earlier this year. The partnership involves deploying 6 gigawatts of AMD's Instinct GPUs over several years, with the first one-gigawatt block scheduled for rollout in the second half of next year [1][2]. In a unique arrangement, AMD agreed to allow OpenAI to purchase 160 million shares of the company's stock for a penny per share, effectively granting a 10% stake once deployment milestones are met [1].

Source: PC Gamer

AMD presented this structure as a way to align long-term incentives around infrastructure delivery rather than around a short window of product availability [2]. This massive commitment underscores the demand for AI compute that Su believes will justify today's rapid data-center buildout. Speaking at the UBS Global Technology and AI Conference 2025, Su characterized the current moment as part of a "ten-year super cycle" in which computing capabilities unlock progressively higher levels of intelligence [3].

Navigating US Export Restrictions and China Market Challenges

AMD's expansion plans face regulatory headwinds, particularly regarding sales to China. Su confirmed that AMD will pay the 15% export tax instituted by the Trump administration on MI308 chips the company plans to resume shipping to China [1][2]. The US government halted sales of these chips in April before reopening a licensing process over the summer that allowed vendors to apply for restricted shipments.

AMD told investors earlier this year that US export restrictions on the MI308 chips would result in approximately $800 million in inventory and purchase-commitment charges [1][2]. While China will not be the main driver of AMD's data-center revenue in the near term, it remains one of the few regions with customers capable of absorbing large AI accelerator deployments at short notice [2].

Industry Debate Over Massive Investments in AI Infrastructure

Su's optimistic outlook contrasts sharply with warnings from other tech leaders about unsustainable spending. IBM CEO Arvind Krishna estimates that building the 100 gigawatts of AI data-center capacity announced by the industry would require $8 trillion in investment [4]. By Krishna's math, AI ventures would need to generate $800 billion in profit annually just to cover the interest on such an infrastructure buildout; HSBC Holdings, meanwhile, estimates that OpenAI alone will burn through hundreds of billions of dollars in the coming years [4].

Source: TechSpot

Despite these concerns, Su maintains that well-capitalized companies are making rational decisions to invest heavily because "it's such a special point in time in terms of AI learning and AI capabilities" [3]. She argues that the shift from model training to inference workloads, combined with the need for fine-tuning across diverse use cases, continues to drive demand for AI compute. The constant refrain from customers, Su claims, is "we need more compute" [3].

Many analysts have drawn parallels between current AI valuations and the dot-com bubble, pointing to overinflated startup valuations and reports that many AI projects fail to progress beyond pilot stages [3]. OpenAI itself is valued at $500 billion despite not expecting to turn a profit until 2030 [3]. Yet Su believes each generation of AI models raises performance expectations, and that the industry's underlying trajectory supports sustained investment in training and inference clusters [2]. As she puts it, "as good as the models are today, the next one will be better" [1].
