5 Sources
[1]
AMD CEO Lisa Su Says Concerns About an AI Bubble Are Overblown
Lisa Su leads Nvidia's biggest rival in the AI chip market. When asked at WIRED's Big Interview event if AI is a bubble, company said "Emphatically, from my perspective, no." Earlier this year, WIRED said that AMD CEO Lisa Su was "out for Nvidia's blood." The American chipmaker is still small compared to the juggernaut that is Nvidia -- their market caps are $353 billion and $4.4 trillion, respectively -- but Su's company is gaining steam. Today, when Su took the stage at WIRED's Big Interview conference in San Francisco, she had something else in her sights: the AI bubble. When asked by WIRED senior writer Lauren Goode if the tech industry is in an AI bubble, her response was "emphatically, from my perspective, no." The AI industry is going to need scores of chips from companies like AMD, and fears of such a bubble, Su said, are "somewhat overstated." That might sound bold, but boldness is Su's whole deal. Since she became AMD's CEO in 2014, she has increased the company's market cap from $2 billion to $300 billion. Now, Su is betting big on the need for much more computing power for AI, and the data centers needed to provide that. Still, there are plenty of hurdles ahead for AMD. One is all of those data centers being built, and another is getting its chips out into the hands of as many customers as possible. During the discussion, Goode asked the AMD CEO about selling chips to China. She confirmed that AMD will pay a 15 percent tax instituted by the Trump administration on MI308 chips it plans to resume shipping to China. The US government previously halted sales of the chips to China, but then began reviewing applications again over the summer. AMD said earlier this year that US export restrictions on the MI308 chips would cost the company roughly $800 million. Earlier this year, AMD made a huge deal with OpenAI, under which the AI company will deploy 6 gigawatts of AMD's Instinct GPUs over the course of several years. 
As part of the deal, AMD agreed to allow OpenAI to buy 160 million shares of the company's stock for a penny per share. effectively giving it a 10 percent stake in the company. The first gigawatt deployment is set to rollout in the second half of next year. It's one of several big bets AMD is making on AI data centers to power artificial intelligence. What Su said she's not worried about is competition from Nvidia, or even Google or Amazon, both of which have their own chip-making plans. "When I look at the landscape, what keeps me up at night is 'How do we move faster when it comes to innovation?'" Su said. Su believes that AI is still in its infancy and her company needs to be ready to provide chips for the future. "As good as the models are today," she says, "the next one will be better." There's huge potential in AI, and "there's not a reason not to keep pushing that technology" into the future.
[2]
AMD CEO Lisa Su 'emphatically' rejects talk of an AI bubble -- says claims are 'somewhat overstated'
AMD CEO says long-term demand for compute will justify today's rapid data-center buildout. AMD CEO Lisa Su used her appearance at WIRED's Big Interview conference in San Francisco to push back against growing speculation that the AI sector is overheating. Asked whether the industry is in a bubble, Su replied "emphatically" no, arguing that concerns are "somewhat overstated" and that AI is still in its infancy. According to Su, AMD needs to be ready to provide chips for the future -- "there's not a reason not to keep pushing that technology." Her remarks come as AMD prepares for several of its largest data-center commitments to date, including a multi-gigawatt accelerator deployment with OpenAI and the resumption of MI308 shipments to China under a new export-control framework. OpenAI plans to deploy six gigawatts of Instinct GPUs over the next several years under a joint announcement the companies made earlier this year. The first one-gigawatt block is scheduled for the second half of next year. As part of that arrangement, OpenAI secured the option to buy up to 160 million AMD shares at a penny each once deployment milestones are met. AMD presented the structure as a way to align long-term incentives around infrastructure delivery rather than a short window of product availability. Meanwhile, the company's operations in China have been shaped by a different kind of uncertainty. AMD has confirmed that it will pay a 15% export tax on MI308 shipments under revised export rules, and that it is ready to do so. Washington halted sales of the part in April before reopening a licensing process that allowed vendors to apply for restricted shipments. AMD has told investors that the original controls would create up to $800 million in inventory and purchase-commitment charges, which makes re-entering the market on known terms a positive step, even with the additional fee. 
China will not be the main driver of AMD's data-center revenue in the near term, but it remains one of the few regions with customers capable of absorbing large accelerator batches at short notice. Su's comments also addressed pressure from hyperscalers that are expanding their in-house silicon portfolios. She argued that AMD's challenge is not matching any single rival but advancing its own roadmap quickly enough to capture the next wave of deployments. In her view, each generation of AI models raises performance expectations, and the industry's underlying trajectory supports sustained investment in training and inference clusters. For a company that has spent much of the past decade rebuilding its position in high-performance computing, the coming cycle will test how well that confidence translates into delivered hardware and long-term customer commitments.
[3]
Tech leaders fill $1T AI bubble, insist it doesn't exist
Even as enterprises defer spending and analysts spot dotcom-era warning signs

Tech execs are adamant the AI craze is not a bubble, despite the vast sums of money being invested, overinflated valuations given to AI startups, and reports that many projects fail to make it past the pilot stage. HPE is one of the giants tapping into booming demand for high-performance hardware to drive AI development, and the president and general manager of its Networking division, Rami Rahim, told us he doesn't see that ending anytime soon. "It wouldn't be the first time in the history of this industry that there would be a correction, and that's fine, you know. So we would adjust, and we would be just fine in the end. But I don't see it at this point in time. Right now, I don't see any signs of a slowdown based on the projects that are in the market right now and the conversations and the plans that we're talking about with our customers," he said. Rahim was speaking to The Register at HPE's Discover event in Barcelona this week, where it showcased various new and upcoming technologies, most with an AI twist. Asked about many AI projects not making it into production, he said: "There are pilots, but I also think there's a lot of actual, real value being created with production products. I mean, I can tell you from the standpoint of what we do in engineering inside of HPE Networking, more and more of our developers are getting far more efficient by leveraging copilots to write software and to verify software." But aren't there concerns about the quality of the code produced by AI assistants? "In the early days, there was a lot of concern, but I've seen an inflection point, the technology and trust based on more and more experience with it has actually improved dramatically. So it takes time. These things never happen overnight, but I sincerely do see the value," Rahim said.
Many people have drawn parallels between the current situation and the dotcom market bust at the turn of the millennium, but Rahim said what is happening with AI right now is not completely analogous to what happened then. "I don't think it's a good idea to look at the past as an indicator of what might happen in the future," he told us. "It's just different technologies right now. I mean, right now, the appetite for consumption of AI products, GPU cycles, is enormous," Rahim said. Whether that changes in a year or two is "difficult to say," he admitted. AMD chief exec Lisa Su also maintains that AI is definitely not a bubble. Speaking at the UBS Global Technology and AI Conference 2025 this week, Su said: "I spend most of my time talking to the largest customers, the largest AI users out there. And there's not a concept of a bubble." Instead, she believes the tech industry is a couple of years into a "ten-year super cycle," one where "computing allows you to unlock more and more levels of capability, more and more levels of intelligence." The cycle started with model training as the primary use case, but is now shifting to inference, Su noted, and with no single model fitting all situations and use cases, customers are having to fine-tune to meet their requirements, and this continues to drive demand for infrastructure. "The one thing that is constant as we talk to customers is we need more compute. That at this point, if there was more compute installed, more compute capability, we would get to the answer faster," Su claimed. As the head of a company that makes CPUs, GPUs, ASICs, and FPGAs, she would say that, of course. But what of companies like the industry darling OpenAI, which is valued at $500 billion even though it doesn't expect to make money until 2030 and may have to raise hundreds of billions to cover losses and its investments in AI datacenters?
"I think all of the capex forecasts that have increased over the last three to six months have certainly shown that there is confidence that those investments are going to lead to better capabilities going forward," Su said. "And so, from the standpoint of do we see a bubble, we don't see a bubble. What we do see is very well-capitalized companies, companies that have significant resources, using those resources at this point in time because it's such a special point in time in terms of AI learning and AI capabilities," she added. This is despite OpenAI CEO Sam Altman admitting earlier this year that he thinks the industry is in the midst of a bubble. When asked about complaints from many early adopters that there is little or no return on the investments they have made in AI, Su claimed this has not been AMD's experience. "What started as, let's call it, let's try AI for our internal use cases, has now turned into significant clear productivity wins going forward. So there's no question that there is a return on investment for investment in AI," she said. Su did concede that AI has not lived up to all the hype being broadcast about it. "If you look at today's AI, as much progress as we've made over the last couple of years, we're still not at the point where we're fully exploiting the potential of AI," she said. "And I still say that we are in the very, very early innings of seeing that payoff. So as we talk to the largest enterprise customers, I think every conversation is, 'Lisa, how can you help us, how can we learn faster so that we can take advantage of the technology?' So I think the return on investment certainly will be there." Meanwhile, Microsoft was forced to deny reports this week which claimed several of its divisions had lowered growth targets for products using AI after sales staff missed goals for the fiscal year that ended in June. 
Elsewhere, the head of South Korean conglomerate SK Group, which owns memory chipmaker SK hynix, opined that AI stocks might be due a haircut after rising too fast and too high. "I don't see a bubble in the AI industry," SK Group chairman Chey Tae-won said at a forum in Seoul, as reported by Reuters. "But when you look at the stock markets, they rose too fast and too much, and I think it is natural that there could be some period of corrections," he added. This could come soon, according to research firm Forrester, which recently found that large organizations are set to defer a large chunk of planned AI spending until 2027 because of the current gap between vendor promises and reality. Even the Bank of England's Financial Policy Committee has warned of the dangers of a sudden correction in the financial markets because of AI stocks, comparing the risks to the dotcom bubble. ®
[4]
AMD and IBM's CEO doesn't see an AI bubble, just $8 trillion in data centers
Serving tech enthusiasts for over 25 years. TechSpot means tech analysis and advice you can trust. Editor's take: Normal people are now shelling out hundreds of dollars for modest RAM upgrades, while the companies powering the AI boom are looking forward to even more market hysteria. And despite record spending, spiraling hardware costs, and mounting skepticism from analysts, the CEOs of IBM and AMD insist there is no AI bubble - or at least not one that threatens their business. IBM CEO Arvind Krishna argues that AI is not going through a financial bubble, despite many analysts, banks, and investors claiming otherwise. Krishna, who has led the former PC giant since 2020, is more concerned with the unsustainable infrastructure investments planned by OpenAI and other AI ventures seeking AGI paradise. Krishna recently appeared on a Verge podcast, where he discussed quantum computing, the Jeopardy-winning supercomputer Watson, AI, and more. He doesn't believe there is an AI bubble, but warns that many AI-focused companies will never see a return on investment if they keep spending as recklessly as they are now. The Indian American executive estimates that a single one-gigawatt AI data center requires around $80 billion. If a company wants to commit massive amounts of GPUs, RAM, and power to build a 20 to 30 gigawatt capacity, that cost rises to $1.5 trillion. The entire AI industry has already talked up about 100 gigawatts of capacity in its grand, earth-shaking announcements, which would require $8 trillion to actually build the data-crunching facilities behind them. No one is going to recoup that kind of money anytime soon. Krishna said AI ventures would need to generate $800 billion in profit just to pay interest on an $8 trillion infrastructure buildout - and no company in the industry is anywhere near those numbers. HSBC Holdings recently estimated that OpenAI will likely burn hundreds of billions of dollars for years. 
The pursuit of artificial general intelligence (AGI) is the driving bet behind this sort of unrealistic financial commitment, Krishna said. But AGI will require much more than a few hallucinating large language models, with the IBM CEO estimating a zero to one percent chance of successfully building the first real AGI in history. IBM is mostly focused on the infrastructure and cloud side of the IT business these days, so the company has little interest in pushing GPUs or AI accelerators as the next big thing. AMD, however, is trying to do exactly that. CEO Lisa Su says the industry's growing concerns about an AI bubble are overblown. In a recent interview with Wired, the head of the x86 chipmaker said she doesn't see any bubble from her perspective. There's an industry that needs massive amounts of chips to build new computing capacity, and AMD is more than ready to supply at least part of that demand. The Santa Clara-based company recently struck a major deal with OpenAI, committing to deliver 6 gigawatts of Instinct GPUs over several years. Su isn't worried about an AI bubble so much as she is about AMD's ability to keep up with chip demand from AI ventures. She said AI has massive potential, and sees no reason not to continue pushing this planet-straining, job-reshaping technology for the foreseeable future. As Rockstar's co-founder recently suggested, the executives promoting AI everywhere aren't necessarily the most human or creative people around.
[5]
AMD's Lisa Su doesn't believe there's an AI bubble: 'Emphatically, from my perspective, no'
As AI continues to balloon and pull in even more investment, one of the fiercest debates (other than copyright, ethics, and the environment) is about whether or not it's a bubble. AMD's Lisa Su has weighed in on the debate. Recently, in an interview with Wired, the AMD chief was asked if she thought AI was a bubble. Her response: "emphatically, from my perspective, no." Su claims that fears around a potential bubble are "somewhat overstated". Unfortunately, as this is part of the Wired Big Interview series, and the larger interview hasn't been published yet, it's hard to gauge Su's full stance beyond this answer. In its barest form, a bubble is when asset prices in some industry far exceed actual value, due in part to speculation. The 2008 financial crisis and the popping of the dot-com bubble didn't mean people stopped buying houses or using the internet; rather, the value tied up in those markets exceeded what they were worth, and those betting billions on them could no longer see a return on investment. Investment is certainly there when it comes to AI. Nvidia is the first company to have been valued at $5 trillion. Nvidia, in turn, invested $100 billion into OpenAI back in September. In October, AMD signed a multi-year deal with OpenAI, so there's some justification for wishful thinking here. Talking to Wired, Su says, "When I look at the landscape, what keeps me up at night is 'How do we move faster when it comes to innovation?'" Opinions are divided on whether or not there's an AI bubble, and plenty of industry figures have weighed in. A common argument, regardless of whether people think AI is a bubble or not, is that many companies will be disrupted due to the gold rush that has happened with the technology. As the IBM CEO puts it, "maybe two or three" of every ten companies might achieve what they want to with AI, and the rest will have to stomach losses made in the process.
Nvidia is one of the companies that has benefited the most from the AI boom, with stock prices rising from around $3 in 2019 to over $180 at the time of writing. However, AMD has also had a boom, with stock prices rising from $30-40 in 2019 to around $215. But a bubble isn't just when companies have major booms: it's when companies have major booms and the asset value doesn't align with legitimate value. Whether or not AI is a bubble will be partially dependent on whether AI companies can actually match the potential they're selling consumers. Another factor is whether wider adoption is possible long-term. AI is such an all-encompassing force that even the US and UK governments have come out in favour of it, though Sam Altman claims he's not looking for a government bailout if things go bad. While the whole idea of an AI bubble is still up for debate, most seem to agree that only a few can win, and one just has to hope major governments are on the winners' podium.
AMD CEO Lisa Su has emphatically rejected concerns about an AI bubble, calling such fears "somewhat overstated" at WIRED's Big Interview conference. Her comments come as AMD prepares for a massive 6-gigawatt Instinct GPU deployment with OpenAI and navigates a 15% export tax on MI308 chips bound for China.
Lisa Su, CEO of AMD, has taken a firm stance against growing speculation that the artificial intelligence sector is overheating. Speaking at WIRED's Big Interview conference in San Francisco, Su responded "emphatically" when asked whether the tech industry is experiencing an AI bubble, stating from her perspective the answer is "no" [1]. The AMD chief argued that concerns about an AI bubble are "somewhat overstated" and insisted that AI remains in its infancy, requiring continued investment in compute power and infrastructure [2].
Su's confidence reflects AMD's position in the AI chip market, where the company has grown from a $2 billion market cap when she became CEO in 2014 to approximately $353 billion today [1]. While still dwarfed by Nvidia's $4.4 trillion valuation, AMD is gaining momentum through strategic partnerships and massive investments in AI data centers. Su emphasized that what keeps her awake at night is not competition from Nvidia or hyperscalers like Google and Amazon, but rather "how do we move faster when it comes to innovation" [1].

AMD's commitment to the AI sector is exemplified by its landmark deal with OpenAI, announced earlier this year. The partnership involves deploying 6 gigawatts of AMD's Instinct GPUs over several years, with the first one-gigawatt block scheduled for rollout in the second half of next year [1][2]. In a unique arrangement, AMD agreed to allow OpenAI to purchase 160 million shares of the company's stock for a penny per share, effectively granting a 10% stake once deployment milestones are met [1].
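The warrant terms reported here lend themselves to a quick back-of-the-envelope check. The sketch below (plain Python, no external libraries) simply multiplies out the reported figures; the implied share count is our own inference from the roughly 10% stake, not a number stated in the articles.

```python
# Sanity check of the reported AMD-OpenAI warrant terms:
# 160 million shares at $0.01 each, described as roughly a 10% stake.

shares = 160_000_000         # warrant shares OpenAI may purchase
price_per_share = 0.01       # one penny per share
stake_fraction = 0.10        # reported ~10% stake (approximate)

# Total cash cost to OpenAI if fully exercised
total_cost = shares * price_per_share

# Implied AMD share count, inferred from the ~10% figure (our assumption)
implied_shares_outstanding = shares / stake_fraction

print(f"Total cost to OpenAI: ${total_cost:,.0f}")
print(f"Implied shares outstanding: {implied_shares_outstanding:,.0f}")
```

The striking asymmetry is visible immediately: roughly $1.6 million buys an option on a stake in a company the sources value in the hundreds of billions, which is why the articles frame the warrant as milestone-gated compensation rather than an ordinary share sale.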
AMD presented this structure as a way to align long-term incentives around infrastructure delivery rather than focusing on a short window of product availability [2]. This massive commitment underscores the demand for AI compute that Su believes will justify today's rapid data-center buildout. Speaking at the UBS Global Technology and AI Conference 2025, Su characterized the current moment as part of a "ten-year super cycle" where computing capabilities unlock progressively higher levels of intelligence [3].

AMD's expansion plans face regulatory headwinds, particularly regarding sales to China. Su confirmed that AMD will pay a 15% export tax instituted by the Trump administration on MI308 chips the company plans to resume shipping to China [1][2]. The US government previously halted sales of these chips in April before reopening a licensing process over the summer that allowed vendors to apply for restricted shipments. AMD told investors earlier this year that US export restrictions on the MI308 chips would result in approximately $800 million in inventory and purchase-commitment charges [1][2]. While China will not be the main driver of AMD's data-center revenue in the near term, it remains one of the few regions with customers capable of absorbing large AI accelerator deployments at short notice [2].
Su's optimistic outlook contrasts sharply with warnings from other tech leaders about unsustainable spending. IBM CEO Arvind Krishna estimates that building the 100 gigawatts of AI data-center capacity announced by the industry would require $8 trillion in investment [4]. According to Krishna, AI ventures would need to generate $800 billion in profit annually just to pay interest on such an infrastructure buildout, with HSBC Holdings estimating that OpenAI alone will burn hundreds of billions of dollars for years [4].
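Krishna's headline numbers as reported in [4] are internally consistent, as a short calculation shows. In the sketch below, the cost per gigawatt and the 100 GW total come from the article; the roughly 10% interest rate is our inference from the $800 billion and $8 trillion figures, not a rate he stated.

```python
# Checking Krishna's buildout arithmetic as reported:
# ~$80B per gigawatt of AI data-center capacity, ~100 GW announced.

cost_per_gw = 80e9        # dollars per gigawatt (Krishna's estimate)
announced_gw = 100        # capacity announced across the industry

# Total cost of the announced buildout
total_buildout = cost_per_gw * announced_gw

# Interest burden cited in the article, and the financing rate it implies
interest_needed = 800e9                          # per the article
implied_rate = interest_needed / total_buildout  # our inference

print(f"Total buildout: ${total_buildout / 1e12:.0f} trillion")
print(f"Implied interest rate: {implied_rate:.0%}")
```

At roughly a 10% implied cost of capital, the $800 billion figure reads as a round-number illustration of annual financing costs, which is how the surrounding coverage treats it.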
Despite these concerns, Su maintains that well-capitalized companies are making rational decisions to invest heavily because "it's such a special point in time in terms of AI learning and AI capabilities" [3]. She argues that the shift from model training to inference workloads, combined with the need for fine-tuning across diverse use cases, continues to drive demand for AI compute. The constant refrain from customers, Su claims, is "we need more compute" [3].

Many analysts have drawn parallels between current AI valuations and the dot-com bubble, pointing to overinflated startup valuations and reports that many AI projects fail to progress beyond pilot stages [3]. OpenAI itself is valued at $500 billion despite not expecting to turn a profit until 2030 [3]. Yet Su believes each generation of AI models raises performance expectations, and the industry's underlying trajectory supports sustained investment in training and inference clusters [2]. As she puts it, "as good as the models are today, the next one will be better" [1].