Curated by THEOUTPOST
On Fri, 13 Sept, 12:04 AM UTC
2 Sources
[1]
AMD CEO Reiterates AI Roadmap "Acceleration", Stating That The AI Supercycle Has Just Started
AMD's CEO, Lisa Su, believes that the "AI supercycle" is just getting started and that the firm has accelerated its product roadmap to meet the massive demand from the markets. Looking at the competition in the AI market, it is evident that apart from NVIDIA, the only company in sight is AMD. This is not just because of its market presence; the firm has expanded its AI portfolio massively over the past few quarters, which has attracted the attention of several mainstream clients, including Microsoft and Amazon. While the firm hasn't managed to replicate NVIDIA's success in the markets, Team Red is optimistic about the future, which is why it feels that the AI hype has only just started to kick in.

"We have accelerated our AI roadmap and are on a one-year cadence of new products. It is an AI supercycle. AI is a much larger cycle than I would have expected five years ago. We are making big bets now for the next five years."

- AMD's CEO Lisa Su via Yahoo Finance

Team Red revealed plans to accelerate its AI roadmap to a one-year cadence a few months ago, and we have already seen a glimpse of it, since the firm is anticipated to hold its "Advancing AI" event next month. The event is set to feature the next-gen Instinct MI325X AI accelerator, and next year the firm is expected to unveil the successor to its flagship Instinct MI300 AI accelerator, the MI350. So overall, things are pretty stacked up for AMD in terms of advancements, but in terms of generational gaps, AMD does appear to be behind, since NVIDIA is set to push out its Blackwell architecture in the upcoming months. Team Red says that its MI300 portfolio is the "fastest growing" product in the firm's history, claiming that it has witnessed massive adoption this year and is slated to generate revenue of up to $4.5 billion in 2024, a $500 million increase over what was forecast in the previous quarterly results.
However, AMD's big bet would be its next-gen Instinct MI400 lineup, which is said to feature capabilities that might put the company on par with NVIDIA, and that's not all. AMD recently announced the merging of its consumer and data center GPU architectures into a single unified architecture called "UDNA," claiming that this will speed up development on both platforms. This is quite interesting, given that with this approach AMD intends to compete with the industry at the software level, potentially coming close to the likes of NVIDIA's CUDA with its own ROCm software stack. The firm is significantly scaling up development efforts, so it may not take long before we see a decisive breakout.
[2]
AMD chief talks AI opportunities and recent acquisitions
CEO Lisa Su sets sights on being best in GPUs, CPUs, FPGAs, everything... as Intel struggles

Comment Once the relative minnow of the chip industry, AMD senses blood in the water following a series of missteps by arch-rival Intel, and head honcho Lisa Su is wasting no time in talking up its game plan to investors.

The fabless semiconductor biz has long played second fiddle to that other Santa Clara chipmaker, as well as Nvidia in the GPU accelerator stakes. But in its most recent results, AMD reported datacenter revenues up 115 percent and forecast it would make $4.5 billion from GPUs this year.

Speaking at the Goldman Sachs Communacopia and Technology Conference this week, president and CEO Su confirmed that "AI is a huge priority for us," but added that end-to-end support for the technology is key.

"I'm a big believer that there's no one-size-fits-all in terms of computing. And so our goal is to be the high-performance computing leader that goes across GPUs and CPUs and FPGAs and also custom silicon as you put all of that together," she said.

It can't hurt that AMD's biggest rival is going through a tough patch at the moment, though Su did not mention Intel and surprisingly - or perhaps by design - the subject wasn't brought up by the Goldman Sachs host.

Billions are being spent on AI, but Su admitted the industry is still at an early stage in this particular compute cycle, and predicted there will be continued demand for more powerful infrastructure.

"Whether you're talking about training of large language models or you're talking about inferencing or you're talking about fine tuning or you're talking about all of these things; the workloads will demand more compute," she claimed, adding that it isn't just about GPUs.

"We believe GPUs will be the largest piece of that [forecast] $400 billion total addressable market, but there will also be some custom silicon associated with that.
And when we look at our opportunity there, it really is an end-to-end play across all of the different compute elements."

AMD moved to a one-year cadence for GPUs partly to keep up with Nvidia, but the CEO claimed it was also so that AMD could bulk out its portfolio with products covering the gamut of needs.

"Of course, you have the largest hyperscalers who are building out these huge training clusters, but you also have a lot of need for inference, some are more memory-intensive workloads that would really focus there, some are more datacenter power constrained," she explained.

"So what we've been able to do with our MI325 that's planned to launch here in the fourth quarter, and then the MI350 series and the MI400 series, is really just broaden the different products such that we are able to capture a majority of the addressable market with our product road map."

Su also outlined AMD's bid to become a key supplier of AI chips for the hyperscalers as well as the largest enterprise customers, part of which includes the pending purchase of ZT Systems, a company that makes high-performance servers for cloud operators.

"We're continuing to build out the entire infrastructure of what we need. So we just recently announced several software acquisitions, including the acquisition of Silo AI, which is a leading AI software company. And we just recently announced the acquisition of ZT Systems, which also builds out sort of the rack scale infrastructure necessary."

This decision was prompted by talking to customers and looking at what would be necessary three to five years down the road, Su claimed.

"The rack-scale infrastructure - because these AI systems are getting so complicated - really needs to be thought of in design, sort of at the same time in parallel with the silicon infrastructure.
So we're very excited about the acquisition of ZT," she explained, adding that "the knowledge of what are we trying to do on the system level will help us design a stronger and more capable road map."

The other advantage will be in speeding up validation of the various components required for AI infrastructure.

"The amount of time it takes to stand up these clusters is pretty significant. We found, in the case of MI300, we finished our validation, but customers needed to do their own validation cycle. And much of that was done in series, whereas with ZT as part of AMD, we'll be able to do much of that in parallel. And that time to market will allow us to go from design complete to large-scale systems running production workloads in a shorter amount of time, which will be very beneficial to our customers."

Another piece of the puzzle is software, and the chipmaker has another pending purchase here in Finnish developer Silo AI. AMD has been honing its own GPU software stack, ROCm, to compete with market kingpin Nvidia, whose associated tools, including the CUDA platform, have almost become a de facto standard.

"Over the last nine or ten months, we've spent a tremendous amount of time on leading workloads. And what we found is, with each iteration of ROCm, we're getting better and better... in terms of the tools, in terms of all the libraries, [and] in terms of knowing where the bottlenecks are in terms of performance," Su claimed.

"We've been able to demonstrate with some of the most challenging workloads that we've consistently improved performance. And in some cases, we've reached parity, in many cases, we've actually exceeded our competition, especially with some of the inference workloads because of our architecture, we have more memory bandwidth and memory capacity."

But AI isn't just a datacenter concern, as the AI PC is also one of the hyped trends the industry has been desperately pushing this year in an effort to pep up flagging desktop and laptop sales.
"I believe that we are at the start of a multiyear AI PC cycle," Su told the conference. "We never said AI PCs was a big 2024 phenomena. AI PCs is [making] a start in 2024. But more importantly, it's the most significant innovation that's come to the PC market in definitely the last ten-plus years," she opined, adding that it represents an opportunity for her company - traditionally the underdog in the PC market - especially as there is so much confusion around even the definition of an AI PC.

"We find that many enterprise customers are pulling us into their AI conversations. Because, frankly, enterprise customers want help, right? They want to know, 'Hey, how should I think about this investment? Should I be thinking about cloud or should I be thinking about on-prem or how do I think about AI PCs?' And so we found ourselves now in a place of more like a trusted adviser with some of these enterprise accounts."

Speaking more broadly, Su said that AI is an opportunity that cannot really be ignored by AMD.

"I think this AI sort of technology arc is really a once-in-50-years type thing, so we have to invest. That being the case, we will be very disciplined in that investment. And so we expect to grow opex slower than we grow revenue. But we do see a huge opportunity in front of us."

Su ended with her sales pitch, of sorts, on what enterprise customers need to bear in mind at the current time.

"This is a computing super cycle, so we should all recognize that. And there is no one player or architecture that's going to take over. I think this is a case where having the right compute for the right workload and the right application is super important."

And that's what AMD has been working towards over the last five-plus years, she claimed - to have the best CPU, GPU, FPGA, and semi-custom capabilities to meet customer needs.
AMD's CEO Lisa Su emphasizes the company's accelerated AI roadmap and the ongoing AI industry growth. She discusses AMD's strategic positioning and future plans in the rapidly evolving AI market.
AMD's CEO, Dr. Lisa Su, has reiterated the company's commitment to an accelerated AI roadmap, emphasizing that the AI industry is still in its early stages of growth. Speaking at the Goldman Sachs Communacopia + Technology Conference, Su highlighted AMD's strategic positioning in the rapidly evolving AI market [1].
Su expressed her belief that the AI industry is experiencing a "supercycle" of growth and innovation. She stated that the market is still in its infancy, with significant potential for expansion across various sectors. The CEO emphasized that AI's impact would be felt across multiple industries, driving demand for advanced computing solutions [1].
The company is focusing on several key areas in the AI market. Su highlighted AMD's efforts to accelerate its roadmap to meet the growing demand in these sectors. The company aims to capitalize on the increasing need for AI-capable hardware in both consumer and enterprise markets [2].
While acknowledging the competitive nature of the AI chip market, Su expressed confidence in AMD's ability to gain market share. She noted that the company's x86 CPU business provides a strong foundation for expanding into AI-specific hardware. AMD's strategy involves leveraging its existing strengths while developing new technologies tailored for AI applications [2].
Su acknowledged the challenges in predicting the exact trajectory of AI market growth but remained optimistic about AMD's prospects. She emphasized the importance of staying agile and responsive to market demands. The CEO also highlighted the need for continued innovation in chip design and manufacturing to meet the increasing computational requirements of AI workloads [1].
AMD is actively working on developing a robust ecosystem around its AI hardware. Su stressed the importance of collaboration with software developers, cloud service providers, and enterprise customers to create comprehensive AI solutions. This approach aims to ensure that AMD's hardware is well supported by a wide range of AI applications and frameworks [2].
AMD's AI GPU business, led by the Instinct MI300, has grown rapidly to match the company's entire CPU operations in revenue. CEO Lisa Su predicts significant market growth, positioning AMD as a strong competitor to Nvidia in the AI hardware sector.
4 Sources
AMD announces its new MI325X AI accelerator chip, set to enter mass production in Q4 2024, aiming to compete with Nvidia's upcoming Blackwell architecture in the rapidly growing AI chip market.
25 Sources
AMD reports strong Q2 2024 earnings, driven by exceptional AI chip sales and data center growth. The company's Instinct MI300 accelerators gain traction in the AI market, challenging NVIDIA's dominance.
3 Sources
AMD's CEO Lisa Su asserts that the AI chip market has room for multiple players, challenging Nvidia's perceived monopoly. Su emphasizes the need for diverse solutions in the rapidly evolving AI landscape.
3 Sources
AMD unveils its next-generation AI accelerator, the Instinct MI325X, along with new networking solutions, aiming to compete with Nvidia in the rapidly growing AI infrastructure market.
16 Sources
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved