AMD Unveils Next-Gen AI Accelerators: MI325X and MI355X to Challenge Nvidia's Dominance


AMD announces its latest AI GPU accelerators, the Instinct MI325X and MI355X, aiming to compete with Nvidia's offerings in the rapidly growing AI chip market.


AMD Launches Instinct MI325X: A Mid-Cycle Upgrade

AMD has unveiled its latest AI GPU accelerator, the Instinct MI325X, at its "Advancing AI" event. The new accelerator builds on the CDNA 3 architecture of its predecessor, the MI300X, offering significant improvements in memory capacity and bandwidth [1].

Key features of the MI325X include:

  • 256 GB of HBM3e memory (up from 192 GB in MI300X)
  • 6 TB/s of memory bandwidth
  • 2.6 PFLOPs of FP8 and 1.3 PFLOPs of FP16 performance
  • 153 billion transistors
  • 1,000 W TDP per GPU
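To put the 256 GB figure in context, here is a back-of-envelope sketch of which large-model sizes would fit on a single accelerator. The model sizes and the weights-only assumption are illustrative choices, not figures from AMD:

```python
# Which model sizes fit in 256 GB of HBM3e?
# Assumptions (not from AMD): decimal GB, weights only -- no KV cache,
# activations, or framework overhead; model sizes are illustrative.
HBM_GB = 256

for params_b in (70, 180, 405):  # parameters, in billions
    for precision, bytes_per_param in (("FP16", 2), ("FP8", 1)):
        weights_gb = params_b * bytes_per_param
        verdict = "fits" if weights_gb <= HBM_GB else "does not fit"
        print(f"{params_b}B @ {precision}: {weights_gb} GB -> {verdict}")
```

By this rough measure, a 70B-parameter model fits in FP16 with room left for the KV cache, while models in the multi-hundred-billion range need FP8 or multiple GPUs.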

AMD claims the MI325X outperforms NVIDIA's H200 in various AI workloads, including a 40% performance advantage in Mixtral 8x7B inference [1][2].

Next-Generation MI355X: CDNA 4 Architecture

Looking ahead, AMD has also previewed its next-generation AI accelerator, the Instinct MI355X, scheduled for release in the second half of 2025 [3]. Built on TSMC's 3nm process, the MI355X will feature:

  • CDNA 4 architecture
  • 288 GB of HBM3e memory
  • 8 TB/s of memory bandwidth
  • Support for FP4 and FP6 data types
  • Up to 2.3 PFLOPs of FP16 and 4.6 PFLOPs of FP8 performance
  • 9.2 PFLOPs of FP4/FP6 compute performance
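The generational uplift implied by these figures can be checked with simple arithmetic. The numbers below are taken directly from the two spec lists in this article; this is purely illustrative, not a benchmark:

```python
# Ratio of MI355X to MI325X on the specs quoted in this article.
mi325x = {"HBM3e (GB)": 256, "bandwidth (TB/s)": 6.0,
          "FP8 (PFLOPs)": 2.6, "FP16 (PFLOPs)": 1.3}
mi355x = {"HBM3e (GB)": 288, "bandwidth (TB/s)": 8.0,
          "FP8 (PFLOPs)": 4.6, "FP16 (PFLOPs)": 2.3}

for spec, base in mi325x.items():
    # prints one generational ratio per spec line
    print(f"{spec}: {mi355x[spec] / base:.2f}x")
```

Note that the raw FP8 and FP16 throughput ratios land well below AMD's headline projections, which compare against CDNA 3 under different conditions (e.g. new FP4/FP6 data types and inference-level metrics).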

AMD projects a 35x leap in AI inference performance over CDNA 3, along with a 7x increase in AI compute and a 50% increase in memory capacity and bandwidth [1][4].

Competitive Landscape and Market Impact

The introduction of these new accelerators positions AMD to compete more effectively with NVIDIA in the AI chip market. While NVIDIA currently dominates with over 90% market share, AMD's aggressive development cycle and performance claims suggest a narrowing gap [5].

Key competitive aspects include:

  • Memory capacity advantage: MI355X's 288 GB vs. NVIDIA H200's 141 GB
  • Comparable FP4 compute: 9.2 PFLOPs for MI355X vs. 9 PFLOPs for NVIDIA's Blackwell B200
  • Focus on inference performance: AMD claims significant advantages in LLM inference tasks

Industry Adoption and Future Outlook

AMD's Instinct accelerators have gained support from major AI companies and cloud providers, including Meta, OpenAI, and Microsoft [1]. The company's commitment to an open ecosystem and customer-focused portfolio has contributed to this adoption.

As the AI accelerator market continues to grow, AMD's yearly release cadence for Instinct GPUs mirrors NVIDIA's approach, signaling intensifying competition in this crucial tech sector [2][4]. The success of these new accelerators could reshape the AI hardware landscape in the coming years.

TheOutpost.ai


© 2025 Triveous Technologies Private Limited