2 Sources
[1]
Amazon Web Services is building equipment to cool Nvidia GPUs as AI boom accelerates
The letters AI, which stands for "artificial intelligence," stand at the Amazon Web Services booth at the Hannover Messe industrial trade fair in Hannover, Germany, on March 31, 2025.

Amazon said Wednesday that its cloud division has developed hardware to cool down next-generation Nvidia graphics processing units that are used for artificial intelligence workloads.

Nvidia's GPUs, which have powered the generative AI boom, require massive amounts of energy. That means companies using the processors need additional equipment to cool them down.

Amazon considered erecting data centers that could accommodate widespread liquid cooling to make the most of these power-hungry Nvidia GPUs. But that process would have taken too long, and commercially available equipment wouldn't have worked, Dave Brown, vice president of compute and machine learning services at Amazon Web Services, said in a video posted to YouTube.

"They would take up too much data center floor space or increase water usage substantially," Brown said. "And while some of these solutions could work for lower volumes at other providers, there simply wouldn't be enough liquid-cooling capacity to support our scale."

Instead, Amazon engineers conceived of the In-Row Heat Exchanger, or IRHX, which can be plugged into existing and new data centers. More traditional air cooling was sufficient for previous generations of Nvidia chips.

Customers can now access the AWS service as computing instances that go by the name P6e, Brown wrote in a blog post. The new systems accompany Nvidia's design for dense computing power. Nvidia's GB200 NVL72 packs a single rack with 72 Nvidia Blackwell GPUs that are wired together to train and run large AI models.

Computing clusters based on Nvidia's GB200 NVL72 have previously been available through Microsoft or CoreWeave. AWS is the world's largest supplier of cloud infrastructure.

Amazon has rolled out its own infrastructure hardware in the past. The company has custom chips for general-purpose computing and for AI, and has designed its own storage servers and networking routers. By running homegrown hardware, Amazon depends less on third-party suppliers, which can benefit the company's bottom line. In the first quarter, AWS delivered its widest operating margin since at least 2014, and the unit is responsible for most of Amazon's net income.

Microsoft, the second-largest cloud provider, has followed Amazon's lead and made strides in chip development. In 2023, the company designed its own systems, called Sidekicks, to cool the Maia AI chips it developed.
[2]
Amazon Unveils Powerful New AI Servers To Support Nvidia's Most Advanced Chips - NVIDIA (NASDAQ:NVDA), Amazon.com (NASDAQ:AMZN)
Amazon.com AMZN and Amazon Web Services (AWS) announced on Wednesday that they have designed the P6e-GB200 UltraServers to meet the skyrocketing compute demands driven by the latest advancements in generative AI, including trillion-parameter foundation models, reasoning systems, and agentic AI.

AWS has engineered custom hardware to cool next-generation Nvidia NVDA GPUs used in AI workloads. Recognizing the high energy demands of Nvidia's latest chips, AWS decided not to wait and build new liquid-cooled data centers. Instead, its engineers developed the In-Row Heat Exchanger (IRHX), a solution that integrates with existing and new data centers to manage heat from dense GPU configurations, CNBC reported Wednesday.

These UltraServers feature up to 72 Nvidia Blackwell GPUs, all interconnected using fifth-generation Nvidia NVLink, allowing the GPUs to operate as a single, unified compute system.

Dave Brown, AWS VP of Compute and Machine Learning Services, explained that off-the-shelf cooling options couldn't meet the demands of Nvidia's GB200 NVL72 systems, which pack 72 Blackwell GPUs into a single rack. Traditional air cooling was sufficient for previous GPU generations, but AWS had to innovate to keep pace with the increased compute intensity of today's systems.

AWS now offers this infrastructure through its P6e instances, providing customers with the performance required to train and deploy large-scale AI models with advanced thermal support.

Microsoft MSFT and CoreWeave CRWV have previously offered computing clusters built on Nvidia's GB200 NVL72 architecture.

In March, Intel INTC positioned its SuperFluid cooling technology as a strong contender for managing the heat generated by Nvidia's high-powered GB300 AI chips. Launched in 2023, SuperFluid has reportedly passed verification tests showing it can handle thermal loads up to 1,500 watts, surpassing the 1,400-watt consumption expected from Nvidia's Blackwell Ultra GB300 chips.

In June, AI server company Super Micro Computer SMCI expanded its liquid-cooled AI server solutions for Nvidia's Blackwell architecture into the European market. The company introduced over 30 solution stacks for Nvidia HGX B200, GB200 NVL72 and RTX PRO 6000 Blackwell deployments, helping enterprises speed up AI factory rollouts. Supermicro is scaling its portfolio with new liquid-cooled systems, including a 4U front I/O HGX B200 server powered by its DLC-2 technology. These systems enable customers to run high-density AI workloads while keeping them thermally efficient. By working closely with Nvidia, Supermicro is also preparing to support next-generation Blackwell Ultra chips, such as the GB300 NVL72 and HGX B300, later this year.

Price Action: AMZN stock was down 0.34% at $221.79 premarket as of the last check on Thursday. NVDA is up 0.85%.
Amazon Web Services has developed a custom cooling solution, the In-Row Heat Exchanger (IRHX), to manage heat from dense GPU configurations in AI workloads, particularly for Nvidia's latest chips. This innovation allows AWS to offer high-performance AI infrastructure through its new P6e instances.
Amazon Web Services (AWS) has unveiled a groundbreaking solution to address the cooling challenges posed by the latest generation of NVIDIA GPUs used in artificial intelligence workloads. The company's custom-designed In-Row Heat Exchanger (IRHX) represents a significant advancement in managing the thermal output of high-density GPU configurations 1.
As the AI boom accelerates, the demand for more powerful GPUs has led to increased energy consumption and heat generation. NVIDIA's latest GPUs, particularly the GB200 NVL72 system, which packs 72 Blackwell GPUs into a single rack, require innovative cooling solutions beyond traditional air cooling methods 1.
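A rough back-of-the-envelope estimate illustrates the scale of the problem. The Python sketch below assumes roughly 1,200 watts per Blackwell-class GPU and a 20% overhead for CPUs, NVLink switches, and power delivery; both figures are illustrative assumptions, not AWS or NVIDIA specifications.

```python
# Back-of-the-envelope rack thermal load for a GB200 NVL72-class system.
# Assumed figures (illustrative only, not vendor specifications):
GPUS_PER_RACK = 72        # GB200 NVL72 puts 72 Blackwell GPUs in one rack
WATTS_PER_GPU = 1_200     # assumed per-GPU draw for Blackwell-class parts
OVERHEAD_FACTOR = 1.2     # assumed 20% for CPUs, NVLink switches, power delivery

rack_load_kw = GPUS_PER_RACK * WATTS_PER_GPU * OVERHEAD_FACTOR / 1_000
print(f"Estimated rack thermal load: ~{rack_load_kw:.0f} kW")
```

Under these assumptions the estimate lands around 100 kW per rack, far more heat than a conventional air-cooled data center row is built to remove, which is why liquid-based approaches such as the IRHX come into play.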
Dave Brown, VP of Compute and Machine Learning Services at AWS, explained that commercially available cooling equipment was insufficient for their needs:
"They would take up too much data center floor space or increase water usage substantially. And while some of these solutions could work for lower volumes at other providers, they simply wouldn't be enough liquid-cooling capacity to support our scale." 1
Rather than constructing entirely new liquid-cooled data centers, which would have been time-consuming and costly, AWS engineers developed the IRHX. This solution can be integrated into both existing and new data centers, providing the necessary cooling capacity without requiring a complete infrastructure overhaul 2.
The new cooling technology enables AWS to offer P6e instances, which provide customers with the computing power needed to train and deploy large-scale AI models. These instances are based on NVIDIA's GB200 NVL72 architecture, which allows the 72 GPUs to function as a unified compute system 2.
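For customers, this capacity surfaces as an ordinary EC2 instance family. As a minimal sketch, the Python snippet below uses boto3's DescribeInstanceTypeOfferings API to list which regions advertise P6e instance types; the "p6e*" wildcard and the chosen region are assumptions about AWS's naming and availability, so the exact P6e-GB200 instance type names should be confirmed against the EC2 documentation.

```python
# Sketch: list AWS regions that advertise P6e instance types (assumed "p6e*" naming).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region chosen for illustration

paginator = ec2.get_paginator("describe_instance_type_offerings")
offerings = []
for page in paginator.paginate(
    LocationType="region",
    Filters=[{"Name": "instance-type", "Values": ["p6e*"]}],
):
    offerings.extend(page["InstanceTypeOfferings"])

for offering in offerings:
    print(offering["InstanceType"], "offered in", offering["Location"])
```

The same call with LocationType="availability-zone" narrows the results to individual zones, which is useful when placing latency-sensitive training clusters.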
AWS's innovation comes amid fierce competition in the AI infrastructure space. Microsoft and CoreWeave have previously offered computing clusters based on NVIDIA's GB200 NVL72 architecture. Meanwhile, Intel has positioned its SuperFluid cooling technology as a competitor, claiming it can handle thermal loads up to 1,500 watts 2.
This development reinforces AWS's position as the world's largest cloud infrastructure provider. By creating custom hardware solutions, AWS can potentially reduce its dependence on third-party suppliers and improve its bottom line. In the first quarter, AWS reported its widest operating margin since at least 2014, contributing significantly to Amazon's net income 1.
As the AI industry continues to evolve rapidly, innovations in cooling technology will play a crucial role in enabling the next generation of AI applications and research. AWS's IRHX solution demonstrates the company's commitment to staying at the forefront of this technological revolution.