On Tue, 11 Mar, 12:03 AM UTC
3 Sources
[1]
NVIDIA's new GB300 'Blackwell Ultra' AI servers: fully liquid-cooled AI clusters at GTC 2025
TL;DR: NVIDIA is set to unveil its GB300 AI GPUs at GTC 2025, featuring fully liquid-cooled AI clusters. The GB300's energy consumption and heat dissipation needs have increased significantly over the GB200, prompting a "second cooling revolution" and a sharp rise in demand for water-cooling quick connectors. Taiwan's cooling companies are expected to profit significantly.

NVIDIA is expected to unveil its new GB300 AI GPUs at GTC 2025 later this month, with a major shift to fully liquid-cooled AI clusters on the way. A new report from UDN says the energy consumption of NVIDIA's new GB300 AI GPU has "increased significantly" over the GB200, and that its heat dissipation demand is greater, which has set off a "second cooling revolution". It isn't just that more water-cooling plates will be introduced: the use of water-cooling quick connectors is expected to increase 4x with GB300 compared to GB200.

NVIDIA's new GB300 AI GPUs will be fully water-cooled, with Taiwanese cooling giants Shuanghong and Qihong set to make "huge profits" according to UDN, while water-cooling quick connector suppliers like Fushida and Shishuo will also play a large part in this purported "second cooling revolution".

NVIDIA kicks off its GPU Technology Conference (GTC 2025) between March 17 and 21, where NVIDIA CEO Jensen Huang is expected to unveil the next-gen GB300 AI GPU, which should be the highlight of the show. According to industry insiders, NVIDIA began introducing water cooling with GB200, aiming to gradually replace traditional air cooling, and that triggered the first wave of the "cooling revolution". GB300 packs considerably more computing power and far higher power consumption than GB200, as well as higher voltage requirements.

AI server power supplies are expected to grow later this year, with PSU wattage rising from 3kW to 5.5kW, 8kW, and 10kW products. These are massive power increases, and with great power comes the need for far more cooling. UDN reports that, in response to the GB300's much higher energy consumption, its internal design also includes more water-cooling components and a more complex power-management design, which has driven up the use of water-cooling plates and water-cooling quick connectors.
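The jump in per-GPU power is easier to appreciate with a quick back-of-envelope calculation. The Python sketch below uses the 1.4kW TDP reported for GB300 elsewhere in this article, but the 72-GPU rack count (in the style of today's GB200 NVL72) and the non-GPU overhead factor are assumptions for illustration only, not confirmed specifications.

```python
# Back-of-envelope rack heat-load estimate (illustrative assumptions only).

GPU_TDP_KW = 1.4          # reported GB300 per-GPU TDP
GPUS_PER_RACK = 72        # assumption: NVL72-style rack density
OVERHEAD_FACTOR = 1.3     # assumption: CPUs, NICs, switches, PSU losses

def rack_heat_load_kw(gpu_tdp_kw: float, gpus: int, overhead: float) -> float:
    """Total heat (kW) the cooling loop must remove at full load."""
    return gpu_tdp_kw * gpus * overhead

if __name__ == "__main__":
    load = rack_heat_load_kw(GPU_TDP_KW, GPUS_PER_RACK, OVERHEAD_FACTOR)
    # 1.4 kW x 72 GPUs x 1.3 overhead is roughly 131 kW per rack
    print(f"Estimated rack heat load: {load:.0f} kW")
    # Number of 10 kW PSUs needed just to feed that load (ignoring redundancy)
    print(f"10 kW PSUs required: {load / 10:.0f}")
```

Even under these rough assumptions, a single rack lands well beyond what conventional air cooling handles comfortably, which is the practical argument for fully liquid-cooled clusters.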
[2]
NVIDIA GTC 2025: GB300 AI GPU with 1.4kW power, new details on Rubin AI GPU, CPO tech, and more
TL;DR: NVIDIA's GTC event will unveil the GB300 "Blackwell Ultra" AI GPU, featuring 288GB of HBM3E memory and a 1.4kW TDP, offering a 50% performance boost over the B200. The Rubin R100 AI GPU will introduce dual logic chips, 384GB of HBM4 memory, and a 1.8kW TDP, with production starting in late 2025.

NVIDIA is hosting its GPU Technology Conference (GTC) event next week, where we'll be introduced to its beefed-up GB300 "Blackwell Ultra" AI GPUs, along with new details on its next-gen Rubin R100 AI GPU with next-gen HBM4 memory, new CPO technology, and more. JP Morgan's outlook for NVIDIA GTC 2025 has Blackwell Ultra GB300 as the highlight of the show, with the new GB300 AI chip featuring a logic structure similar to B200 but with huge capacity increases on HBM and a massive increase in power consumption.

Blackwell Ultra: A Performance Beast on the Horizon

NVIDIA's new GB300 Blackwell Ultra AI GPU will have up to 288GB of 12-Hi HBM3E memory, and a thermal design power (TDP) that has been increased to 1.4kW. We can expect a major performance boost from GB300 of up to 50% faster than GB200 in FP4 computing performance, and JP Morgan expects GB300 to start shipping in Q3 2025.

Rubin Platform: The New AI Power Engine for 2026

NVIDIA has B200 and GB200 on the market right now, with B300 and GB300 to be detailed at GTC 2025 next week, but we'll also receive more details on the company's next-generation Rubin AI architecture. NVIDIA's next-gen Rubin AI GPU is expected to feature a dual logic chip structure, where, similar to Blackwell, Rubin will have two chips fabbed on TSMC's new N3 process node. Rubin will also make the huge leap to next-gen HBM4 memory with 8-Hi stack chips and a total capacity of up to 384GB (up from 288GB of HBM3E on B300). Rubin also chows down on even more power than Blackwell, with NVIDIA's next-gen R100 AI GPU expected to have an increased TDP of up to 1.8kW. NVIDIA is expected to upgrade the Vera Arm CPU to TSMC's new N3 process node, potentially using a 2.5D packaging structure. NVIDIA's new Rubin AI GPU architecture is also expected to move up to a 1.6T network using two ConnectX-9 NICs.
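To put the GB300-to-Rubin step in perspective, the small Python sketch below computes the generational deltas implied by the figures quoted above. All numbers are reported or rumored, not confirmed specifications.

```python
# Generational deltas implied by the figures quoted in this article.
# None of these numbers are confirmed specifications.

gb300 = {"hbm_gb": 288, "tdp_kw": 1.4}   # Blackwell Ultra, reported
rubin = {"hbm_gb": 384, "tdp_kw": 1.8}   # Rubin R100, rumored

mem_gain = rubin["hbm_gb"] / gb300["hbm_gb"] - 1   # ~+33%
tdp_gain = rubin["tdp_kw"] / gb300["tdp_kw"] - 1   # ~+29%

print(f"Rubin R100 vs GB300: HBM capacity {mem_gain:+.0%}, TDP {tdp_gain:+.0%}")
print("GB300 vs GB200: FP4 throughput reported up to +50%")
```

In other words, the rumored memory and power growth into Rubin is roughly a third, while the bigger advertised jump is the FP4 uplift within the Blackwell family itself.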
[3]
NVIDIA's "Blackwell Ultra" GB300 AI Servers To Mark The Transition Towards Fully Liquid-Cooled AI Clusters; Set To Be Unveiled At GTC 2025
NVIDIA's GB300 AI servers are set to be unveiled at this year's GTC 2025 conference and will mark the transition towards fully "liquid-cooled" AI clusters.

Team Green will unveil the next revolution in the AI industry at GTC 2025. This time, the firm is expected to feature its highly anticipated "Blackwell Ultra" lineup, a much more refined version of the existing Blackwell architecture. While details about NVIDIA's upcoming AI products, notably the GB300 lineup, are slim, we now know that the firm is set to "beef up" the cooling mechanism on the AI servers by up to four times compared to the existing GB200 clusters, since the performance of Blackwell Ultra will be too "hot to handle."

In a report by Taiwan Economic Daily, it is claimed that the GB300 AI servers are fully liquid-cooled, removing air-cooled elements entirely. In anticipation, the Taiwanese supply chain has seen massive demand for liquid-cooling essentials, given that NVIDIA plans to ramp up production of Blackwell Ultra soon and demand is said to be higher than with the original Blackwell. Heat dissipation figures for the GB300 are claimed to be much higher than those of its predecessor, so optimal cooling is necessary.

Using liquid cooling is expected to raise the GB300 AI servers' price considerably. Given that the current GB200 NVL72 servers cost around $3 million, the top GB300 configuration will inevitably be priced much higher, which will ultimately translate into more revenue for NVIDIA. However, given the yield-rate issues surrounding Blackwell, it will be interesting to see whether customers opt for the GB300 in huge numbers; still, given NVIDIA's dominant presence in the market, strong uptake certainly looks likely.

For a quick rundown on the Blackwell Ultra "B300" series: NVIDIA is rumored to be going big on power figures this time, with the GB300 AI server supposed to feature up to 1400W of TDP, a massive rise. With the architectural upgrades, we are looking at around 1.4 times higher FP4 performance compared to the previous generation, and memory capacity grows from 192 GB to 288 GB by utilizing 12-Hi stacks of HBM3E. This year's GTC won't showcase only a single lineup; NVIDIA also plans to unveil its Vera Rubin lineup and give a rundown of its status, although a launch into the supply chain isn't expected for now.
NVIDIA is set to unveil its GB300 'Blackwell Ultra' AI GPUs at GTC 2025, featuring fully liquid-cooled AI clusters. The new servers promise significant performance improvements and mark a shift in cooling technology for AI infrastructure.
NVIDIA is poised to unveil its next-generation AI GPU, the GB300 'Blackwell Ultra', at the upcoming GPU Technology Conference (GTC) 2025. This new hardware represents a significant leap in AI computing capabilities and marks a transition towards fully liquid-cooled AI clusters [1][2].
The GB300 is expected to offer substantial improvements over its predecessor:
- Up to 288GB of 12-Hi HBM3E memory, up from 192GB on the previous generation [2][3]
- A thermal design power (TDP) raised to around 1.4kW [2][3]
- Up to 50% higher FP4 computing performance than the GB200 [2]
- Shipments expected to begin in Q3 2025, according to JP Morgan [2]
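The 288GB figure lines up with straightforward stack arithmetic: a 12-Hi HBM3E stack built from 24Gb dies holds 36GB, and eight such stacks give 288GB, versus 192GB for eight 8-Hi stacks. A minimal Python check, assuming the commonly reported eight-stack layout and 24Gb die density:

```python
# HBM capacity sanity check (assumes 8 stacks per package and 24 Gb dies,
# matching the commonly reported Blackwell memory configuration).

DIE_GBIT = 24           # capacity of one HBM3E DRAM die, in gigabits
STACKS_PER_PACKAGE = 8  # assumption: same stack count as B200

def package_capacity_gb(stack_height: int) -> int:
    """Total HBM capacity in GB for a given stack height (e.g. 8-Hi, 12-Hi)."""
    stack_gb = stack_height * DIE_GBIT // 8   # gigabits -> gigabytes per stack
    return stack_gb * STACKS_PER_PACKAGE

print(package_capacity_gb(8))    # 192 GB -> B200-class configuration
print(package_capacity_gb(12))   # 288 GB -> GB300 "Blackwell Ultra"
```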
These enhancements suggest that NVIDIA is pushing the boundaries of AI computing power, catering to the growing demands of complex AI workloads.
The increased power and heat dissipation of the GB300 have necessitated a shift in cooling technology:
- GB300 AI servers are reported to be fully liquid-cooled, removing air-cooled elements [1][3]
- The use of water-cooling quick connectors is expected to increase roughly 4x compared to the GB200 [1]
- The internal design incorporates more water-cooling plates and a more complex power-management layout [1]
This move towards liquid cooling is being referred to as the "second cooling revolution" in the industry [1].
The shift to liquid cooling is expected to have significant implications for the hardware supply chain:
- Taiwanese cooling specialists Shuanghong and Qihong are expected to see "huge profits" [1]
- Quick-connector suppliers such as Fushida and Shishuo are also expected to benefit [1]
- AI server power supplies are expected to climb from 3kW to 5.5kW, 8kW, and 10kW models [1]
- Taiwan's supply chain is already seeing heavy demand for liquid-cooling components [3]
While the GB300 is the immediate focus, NVIDIA is also preparing for the future with its next-generation Rubin AI architecture:
- A dual logic chip design fabbed on TSMC's N3 process node [2]
- Up to 384GB of next-gen HBM4 memory in 8-Hi stacks [2]
- A TDP expected to rise to around 1.8kW [2]
- An upgraded Vera Arm CPU, also on TSMC N3, potentially with 2.5D packaging [2]
- A move to 1.6T networking using two ConnectX-9 NICs [2]
- Production expected to start in late 2025, positioning Rubin as the AI platform for 2026 [2]
The introduction of the GB300 and the shift to liquid cooling are expected to have significant market impacts:
- Liquid cooling is expected to push GB300 server pricing well above the roughly $3 million of current GB200 NVL72 systems, translating into more revenue for NVIDIA [3]
- Taiwanese cooling and connector suppliers are positioned to benefit from the surge in demand [1][3]
- Yield-rate issues around Blackwell leave some uncertainty over adoption, although NVIDIA's dominant market position suggests strong uptake [3]
As the AI industry continues to evolve rapidly, NVIDIA's GB300 'Blackwell Ultra' represents a significant step forward in computing power and efficiency. The move towards fully liquid-cooled AI clusters signals a new era in AI infrastructure, with far-reaching implications for the technology sector and its supply chain.
Nvidia announces the Blackwell Ultra B300 GPU, offering 1.5x faster performance than its predecessor with 288GB HBM3e memory and 15 PFLOPS of dense FP4 compute, designed to meet the demands of advanced AI reasoning and inference.
9 Sources
NVIDIA CEO Jensen Huang confirms the company's plans for Blackwell Ultra and Vera Rubin AI architectures, promising significant advancements in GPU technology for AI and data centers.
7 Sources
NVIDIA announces its upcoming Rubin and Rubin Ultra GPU platforms, along with Vera CPUs, set to revolutionize AI computing in 2026-2027 with unprecedented performance and memory capabilities.
4 Sources
NVIDIA showcases its next-generation Blackwell AI GPUs, featuring upgraded NVLink technology and introducing FP4 precision. The company also reveals its roadmap for future AI and data center innovations.
4 Sources
NVIDIA has renamed its upcoming Blackwell Ultra products to the B300 series, incorporating HBM3E 12-Hi memory and TSMC's CoWoS-L packaging. The new lineup aims to meet diverse performance demands in the AI chip market.
2 Sources