Google Deploys NVIDIA's Blackwell GB200 NVL Racks for AI Cloud Platform

Curated by THEOUTPOST

On Thu, 17 Oct, 1:01 PM UTC

2 Sources

Google has begun deploying NVIDIA's cutting-edge Blackwell GB200 NVL racks to power its AI cloud platform, built on liquid-cooled high-performance computing hardware.

Google Embraces NVIDIA's Blackwell Technology for AI Cloud Computing

Google, the third-largest cloud service provider with a 12% global market share, has announced the deployment of NVIDIA's state-of-the-art Blackwell GB200 NVL racks to enhance its AI cloud platform 1. This move signifies a major step in Google's strategy to expand its cloud operations and offer faster, more efficient AI services.

Technical Specifications and Performance

The GB200 NVL racks are built around NVIDIA's latest Grace Blackwell superchips, each pairing one Grace CPU with two B200 Blackwell-based data center GPUs and delivering roughly 90 TFLOPS of FP64 performance 1. While Google has said it is using 'custom' GB200 NVL racks, the exact configuration remains undisclosed.

For comparison, NVIDIA's standard GB200 NVL72 configuration includes:

  • 36 Grace CPUs paired with 72 Blackwell GPUs
  • Up to 130 TB/s of aggregate NVLink bandwidth
  • 3,240 TFLOPS of FP64 performance
  • 13.5 TB of HBM3e memory 2
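As a quick back-of-the-envelope check (a sketch using only the figures quoted above; the variable names are illustrative), the rack-level NVL72 numbers follow directly from the per-superchip ones:

```python
# Sanity-check the GB200 NVL72 rack figures against the per-superchip
# numbers cited above (1 Grace CPU + 2 B200 GPUs per superchip,
# ~90 TFLOPS FP64 per superchip).
SUPERCHIPS_PER_RACK = 36
GPUS_PER_SUPERCHIP = 2
FP64_TFLOPS_PER_SUPERCHIP = 90
HBM3E_TB_PER_RACK = 13.5

total_gpus = SUPERCHIPS_PER_RACK * GPUS_PER_SUPERCHIP
total_fp64_tflops = SUPERCHIPS_PER_RACK * FP64_TFLOPS_PER_SUPERCHIP
hbm_gb_per_gpu = HBM3E_TB_PER_RACK * 1000 / total_gpus

print(total_gpus)          # 72 Blackwell GPUs per rack
print(total_fp64_tflops)   # 3240 TFLOPS FP64, matching the rack spec
print(hbm_gb_per_gpu)      # 187.5 GB of HBM3e per GPU
```

The 72-GPU and 3,240-TFLOPS results match the rack-level figures in the list above, which is why the 36:72 CPU-to-GPU ratio implies two B200 GPUs per Grace CPU.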

Liquid Cooling Technology

A notable feature of Google's new AI infrastructure is the implementation of liquid cooling for the GB200 high-performance chips. The company shared images on social media showcasing the sleek, liquid-cooled server racks 2. This cooling method is crucial for managing the heat generated by these powerful AI processors, enabling optimal performance and energy efficiency.

Industry Trends and Competition

Google is not alone in adopting NVIDIA's Blackwell technology. Other tech giants and manufacturers have also begun integrating GB200 chips into their systems:

  1. Microsoft has similarly deployed GB200-based servers with liquid cooling 1.
  2. Foxconn, a Taiwanese manufacturer, is using GB200 NVL72 racks to construct Taiwan's fastest supercomputer 2.

Implications for AI and Cloud Computing

The adoption of NVIDIA's Blackwell technology by major cloud providers and tech companies signals a significant advancement in AI and cloud computing capabilities. These powerful systems are expected to accelerate various applications, including:

  • Large Language Models (LLMs)
  • Medical research
  • Cloud storage
  • AI workloads

As NVIDIA's Blackwell chips enter full production, we can anticipate more systems leveraging these industry-leading AI processors, potentially revolutionizing the landscape of AI and high-performance computing 1.

Continue Reading

NVIDIA Unveils GB200 NVL4: A Powerhouse AI Accelerator with Quad Blackwell GPUs and Dual Grace CPUs

NVIDIA introduces the GB200 NVL4, a high-performance AI accelerator featuring four Blackwell GPUs and two Grace CPUs on a single board, offering significant improvements in AI and HPC workloads.

Microsoft Azure Leads Cloud Innovation with NVIDIA's Blackwell GB200 AI Servers

Microsoft Azure becomes the first cloud platform to integrate NVIDIA's cutting-edge Blackwell GB200 AI servers, showcasing a significant leap in cloud computing and AI capabilities.

Supermicro Unveils Liquid-Cooled AI Supercluster with NVIDIA GB200 NVL72 Platform

Supermicro introduces a new liquid-cooled AI supercomputer powered by NVIDIA's GB200 NVL72 platform, offering exascale computing capabilities in a single rack for enhanced energy efficiency in AI data centers.

HPE Ships First NVIDIA Grace Blackwell GB200 NVL72 System with Advanced Liquid Cooling for AI

Hewlett Packard Enterprise announces the shipment of its first NVIDIA Blackwell family-based solution, the GB200 NVL72, designed for large-scale AI deployments with advanced liquid cooling technology.

Nvidia Unveils Blackwell Ultra B300: A Leap Forward in AI Computing

Nvidia announces the Blackwell Ultra B300 GPU, offering 1.5x faster performance than its predecessor with 288GB HBM3e memory and 15 PFLOPS of dense FP4 compute, designed to meet the demands of advanced AI reasoning and inference.
