Microsoft Unveils Custom Chips and Infrastructure Upgrades to Boost AI Performance and Efficiency


Microsoft announces two new custom-designed chips for data centers, along with advanced cooling and power delivery technologies, to enhance AI capabilities, security, and energy efficiency in its Azure cloud infrastructure.


Microsoft Introduces Custom-Designed Chips for Data Centers

Microsoft has unveiled two new custom-designed chips aimed at enhancing its data center infrastructure for AI applications. Announced at the Ignite conference, these chips are part of Microsoft's strategy to optimize every layer of its infrastructure stack, from silicon to software, to support advanced AI workloads [1][2].

Azure Integrated HSM: Boosting Data Center Security

The Azure Integrated HSM (Hardware Security Module) is designed to strengthen data center security. Beginning next year, this chip will be installed in every new server destined for Microsoft's data centers. Its primary function is to keep encryption keys and other security-critical data inside the security module itself, meeting the FIPS 140-3 Level 3 security standard [1][2].

Azure Boost DPU: Enhancing Data Processing Efficiency

The Azure Boost DPU (Data Processing Unit) aims to optimize data centers for highly multiplexed data streams. The chip consolidates multiple components of a traditional server into a single piece of silicon, with a focus on cloud storage workloads. Microsoft claims it can run specific tasks at roughly one-third the power and four times the performance of current hardware [1][2].
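Taken at face value, the two claims compound. A back-of-the-envelope sketch of the implied performance-per-watt gain (illustrative arithmetic only; the workload and baseline hardware are Microsoft's, not specified in the article):

```python
# Azure Boost DPU claims as reported: ~4x the performance at ~1/3 the
# power of current server hardware, for specific cloud storage tasks.

performance_ratio = 4.0    # claimed throughput vs. current hardware
power_ratio = 1.0 / 3.0    # claimed power draw vs. current hardware

# Performance per watt improves by the ratio of the two factors.
perf_per_watt_gain = performance_ratio / power_ratio

print(f"Implied perf/watt improvement: {perf_per_watt_gain:.0f}x")  # 12x
```

In other words, if both figures hold for the same workload, the DPU would deliver about twelve times the work per watt of the hardware it replaces.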

Advancements in Cooling and Power Management

In addition to the new chips, Microsoft has introduced advancements in data center cooling and power optimization:

  1. An advanced version of its heat exchanger unit, a liquid-cooling "sidekick rack" designed to manage heat emissions from large-scale AI systems [2][3].
  2. A new disaggregated power rack, developed in collaboration with Meta, featuring 400-volt DC power delivery. It enables up to 35% more AI accelerators per server rack and allows dynamic power adjustments [2][3].

Expanding AI Infrastructure Offerings

Microsoft is also expanding its AI infrastructure offerings with new virtual machines and partnerships:

  1. The upcoming Azure ND GB200 v6 VM series, which will feature Nvidia's latest Blackwell GPUs [3].
  2. The Azure HBv5 VM series, powered by AMD's EPYC 9V64H processors and designed for high-performance computing workloads [3].

Industry Collaboration and Open-Source Initiatives

Microsoft is open-sourcing the cooling and power rack specifications through the Open Compute Project, allowing the broader industry to benefit from these advancements [2][3].

Impact on AI and Cloud Computing

These developments come at a crucial time: Goldman Sachs estimates that advanced AI workloads could drive a 160% increase in data center power demand by 2030 [2]. Microsoft's investments in custom silicon and infrastructure optimizations aim to address the growing demands of AI applications while improving energy efficiency and performance in cloud computing.
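To put the Goldman Sachs figure in perspective, a 160% increase means demand reaching 2.6 times today's level. A quick sketch (the baseline figure below is hypothetical, purely for illustration, and not from the article):

```python
# An N% increase means the new level is (1 + N/100) times the old one.
increase_pct = 160.0
growth_factor = 1.0 + increase_pct / 100.0    # 2.6x today's demand

# Hypothetical baseline chosen only to make the arithmetic concrete:
baseline_twh = 100.0                          # today's demand, TWh/year
projected_twh = baseline_twh * growth_factor

print(f"Growth factor: {growth_factor:.1f}x")         # 2.6x
print(f"Projected demand: {projected_twh:.0f} TWh")   # 260 TWh
```

That is, "a 160% increase" is not 1.6 times current demand but 2.6 times it, which underlines why efficiency gains like those claimed for the new chips and power racks matter.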

TheOutpost.ai


© 2025 Triveous Technologies Private Limited