Curated by THEOUTPOST
On Wed, 18 Dec, 4:02 PM UTC
7 Sources
[1]
Microsoft surges ahead in AI race with massive Nvidia chip acquisition
The big picture: Microsoft has purchased a staggering number of Nvidia's Hopper chips this year, far outpacing what its rivals have been able to procure. With Microsoft's significant lead and its deep ties to OpenAI, the company is well-positioned as tech giants vie for AI supremacy.

Microsoft has emerged as the dominant player in the acquisition of Nvidia's Hopper chips. According to estimates from technology consultancy Omdia that were reported in the Financial Times, Microsoft has purchased 485,000 of these flagship processors this year, more than doubling the orders of its nearest rivals in the United States and China. This aggressive procurement strategy has positioned Microsoft at the forefront of AI infrastructure development, outpacing tech giants like Meta, Amazon, and Google. The move comes as the demand for Nvidia's advanced GPUs continues to outstrip supply, a trend that has persisted for nearly two years.

Microsoft's substantial investment in Nvidia's chips is closely tied to its $13 billion stake in OpenAI. This partnership has driven Microsoft to rapidly expand its data center capabilities, not only to power its own AI services like Copilot but also to offer robust cloud computing solutions through its Azure platform. Indeed, the tech industry's voracious appetite for AI capabilities has led to unprecedented investments in data center infrastructure.

Alistair Speirs, Microsoft's senior director of Azure Global Infrastructure, described the complexity of these projects to the FT, stating, "Good data center infrastructure, they're very complex, capital intensive projects. They take multi-years of planning. And so forecasting where our growth will be with a little bit of buffer is important."

While Microsoft leads the pack in the US, Chinese tech giants ByteDance and Tencent have also made significant strides, each ordering approximately 230,000 of Nvidia's chips.
These orders include the H20 model, a modified version of the Hopper chip designed to comply with US export controls for Chinese customers.

The race for AI dominance extends beyond hardware acquisition. Tech companies are increasingly developing their own custom AI chips to reduce dependence on Nvidia. Google has been refining its Tensor Processing Units (TPUs) for a decade, while Meta recently introduced its Meta Training and Inference Accelerator chip. Amazon is also making strides with its Trainium and Inferentia chips, designed for cloud computing customers.

Vlad Galabov, director of cloud and data center research at Omdia, noted the extraordinary impact of AI chip demand on server expenditure, telling the FT that "Nvidia GPUs claimed a tremendously high share of the server capex. We're close to the peak."

As the AI landscape continues to evolve, Microsoft's strategic chip acquisitions and partnerships position it as a formidable force in the industry. However, the company recognizes that more than hardware is needed. "To build the AI infrastructure, in our experience, is not just about having the best chip, it's also about having the right storage components, the right infrastructure, the right software layer, the right host management layer, error correction and all these other components to build that system," Speirs said.
[2]
Microsoft acquires twice as many Nvidia AI chips as tech rivals
Microsoft bought twice as many of Nvidia's flagship chips as any of its largest rivals in the US and China this year, as OpenAI's biggest investor accelerated its investment in artificial intelligence infrastructure.

Analysts at Omdia, a technology consultancy, estimate that Microsoft bought 485,000 of Nvidia's "Hopper" chips this year. That put Microsoft far ahead of Nvidia's next biggest US customer Meta, which bought 224,000 Hopper chips, as well as its cloud computing rivals Amazon and Google.

With demand outstripping supply of Nvidia's most advanced graphics processing units for much of the past two years, Microsoft's chip hoard has given it an edge in the race to build the next generation of AI systems. This year, Big Tech companies have spent tens of billions of dollars on data centres running Nvidia's latest chips, which have become the hottest commodity in Silicon Valley since the debut of ChatGPT two years ago kick-started an unprecedented surge of investment in AI.

Microsoft's Azure cloud infrastructure was used to train OpenAI's latest o1 model, as the two race against a resurgent Google, start-ups such as Anthropic and Elon Musk's xAI, and rivals in China for dominance of the next generation of computing.

Omdia estimates ByteDance and Tencent each ordered about 230,000 of Nvidia's chips this year, including the H20 model, a less powerful version of Hopper that was modified to meet US export controls for Chinese customers. Amazon and Google, which along with Meta are stepping up deployment of their own custom AI chips as an alternative to Nvidia's, bought 196,000 and 169,000 Hopper chips respectively, the analysts said. Omdia analyses companies' publicly disclosed capital spending, server shipments and supply chain intelligence to calculate its estimates.

The value of Nvidia, which is now starting to roll out Hopper's successor Blackwell, has soared to more than $3tn this year as Big Tech companies rush to assemble increasingly large clusters of its GPUs.
However, the stock's extraordinary surge has waned in recent months amid concerns about slower growth, competition from Big Tech companies' own custom AI chips and potential disruption to its business in China from Donald Trump's incoming administration in the US.

ByteDance and Tencent have emerged as two of Nvidia's biggest customers this year, despite US government restrictions on the capabilities of American AI chips that can be sold in China.

Microsoft, which has invested $13bn in OpenAI, has been the most aggressive of the US Big Tech companies in building out data centre infrastructure, both to run its own AI services such as its Copilot assistant and to rent out to customers through its Azure division. Microsoft's Nvidia chip orders are more than triple the number of the same generation of Nvidia's AI processors that it purchased in 2023, when Nvidia was racing to scale up production of Hopper following ChatGPT's breakout success.

"Good data centre infrastructure, they're very complex, capital intensive projects," Alistair Speirs, Microsoft's senior director of Azure Global Infrastructure, told the Financial Times. "They take multi-years of planning. And so forecasting where our growth will be with a little bit of buffer is important."

Tech companies around the world will spend an estimated $229bn on servers in 2024, according to Omdia, led by Microsoft's $31bn in capital expenditure and Amazon's $26bn. The top 10 buyers of data centre infrastructure -- which now include relative newcomers xAI and CoreWeave -- make up 60 per cent of global investment in computing power.

Vlad Galabov, director of cloud and data centre research at Omdia, said some 43 per cent of spending on servers went to Nvidia in 2024. "Nvidia GPUs claimed a tremendously high share of the server capex," he said. "We're close to the peak."

While Nvidia still dominates the AI chip market, its Silicon Valley rival AMD has been making inroads.
Meta bought 173,000 of AMD's MI300 chips this year, while Microsoft bought 96,000, according to Omdia.

Big Tech companies have also stepped up usage of their own AI chips this year, as they try to reduce their reliance on Nvidia. Google, which has for a decade been developing its "tensor processing units", or TPUs, and Meta, which debuted the first generation of its Meta Training and Inference Accelerator chip last year, each deployed about 1.5mn of their own chips. Amazon, which is investing heavily in its Trainium and Inferentia chips for cloud computing customers, deployed about 1.3mn of those chips this year.

Amazon said this month that it plans to build a new cluster using hundreds of thousands of its latest Trainium chips for Anthropic, an OpenAI rival in which Amazon has invested $8bn, to train the next generation of its AI models.

Microsoft, however, is far earlier in its effort to build an AI accelerator to rival Nvidia's, with only about 200,000 of its Maia chips installed this year. Speirs said that using Nvidia's chips still required Microsoft to make significant investments in its own technology to offer a "unique" service to customers.

"To build the AI infrastructure, in our experience, is not just about having the best chip, it's also about having the right storage components, the right infrastructure, the right software layer, the right host management layer, error correction and all these other components to build that system," he said.
[3]
Why Microsoft is outspending big tech on Nvidia AI chips
Microsoft Corp. has acquired approximately 485,000 of Nvidia's "Hopper" AI chips this year, leading the market by a significant margin, according to the Financial Times. The company aims to enhance its artificial intelligence capabilities, particularly within its Azure cloud services. This strategic investment positions Microsoft ahead of competitors like Meta Platforms, which purchased 224,000 chips, and Amazon and Google, which acquired 196,000 and 169,000 chips, respectively.

Analysts at Omdia reveal that Microsoft's chip orders exceed those of its closest competitors, indicating its aggressive push in AI infrastructure development. Microsoft is looking to cultivate its AI services, leveraging technologies from OpenAI, in which it has invested $13 billion.

This year, tech companies collectively spent tens of billions of dollars on data centers equipped with Nvidia chips, with forecasts suggesting an estimated $229 billion in spending on servers in 2024. Microsoft alone is expected to contribute $31 billion to this total.

The dominance of Nvidia in the AI chip market is evident, with the company boasting $35.1 billion in revenue for the third quarter, driven largely by data center sales, which accounted for $30.8 billion. Despite Nvidia's tight grip on the industry, competitors like AMD are making strides. Omdia reports that Microsoft purchased 96,000 AMD MI300 chips, while Meta acquired 173,000 of the same chips.

The demand for advanced graphics processing units has outpaced supply, solidifying Nvidia's position as a key player in AI advancements. This dynamic means that Microsoft's strategic chip acquisitions will likely serve to strengthen its AI framework. However, while these chips offer significant power, challenges persist. Nvidia has faced reports of overheating issues with its upcoming Blackwell AI chips, which could potentially impact companies deploying them, including Microsoft and Meta.
Despite these challenges, Microsoft continues to heavily invest in building out its data center infrastructure. Alistair Speirs, Microsoft's senior director of Azure Global Infrastructure, noted that constructing effective data center infrastructure is capital intensive and requires multi-year planning. This forward-thinking approach is crucial, especially as the competition intensifies among tech giants like Google, Amazon, and emerging startups such as Anthropic and xAI.

AI chip development is under continuous scrutiny, particularly as tech companies strive to reduce reliance on Nvidia. Google is investing significantly in its tensor processing units (TPUs), while Meta has recently rolled out its Meta Training and Inference Accelerator chips. Additionally, Amazon is developing its Trainium and Inferentia chips. Amazon has announced plans to create a new data processing cluster featuring hundreds of thousands of its latest Trainium chips for Anthropic, showcasing a commitment to AI infrastructure.

Microsoft's reliance on Nvidia's chips does not preclude it from developing its own AI accelerators, with currently around 200,000 Maia chips installed. Speirs emphasized the necessity of integrating Nvidia's technology with Microsoft's own advancements to deliver unique services to customers. Building comprehensive AI infrastructure encompasses not only robust processing power but also integrated storage components and software layers, highlighting complexities in the system's architecture.
[4]
Microsoft Dominates AI Race With Massive Nvidia Chip Purchase | Investing.com UK
In 2024, Microsoft (NASDAQ:MSFT) purchased 485,000 of Nvidia's (NASDAQ:NVDA) "Hopper" chips, marking a substantial investment in AI infrastructure larger than any of its tech rivals. This aligns with Microsoft's role as the largest investor in OpenAI, utilizing its Azure cloud infrastructure to train OpenAI's latest models. The tech giant is also channeling considerable resources into data centers to support AI-driven services like Copilot and Azure. This strategic spending positions Microsoft at the forefront of AI development, setting it apart from competitors and making it one of Nvidia's most important customers.

When acquiring Nvidia chips, Microsoft is outpacing its rivals significantly. The company has procured twice as many chips as its largest U.S. and Chinese competitors. In contrast, Meta (NASDAQ:META) purchased 224,000 chips, Amazon (NASDAQ:AMZN) acquired 196,000, and Google (NASDAQ:GOOGL) secured 169,000. ByteDance and Tencent (HK:0700) each bought around 230,000 chips. Microsoft's orders are more than triple the amount it purchased in 2023, highlighting its aggressive expansion strategy in AI technology.

This substantial investment enhances Microsoft's capabilities and provides Nvidia with a critical revenue stream, helping to elevate its market valuation. Nvidia has seen its valuation soar to over $3 trillion, mainly driven by high demand from major tech companies like Microsoft. The substantial orders from Microsoft significantly contribute to Nvidia's revenue, affirming its leadership in the GPU market.

However, Nvidia faces potential challenges as competition intensifies from companies developing custom AI chips. Despite these pressures, Nvidia remains proactive, introducing the successor to the Hopper chip, Blackwell, to sustain its market dominance. The stock of Nvidia (NVDA) has experienced notable price fluctuations, reflecting investor sentiment and market dynamics.
As of 10:10 EST, the stock was trading at $134.63, slightly above its previous close of $130.39. The day's trading range has seen a low of $132.81 and a high of $135.78. Over the past year, Nvidia's stock has varied significantly, with a 52-week low of $47.32 and a high of $152.89. These movements indicate a volatile yet promising market environment influenced by Nvidia's strategic initiatives and external market factors.

Market analysts maintain a strong buy recommendation for Nvidia, with a recommendation mean of 1.3125. The target price range for Nvidia stock spans from a low of $130.00 to a high of $220.00, with a mean target price of $172.3763. Recent historical closing prices show a slight downward trend, with the stock closing at $130.39 on December 17, 2024, down from $139.31 on December 11. Despite these fluctuations, Nvidia's robust financial metrics suggest a positive long-term outlook, including a market cap of over $3 trillion and a strong revenue base.
[5]
Microsoft bought nearly 500,000 Nvidia Hopper chips this year | TechCrunch
Microsoft bought more than twice as many Nvidia Hopper chips this year as any of its biggest rivals. The tech giant bought 485,000 Nvidia Hopper chips across 2024, according to reporting from the Financial Times, which cited data from tech consultancy Omdia. To compare, Meta bought 224,000 of the same, flagship Nvidia chip this year. Microsoft's 2024 chip purchase more than tripled the number of Nvidia chips the company bought in 2023.

Microsoft is building its own custom AI chips, too, called Maia, which it announced at its Ignite conference in late 2023.

The company has really immersed itself in AI this year. Microsoft deepened its partnership with OpenAI and participated in the company's colossal $6.6 billion funding round in October. The company also signed a deal to reopen the infamous Three Mile Island nuclear plant in September and purchase all of the power that the plant creates for 20 years to help juice its data centers.
[6]
Microsoft Scoops Up 485,000 Nvidia AI Chips, Twice As Many As Its Closest Rival Meta: Report - Microsoft (NASDAQ:MSFT), Meta Platforms (NASDAQ:META)
Microsoft Corp. MSFT has reportedly acquired nearly twice as many Nvidia Corp. NVDA AI chips as its closest competitors. This strategic investment is aimed at bolstering its AI capabilities, particularly for its Azure cloud services.

What Happened: Analysts at technology consultancy Omdia estimate that Microsoft has purchased approximately 485,000 of Nvidia's "Hopper" AI chips this year. This acquisition is significant as it doubles the number acquired by its nearest rival, Meta Platforms Inc. META, which bought 224,000 chips, the Financial Times reported on Wednesday. The demand for Nvidia's advanced GPUs has surged since the introduction of ChatGPT, prompting tech giants to invest heavily in AI infrastructure. Analysts from Omdia estimate that ByteDance and Tencent each ordered about 230,000 Nvidia chips, including the H20 model, tailored for Chinese markets.

Microsoft's aggressive expansion in AI infrastructure is partly driven by its $13 billion investment in OpenAI. The company aims to leverage its AI capabilities for both internal services and customer offerings through Azure. Despite competition, Microsoft continues to rely on Nvidia's chips while developing its own AI accelerators.

Why It Matters: The acquisition of Nvidia chips by Microsoft underscores the tech giant's strategic focus on enhancing its AI infrastructure. Nvidia's dominance in the AI semiconductor market is well-documented, with CNBC's Jim Cramer emphasizing that no company can rival Nvidia's technological superiority. Nvidia posted $35.1 billion in revenue in the third quarter, with data center revenue alone reaching $30.8 billion, underscoring its pivotal role in AI infrastructure. However, Nvidia's chips have faced challenges, such as overheating issues in its Blackwell AI chips, impacting companies like Meta and Microsoft.
Despite these challenges, Nvidia's chips remain integral to AI advancements, and Microsoft's substantial investment reflects its commitment to staying at the forefront of AI technology.
[7]
Microsoft is Nvidia's biggest AI chip buyer of the year, and it's not even close. With ByteDance and Tencent coming out ahead of Zuck, Bezos, and Musk's outfits, too
Yeah, I get it, we already know that Nvidia's sold a motherlode of chips and is laughing its way to the bank. But it hits a little differently to see the actual numbers.

As reported by the Financial Times, the tech consultancy Omdia estimates that Nvidia's biggest purchaser of Hopper chips in 2024 was Microsoft, who bought 485,000 of them, this being over twice as many as any other company. Meta, for example, bought 224,000 of them, Amazon bought 196,000, and Google bought 'just' 169,000.

Surprisingly, though, two of Nvidia's biggest customers were Chinese ones, these being ByteDance (of TikTok fame) and Tencent (of numerous videogames fame). According to Omdia, these each ordered about 230,000 Hopper chips. These won't have been the most powerful ones Nvidia has at its disposal, though, given US-China export restrictions. In other words, despite export controls, Chinese companies received more Nvidia chips than companies such as Meta, Amazon, and Google this year.

Apart from all the China biz, there are two other things that strike me about these numbers. First, and very simply: Holy moly does Nvidia churn out a lot of AI chips. Second: Holy moly does Microsoft buy a lot of them.

Nvidia's part kind of goes without saying. The company is firmly cemented as the king of the ever-expanding AI infrastructure castle. So much so, in fact, that the company's CEO, Jen-Hsun Huang, has the chutzpah to claim that "just like we generate electricity, we're now going to be generating AI" in "AI factories".

As far as Microsoft's concerned, while I'm a little surprised to find the company more than doubling the purchases of any other, it also makes sense, especially given the partnership with OpenAI. The AI industry can seem a little confusing when you start to look into how all these different companies relate to each other.
But we shouldn't forget that while OpenAI technically isn't the biggest company in the AI space, most of the companies that are bigger actually rely on OpenAI's software and models and have partnerships with the company. So much is true for now, at least -- although newer entrants such as Anthropic and Musk's xAI could make inroads. And if we're talking OpenAI partnerships, Microsoft's is the one.

To date, Microsoft has apparently invested $13 billion into OpenAI and is the exclusive provider of the company's cloud computing services. This partnership grants Microsoft all kinds of benefits, such as OpenAI model integration with Bing, Microsoft 365, Copilot, and so on, not to mention the ability to rent OpenAI-clad Azure servers out to customers for private or bespoke AI research or services. Oh, and there's the simple matter of monetary ROI. But it's surely crude to speak of such things (profit motives in such a civil society? I think not).

People want AI and OpenAI is the biggest software-level solution, so Microsoft heavily invests in and partners with OpenAI, and Nvidia sells Microsoft a sweet, sweet stack of silicon to get the job done. Simple.

But who's the real winner? The end-user, of course! Just kidding, the real winner's Nvidia, of course and as always. As Baron Harkonnen of Frank Herbert's Dune tells us: "He who controls the spice controls the universe." And Hopper's the spiciest spice in town, right now. Well, it'll be Blackwell, soon, but that's Nvidia, too. Huang probably made the right choice going into tech and ditching a promising table-tennis career.
Microsoft has purchased 485,000 of Nvidia's Hopper chips in 2024, more than doubling its nearest competitors' orders. This aggressive move positions Microsoft at the forefront of AI infrastructure development, outpacing tech giants like Meta, Amazon, and Google.
Microsoft has emerged as the dominant player in the acquisition of Nvidia's Hopper chips, purchasing a staggering 485,000 units in 2024 [1][2]. This move has positioned the tech giant at the forefront of AI infrastructure development, outpacing rivals such as Meta, Amazon, and Google by more than double their orders [1][2][3].

Microsoft's aggressive procurement strategy is closely tied to its $13 billion stake in OpenAI [1][4]. The company is rapidly expanding its data center capabilities to power its own AI services, including Copilot, and to offer robust cloud computing solutions through its Azure platform [1][5].

Alistair Speirs, Microsoft's senior director of Azure Global Infrastructure, emphasized the complexity of these projects, stating, "Good data center infrastructure, they're very complex, capital intensive projects. They take multi-years of planning." [2]

While Microsoft leads in the US, Chinese tech giants ByteDance and Tencent have also made significant strides, each ordering approximately 230,000 of Nvidia's chips [1][2]. These orders include the H20 model, a modified version designed to comply with US export controls for Chinese customers [1].

Meta, Amazon, and Google have purchased 224,000, 196,000, and 169,000 Hopper chips respectively [2][3]. The tech industry's voracious appetite for AI capabilities has led to unprecedented investments in data center infrastructure, with global server spending estimated to reach $229 billion in 2024 [2].
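As a quick sanity check on the "more than double" claim, the order figures reported above can be compared directly (a minimal sketch; the numbers are Omdia's estimates as quoted in the source articles):

```python
# Estimated 2024 Hopper-generation chip orders, per Omdia (as reported above).
hopper_orders = {
    "Microsoft": 485_000,
    "ByteDance": 230_000,
    "Tencent": 230_000,
    "Meta": 224_000,
    "Amazon": 196_000,
    "Google": 169_000,
}

# Identify the largest buyer and compare it against the next-largest order.
leader = max(hopper_orders, key=hopper_orders.get)
runner_up = sorted(hopper_orders.values(), reverse=True)[1]
ratio = hopper_orders[leader] / runner_up

print(f"{leader} leads at {ratio:.2f}x the next-largest order")
```

At roughly 2.1x the next-largest order (ByteDance and Tencent's ~230,000 each), the figures bear out the reporting that Microsoft's purchases more than doubled those of its nearest rivals.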
Nvidia's dominance in the AI chip market has led to a surge in its valuation, reaching over $3 trillion [2][4]. However, the company faces potential challenges as tech giants develop their own custom AI chips to reduce dependence on Nvidia [1][5].

Google has been refining its Tensor Processing Units (TPUs) for a decade, while Meta recently introduced its Meta Training and Inference Accelerator chip [1]. Amazon is also making strides with its Trainium and Inferentia chips, designed for cloud computing customers [1][2].

Microsoft recognizes that building effective AI infrastructure requires more than just powerful chips. Speirs noted, "To build the AI infrastructure, in our experience, is not just about having the best chip, it's also about having the right storage components, the right infrastructure, the right software layer, the right host management layer, error correction and all these other components to build that system." [1][2]

As the AI landscape continues to evolve, Microsoft's strategic chip acquisitions and partnerships position it as a formidable force in the industry [1]. The company's investment in its own Maia chips, with about 200,000 installed this year, indicates its commitment to developing custom AI accelerators [2][5].
This aggressive push into AI infrastructure development suggests that Microsoft is preparing for a future where AI capabilities will be central to its products and services, potentially reshaping the competitive landscape in the tech industry.
© 2025 TheOutpost.AI All rights reserved