2 Sources
[1]
Microsoft is building its own AI model
Microsoft $MSFT wants out of the "powered by someone else" business. The company's AI chief, Mustafa Suleyman, told the Financial Times that the company is pushing toward AI "self-sufficiency." That means developing its own advanced foundation models and continuing to reduce its reliance on OpenAI, even as the two companies keep their relationship intact. Microsoft's October 2025 reset with OpenAI preserved the core perks: Microsoft says OpenAI remains its "frontier model partner," and Microsoft's IP rights and Azure API exclusivity run "through 2032," including models "post-AGI." So this is Microsoft buying itself even more room to negotiate, route, and replace. Because when your flagship AI product sits inside Microsoft 365, "single supplier" starts sounding like a vulnerability you have to try to explain on earnings calls. Microsoft can keep selling "Copilot everywhere," but the real prize is making sure the underlying compute, security, and billing stay Microsoft-shaped, no matter which model is hot this quarter.

Microsoft is also trying to prove it's not just talking. In August 2025, Microsoft AI previewed MAI-1-preview, calling it "an in-house mixture-of-experts model" that was "pre-trained and post-trained on ~15,000 NVIDIA H100 GPUs," with plans to roll it into certain Copilot text use cases. That's a clear marker of intent: Microsoft is building models, and it's doing it at a meaningful scale, on the same hardware reality as everyone else.

Microsoft's new Maia 200 chip is positioned as an inference accelerator "engineered to dramatically improve the economics of AI token generation" -- or, essentially, to take aim at Nvidia $NVDA's software, pairing custom silicon with a software package meant to loosen CUDA's grip. Inference is where the bills stack up -- and where hyperscalers most want leverage.

Meanwhile, Microsoft is widening its menu on purpose, hosting models from xAI, Meta $META, Mistral, and Black Forest Labs in its data centers. It has also been willing to use Anthropic models in Microsoft 365 Copilot experiences after internal testing found them better for certain Office tasks, a shift that even involved paying AWS for access. So yes, Microsoft wants to be the place where every winning model runs -- and it wants at least one winner to have a Microsoft badge on it.
[2]
Microsoft eyes AI 'self-sufficiency' after OpenAI deal rejig
Microsoft's (MSFT) AI head Mustafa Suleyman said that the company is pursuing "true self-sufficiency" in AI by developing its own powerful models and reducing its dependence on OpenAI (OPENAI), The Financial Times reported. Suleyman told the news outlet that Microsoft aims for "true self-sufficiency" with advanced, internally developed AI to capture more of the enterprise market and reduce reliance on external partners like OpenAI. Investors worry large AI infrastructure spending may be creating a bubble, negatively impacting big tech stock performance, despite long-term expectations for revenue growth. Microsoft is accelerating its in-house AI development alongside investments in external companies to compete for enterprise AI deals, even as rivals like Anthropic lead in key areas.
Microsoft AI chief Mustafa Suleyman revealed the company is developing its own advanced foundation models to reduce reliance on OpenAI. The tech giant previewed MAI-1-preview, trained on ~15,000 NVIDIA H100 GPUs, and launched Maia 200 chips for inference. The shift aims to secure a larger share of the enterprise market while maintaining the OpenAI partnership through 2032.
Microsoft is accelerating its push toward AI self-sufficiency, signaling a strategic shift that could reshape how the tech giant positions itself in the enterprise AI market. Mustafa Suleyman, Microsoft's AI chief, told the Financial Times that the company is developing its own advanced foundation models and working to reduce reliance on OpenAI, even as their partnership remains intact through 2032 [1]. The October 2025 reset with OpenAI preserved core benefits -- Microsoft retains Azure API exclusivity and IP rights, including access to models "post-AGI" -- but the company is clearly buying itself more room to maneuver [1].
The company isn't just talking about independence. In August 2025, Microsoft AI previewed MAI-1-preview, describing it as "an in-house mixture-of-experts model" that was "pre-trained and post-trained on ~15,000 NVIDIA H100 GPUs" [1]. Plans call for integrating it into certain Copilot text use cases, marking a clear signal that Microsoft is developing powerful in-house AI models at meaningful scale. This effort addresses a vulnerability that becomes harder to explain on earnings calls: when your flagship AI product sits inside Microsoft 365, single-supplier dependence starts looking like a strategic risk [1].
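For context, a mixture-of-experts layer routes each token to only a small subset of "expert" networks, which is how such models grow total parameter count without growing per-token compute. The sketch below is a generic illustration of that routing idea only; the expert count, dimensions, and top-k value are assumptions for demonstration, not details of MAI-1-preview.

```python
# Minimal mixture-of-experts (MoE) routing sketch. Purely illustrative;
# nothing here reflects Microsoft's actual model architecture.
import numpy as np

rng = np.random.default_rng(0)

D_MODEL = 8      # hidden size (hypothetical)
N_EXPERTS = 4    # number of expert blocks (hypothetical)
TOP_K = 2        # experts activated per token (hypothetical)

# Each "expert" is just a weight matrix standing in for a feed-forward block.
# Only TOP_K experts run per token, which is the point of sparse MoE layers.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(N_EXPERTS)]
router = rng.standard_normal((D_MODEL, N_EXPERTS))   # learned gating weights

def moe_layer(token: np.ndarray) -> np.ndarray:
    logits = token @ router                 # score every expert for this token
    top = np.argsort(logits)[-TOP_K:]       # indices of the TOP_K best-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                # softmax over the selected experts only
    # Combine just the selected experts' outputs, weighted by the gate.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(D_MODEL)
print(moe_layer(token))                     # one token routed through the sparse layer
```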
Microsoft's new Maia 200 chip represents another pillar of this self-sufficiency strategy. Positioned as an inference accelerator "engineered to dramatically improve the economics of AI token generation," the chip takes direct aim at NVIDIA's software dominance by pairing custom silicon with software designed to loosen CUDA's grip [1]. Inference is where costs accumulate rapidly, and where hyperscalers most want leverage over their infrastructure spending.
Microsoft is simultaneously widening its model portfolio, hosting offerings from xAI, Meta, Mistral, and Black Forest Labs in its data centers [1]. The company has even paid AWS for access to Anthropic models after internal testing found them superior for certain Office tasks within Microsoft 365 Copilot experiences [1]. This multi-model approach aims to secure a larger share of the enterprise market by ensuring Microsoft can offer whatever performs best, while keeping compute, security, and billing under its control.
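One way to picture this "route and replace" posture is a thin dispatch layer that keeps the product surface stable while the backing model can be swapped underneath. The sketch below is purely illustrative; the backend functions and the task-to-model mapping are assumptions, not Microsoft's actual routing.

```python
# Hypothetical model-routing layer: swapping a model means editing the table,
# not the product. Names and behavior are invented for illustration.
from typing import Callable, Dict

def call_frontier_partner(prompt: str) -> str:
    return f"[frontier-partner model answer to: {prompt}]"

def call_in_house(prompt: str) -> str:
    return f"[in-house model answer to: {prompt}]"

def call_anthropic(prompt: str) -> str:
    return f"[Anthropic-model answer to: {prompt}]"

# Assumed routing table: which backend currently serves which product surface.
ROUTES: Dict[str, Callable[[str], str]] = {
    "chat": call_frontier_partner,
    "copilot_text": call_in_house,
    "office_tasks": call_anthropic,
}

def complete(task: str, prompt: str) -> str:
    """Dispatch to whichever model currently backs this task."""
    backend = ROUTES.get(task, call_frontier_partner)
    return backend(prompt)

print(complete("office_tasks", "Summarize this spreadsheet"))
```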
Investors are watching closely as concerns mount that AI infrastructure spending may be creating a bubble, negatively impacting big tech stock performance despite long-term revenue growth expectations [2]. Microsoft is accelerating in-house development alongside investments in external companies to compete for enterprise AI deals, even as rivals like Anthropic lead in key areas [2]. The real prize for Microsoft: making sure that no matter which foundation models dominate next quarter, the underlying infrastructure remains Microsoft-shaped [1].
Summarized by Navi