54 Sources
[1]
OpenAI signs massive AI compute deal with Amazon
On Monday, OpenAI announced it has signed a seven-year, $38 billion deal to buy cloud services from Amazon Web Services to power products like ChatGPT and Sora. It's the company's first big computing deal after a fundamental restructuring last week that gave OpenAI more operational and financial freedom from Microsoft. The agreement gives OpenAI access to hundreds of thousands of Nvidia graphics processors to train and run its AI models. "Scaling frontier AI requires massive, reliable compute," OpenAI CEO Sam Altman said in a statement. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." OpenAI will reportedly use Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. Amazon plans to roll out hundreds of thousands of chips, including Nvidia's GB200 and GB300 AI accelerators, in data clusters built to power ChatGPT's responses, generate AI videos, and train OpenAI's next wave of models. Wall Street apparently liked the deal, because Amazon shares hit an all-time high on Monday morning. Meanwhile, shares of long-time OpenAI investor and partner Microsoft briefly dipped following the announcement.
Massive AI compute requirements
It's no secret that running generative AI models for hundreds of millions of people currently requires a lot of computing power. Amid chip shortages over the past few years, finding sources of that computing muscle has been tricky. OpenAI is reportedly working on its own GPU hardware to help alleviate the strain. But for now, the company needs to find new sources of Nvidia chips, which accelerate AI computations. Altman has previously said that the company plans to spend $1.4 trillion to develop 30 gigawatts of computing resources, roughly enough to power 25 million US homes, according to Reuters. Altman has also said that eventually, he would like OpenAI to add 1 gigawatt of compute every week. That ambitious plan is complicated by the fact that one gigawatt of power is roughly equivalent to the output of one typical nuclear power plant, and Reuters reports that each gigawatt of compute build-out currently comes with a capital cost of over $40 billion. These aspirational numbers are far beyond what long-time cloud partner Microsoft can provide, so OpenAI has been seeking further independence from its wealthy corporate benefactor. OpenAI's restructuring last week moved the company further from its nonprofit roots and removed Microsoft's right of first refusal to supply compute services in the new arrangement. Even before last week's restructuring deal with Microsoft, OpenAI had been forced to look elsewhere for computing power: The firm made a deal with Google in June to supply it with cloud services, and the company struck a deal in September with Oracle to buy $300 billion in computing power for about five years. But it's worth noting that Microsoft's compute power is still essential for the firm: Last week, OpenAI agreed to purchase $250 billion of Microsoft's Azure services over time. While these types of multi-billion-dollar deals seem to excite investors in the stock market, not everything is hunky dory in the world of AI at the moment. OpenAI's annualized revenue run rate is expected to reach about $20 billion by year's end, Reuters notes, and losses in the company are also mounting.
Surging valuations of AI companies, oddly circular investments, massive spending commitments (which total more than $1 trillion for OpenAI), and the potential that generative AI might not be as useful as promised have prompted ongoing speculation among critics and proponents alike that the AI boom is turning into a massive bubble. Meanwhile, Reuters has reported that OpenAI is laying the groundwork for an initial public offering that could value the company at up to $1 trillion. Whether that prospective $1 trillion valuation makes sense for a company burning through cash faster than it can make it back is another matter entirely.
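As a rough cross-check of the figures cited above (a roughly $1.4 trillion plan for 30 gigawatts, a reported capital cost of over $40 billion per gigawatt, and enough power for about 25 million US homes), the short sketch below simply multiplies them out. It is purely illustrative and uses only the approximate numbers reported here, not any additional data.

```python
# Back-of-the-envelope check of the compute build-out figures reported above.
# All inputs are the approximate numbers cited in this article (via Reuters),
# not precise values.

planned_gigawatts = 30          # OpenAI's stated 30 GW compute target
cost_per_gw = 40e9              # reported capital cost: over $40 billion per GW
stated_commitment = 1.4e12      # Altman's stated ~$1.4 trillion figure

implied_capex = planned_gigawatts * cost_per_gw
homes_per_gw = 25_000_000 / planned_gigawatts   # 25M US homes spread across 30 GW

print(f"Implied capex at $40B+/GW: ${implied_capex / 1e12:.1f}T+ "
      f"(vs. stated ~${stated_commitment / 1e12:.1f}T)")
print(f"Homes powered per gigawatt: ~{homes_per_gw:,.0f}")
```

The implied floor of $1.2 trillion-plus sits just below the roughly $1.4 trillion figure Altman has cited, and the derived figure of about 830,000 homes per gigawatt is in the same ballpark as the 750,000-homes-per-gigawatt estimate quoted elsewhere in this collection.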
[2]
OpenAI and Amazon ink $38B cloud computing deal | TechCrunch
OpenAI isn't done securing the AI infrastructure it needs to rapidly scale agentic workloads. The ChatGPT-maker on Monday said it has reached a deal with Amazon to buy $38 billion in cloud computing services over the next seven years. OpenAI said it will immediately start using AWS compute, with all capacity targeted to be deployed before the end of 2026 and the ability to expand further into 2027 and beyond. The deal follows OpenAI's restructuring last week, which freed the company from having to secure Microsoft's approval to buy computing services from other firms. OpenAI's deal with Amazon is part of its larger mission to grow computing power, spending more than $1 trillion over the next decade. The company has announced new data center buildouts with Oracle, SoftBank, the United Arab Emirates, and others. OpenAI has also secured deals with chipmakers Nvidia, AMD, and Broadcom. To some analysts, the increased investment from OpenAI and other tech giants signals that the industry is heading towards an AI bubble, wherein massive sums are spent beefing up an unproven, and potentially dangerous, technology with no clear sign of a meaningful return on investment.
[3]
OpenAI Signs $38 Billion Deal With Amazon
OpenAI has signed a multi-year deal with Amazon to buy $38 billion worth of AWS cloud infrastructure to train its models and serve its users. The deal is yet another sign of the AI industry becoming increasingly entangled, with OpenAI now at the center of major partnerships with industry players including Google, Oracle, Nvidia, and AMD. The AWS agreement is also notable because OpenAI rose to prominence in part through its partnership with Microsoft -- Amazon's biggest cloud rival. Amazon is also a major backer of one of OpenAI's key competitors, Anthropic. Amazon and Microsoft are currently developing their own AI models to compete with startups like OpenAI. Many now worry that the race to build ever-more infrastructure -- and the unusual financial agreements behind the deals -- are a sign of an AI bubble. Between 2026 and 2027, companies are projected to spend upwards of $500 billion on AI infrastructure in the US, according to reporting by financial journalist Derek Thompson. Patrick Moorhead, chief analyst at Moor Insights & Strategy, says he believes that big tech companies and AI startups have a genuine need for more capacity and see a path to turn compute into profit. He adds that the new deal shows that Amazon is not such a laggard in AI after all. "Many people said they were down and out, but they just put $38 billion up on the board, right, which is pretty exceptional," he says. Moorhead adds that OpenAI's strategy is to limit its dependence on any one cloud provider. "OpenAI is deploying with pretty much everybody at this point," he says. Amazon said in its announcement that it is building custom infrastructure for OpenAI. The setup features two kinds of Nvidia chips, GB200s and GB300s, which Amazon said will be used for both training and inference. The company also said the deal would provide OpenAI with access to "hundreds of thousands of state-of-the-art NVIDIA GPUs, with the ability to expand to tens of millions of CPUs to rapidly scale agentic workloads." OpenAI and other AI players appear to believe that agentic AI will become increasingly important as more users adopt AI tools to navigate the web. "Scaling frontier AI requires massive, reliable compute," OpenAI cofounder and CEO Sam Altman said in the announcement. OpenAI said last week that it would adopt a new for-profit structure that should allow it to raise more money. While the company is still controlled by a nonprofit, its for-profit arm has become a public-benefit corporation.
[4]
OpenAI bets $38B on Amazon's cloud to power ChatGPT and the future of AGI
Amazon shares reportedly hit an all-time high following the news. OpenAI has entered into a multi-year strategic partnership with Amazon Web Services, enabling it to tap into the cloud services provider's infrastructure and compute resources as it ramps up its efforts to build artificial general intelligence (AGI). Valued at $38 billion, the deal will go into effect "immediately," according to a Monday blog post from AWS, and is expected to scale up over the next seven years. The partnership, announced less than a week after OpenAI finalized its restructuring, under which its nonprofit parent now controls a for-profit subsidiary, will give the company access to hundreds of thousands of Nvidia GB200 and GB300 semiconductors running on Amazon data clusters. The new organizational structure also means OpenAI has more leeway in its relationship with Microsoft, which, from 2019 to 2023, was its sole source of computing power. The revised agreement between the two companies now stipulates that OpenAI can partner with any cloud provider. In addition to its new partnership with AWS, OpenAI has reportedly teamed up with Google, gaining access to the tech giant's cloud computing resources. (Disclosure: Ziff Davis, ZDNET's parent company, filed an April 2025 lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.) OpenAI's agreement to run its model training and inference processes on AWS marks a major moment for Amazon's ambitions to become the leading cloud services provider for large-scale AI projects. Amazon's shares hit an all-time high on Monday following the news of the OpenAI partnership, according to Reuters. "AWS has unusual experience running large-scale AI infrastructure securely, reliably, and at scale -- with clusters topping 500K chips," the tech company wrote in its blog post. "AWS's leadership in cloud infrastructure combined with OpenAI's pioneering advancements in generative AI will help millions of users continue to get value from ChatGPT." AWS also announced earlier this year that OpenAI's open-weight models -- gpt-oss-120b and gpt-oss-20b -- had been added to Amazon Bedrock, making them available to millions of AWS customers looking to build their own, custom models for tasks like coding and scientific analysis. The race among tech developers to build new AI systems has driven growing energy demand, prompting many of those companies to invest vast sums in GPUs and the construction of new data centers. In September, OpenAI announced it would open five new data centers distributed across Texas, New Mexico, Ohio, and an undisclosed location in the Midwest, which would cumulatively provide up to seven gigawatts of power -- well over half of the total energy supply the company aims to secure by the end of this year through Project Stargate, the AI infrastructure plan OpenAI launched in January alongside the Trump administration, Oracle, SoftBank, and Emirati investment firm MGX. Those industry-wide efforts to secure ever-expanding reserves of power for AI have also made Nvidia the wealthiest company in the world (by a significant margin) as it's cemented something close to a monopoly over the GPU market.
Hoping to reduce its reliance on Nvidia, OpenAI will reportedly begin producing its own proprietary chips sometime next year in partnership with American semiconductor manufacturer Broadcom.
[5]
OpenAI strikes $38 billion AI training deal with Amazon
OpenAI has struck a $38 billion deal with Amazon Web Services that will give the AI giant access to "hundreds of thousands" of Nvidia GPUs to power its AI models. The seven-year partnership comes as Microsoft continues to loosen its grip on OpenAI, dropping its status as its exclusive cloud provider and losing its right of first refusal to host its AI workloads. OpenAI will "immediately" start using AWS compute to train its AI models, according to the press release, with "all capacity targeted to be deployed before the end of 2026, and the ability to expand further into 2027 and beyond." Last week, OpenAI completed its for-profit restructuring and announced a new deal with Microsoft that will give the company the rights to OpenAI's technology until it reaches artificial general intelligence (AGI). Under the agreement, OpenAI can work with third parties to develop some AI products and release certain open weight models. The new deal also includes OpenAI's commitment to purchase $250 billion of Microsoft's Azure services, dwarfing the $38 billion it will pay Amazon. OpenAI also has a reported $300 billion contract with Oracle, while Amazon continues to pour billions of dollars into the AI startup Anthropic.
[6]
OpenAI signs $38B cloud computing deal with AWS
Amazon deal still dwarfed by $250B Azure commitment made as part of OpenAI's for-profit transformation
OpenAI has signed a seven-year, $38 billion agreement with Amazon Web Services, adding another hyperscaler alongside Microsoft Azure for its growing AI compute needs. It's been less than a week since OpenAI completed its for-profit restructuring that saw Microsoft lose its ability to retain a stranglehold on the AI firm's computing resources, and Altman and company have already inked a $38 billion deal with AWS to start data crunching in another cloud. Prior to the restructuring that closed last Tuesday, Microsoft retained a right of first refusal if OpenAI were to take its cloud computing needs elsewhere, but that portion of the pair's partnership vanished with the AI giant formalizing its transition into a for-profit public benefit corporation. The deal will see OpenAI get access to AWS EC2 UltraServers "starting immediately," per a joint press release from the pair. OpenAI's compute capacity on AWS will scale up over the next seven years, ultimately expanding "to tens of millions of CPUs" that the company will use "to rapidly scale agentic workloads." "The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI's vast AI workloads," AWS CEO Matt Garman said of the new partnership. Along with access to existing AWS resources, OpenAI will also be getting some custom infrastructure deployed for its use in the coming years that will involve some good old-fashioned Nvidia GPUs. Per the press release, AWS is building "a sophisticated architectural design optimized for maximum AI efficiency and performance" using clusters of Nvidia Blackwell GB200 and GB300 chips. Those Nvidia GPUs will be clustered on the same network as EC2 UltraServers, AWS added. This announcement comes on the heels of not only OpenAI going for-profit, but also AWS announcing its own rival platform to OpenAI's Stargate project, which aims to build a massive amount of compute power for AI processing with Oracle, Nvidia, and SoftBank among OpenAI's partners. While Stargate's launch has been sluggish, AWS' Project Rainier is already up and running, which AWS credited to its total control of its own tech stack, from chips to datacenter design. As was the case last week, AWS credited its "leadership in cloud infrastructure" as one of the reasons OpenAI chose it as its newest cloud computing partner. While this partnership is a sizable one, we're told it won't include any commingling of OpenAI compute and Project Rainier, which Amazon told The Register is entirely separate from its deal with OpenAI. That, and it's not like Sam Altman is walking away from Microsoft entirely: Redmond might have let slip that OpenAI lost an estimated $11.5 billion last quarter, but that was after Microsoft already secured a 27 percent share in OpenAI's new public benefit corporation as well as a commitment from the AI upstart to spend $250 billion on Azure to continue fueling the bulk of its cloud computing needs. Compared to that, it feels more like OpenAI is having a bit of a teenage moment, symbolically stretching its legs and testing its independence while still staying reliant on papa Satya. ®
[7]
OpenAI strikes $38bn computing deal with Amazon
OpenAI has signed a $38bn deal with Amazon Web Services, the latest in a string of agreements struck by the lossmaking start-up as it races to secure computing power. The seven-year deal, which takes OpenAI's total recent commitments to close to $1.5tn, will allow OpenAI to make immediate use of AWS infrastructure to run its products, including ChatGPT. The arrangement ties OpenAI more closely to Amazon, which has committed to invest $8bn in rival AI group Anthropic. It also lessens OpenAI's dependence on Microsoft, its biggest backer, for computing power. Amazon's share price rose about 4 per cent in pre-market trading. OpenAI has struck a series of massive deals this year with companies including Nvidia, AMD, Oracle, Broadcom, Google and Samsung. These contracts commit the company to spending close to $1.5tn on computing resources, but they are structured such that OpenAI pays in increments as new power is developed and delivered. The deals have largely been put together by a small team within OpenAI, with little input from external advisers. The company is planning an ambitious build-out of data centre infrastructure as it seeks to position itself at the centre of a nexus of companies developing the new technology. Chief executive Sam Altman has said he wants to be able to add 1 gigawatt a week of new capacity from 2030, an amount of power roughly equivalent to the output of a nuclear power plant. Experts question the feasibility of developing so much new capacity so quickly, as well as OpenAI's ability to pay for it. The company's revenue has surged to $13bn on an annualised basis, and Altman said this week that it could hit $100bn by 2027. But OpenAI's voracious appetite for computing power pushed it to a loss of around $12bn in the past quarter alone, according to financial disclosures by Microsoft last week. In an interview with investor Brad Gerstner this weekend, Altman said he was confident the company would grow to meet its commitments. "We are taking a forward bet that [revenue] is going to continue to grow and that not only will ChatGPT keep growing, but we will be able to become one of the important AI clouds, that our consumer device business will be a significant and important thing, that AI that can automate science will create huge value," he said. The start-up completed a long-awaited restructure last week, allowing investors to hold equity in the company for the first time and clearing the path for an initial public offering. OpenAI has raised around $60bn to date, and executives at the company anticipate a more conventional corporate structure will make future fundraising more straightforward. As part of the restructure, OpenAI also rewrote a contract that gave Microsoft a right of first refusal on new cloud contracts, paving the way for the AWS deal.
[8]
OpenAI and Amazon sign $38B deal for AI computing power
SEATTLE (AP) -- OpenAI and Amazon have signed a $38 billion deal that enables the ChatGPT maker to run its artificial intelligence systems on Amazon's cloud computing services. OpenAI will get access to "hundreds of thousands" of Nvidia's specialized AI chips through Amazon Web Services as part of the deal announced Monday. The agreement comes less than a week after OpenAI altered its partnership with its longtime backer Microsoft, which is no longer the startup's exclusive cloud provider. California and Delaware regulators also last week allowed OpenAI, which was founded as a nonprofit, to move forward on its plan to form a new business structure to more easily raise capital and make a profit. "The rapid advancement of AI technology has created unprecedented demand for computing power," Amazon said in a statement Monday. It said OpenAI "will immediately start utilizing AWS compute as part of this partnership, with all capacity targeted to be deployed before the end of 2026, and the ability to expand further into 2027 and beyond."
[9]
OpenAI signs $38 billion Amazon Web Services deal for Nvidia-powered cloud compute
What just happened? OpenAI is scaling its compute infrastructure beyond Microsoft with a $38 billion capacity purchase from Amazon Web Services, marking its first direct partnership with the largest player in global cloud infrastructure. The deal, announced Monday, establishes OpenAI as a new AWS customer and underscores its move toward a multi-cloud strategy as it readies for a potential public offering. The agreement allows OpenAI to begin deploying workloads on AWS immediately, using hundreds of thousands of Nvidia GPUs across US data centers, with additional capacity expected to come online over several years. Until recently, Microsoft held exclusive rights to host OpenAI's cloud services. The two companies signed their first investment deal in 2019, and Microsoft has since invested a cumulative $13 billion. That exclusivity ended in January, when Microsoft agreed to a non-exclusive framework granting it the right of first refusal on new cloud workloads. That preferential status expired last week, giving OpenAI more flexibility to partner with other hyperscale providers. The AWS deal follows a series of infrastructure commitments totaling roughly $1.4 trillion with companies including Nvidia, Broadcom, Oracle, and Google. Collectively, those contracts mark the most aggressive capacity expansion effort in the history of the AI sector, raising questions about how much energy and processing power can realistically be delivered in the next decade. For Amazon, the OpenAI partnership arrives at a strong financial moment for its cloud division. Amazon reported over 20% year-over-year revenue growth at AWS in its most recent earnings, beating Wall Street forecasts. That's below the growth rates at Microsoft Azure and Google Cloud, which reported 40% and 34% expansion, respectively, but still enough to maintain AWS's leadership in total cloud market share. Dave Brown, Amazon's vice president of compute and machine learning services, said the OpenAI deal involves reserved capacity wholly separate from AWS's existing commitments. "It's completely separate capacity that we're putting down," Brown told CNBC. "Some of that capacity is already available, and OpenAI is making use of that." The initial phase will use existing AWS data centers. Amazon plans to build out additional infrastructure dedicated to OpenAI's workloads, extending its commitment beyond current facilities. The agreement centers on Nvidia's latest Blackwell GPUs. AWS will supply the compute power to train future frontier models and support real-time inference - including ChatGPT responses - through its Elastic Compute Cloud. While the two companies have not yet disclosed plans involving Amazon's own Trainium chips, Brown noted that AWS has seen clients such as Anthropic adopt Trainium for advanced workloads. "We like Trainium because we're able to give customers better price performance and honestly give them choice," Brown said, while declining to confirm whether OpenAI would use the chip. For OpenAI, the deal represents a shift away from total dependence on Microsoft Azure and a move toward operational independence. The partnership also strengthens OpenAI's presence on AWS's Bedrock platform, where its foundation and open-weight models are already accessible to enterprise users through managed services.
Companies such as Peloton, Thomson Reuters, Comscore, and Triomics use those models for code generation, analytical workloads, and scientific applications. OpenAI CEO Sam Altman said that the firm needs multiple partners to meet growing compute demands. "Scaling frontier AI requires massive, reliable compute," Altman said. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone."
[10]
OpenAI Signs $38 Billion Cloud Computing Deal With Amazon
After signing deals to buy hundreds of billions of dollars in computing power from the chipmakers Nvidia and AMD and the cloud computing company Oracle, OpenAI has reached a multi-billion-dollar agreement with the tech giant Amazon. OpenAI said on Monday that it had agreed to purchase $38 billion in cloud computing services from Amazon over the next seven years. The additional computing power will help OpenAI build and deploy its artificial intelligence technologies, including the chatbot ChatGPT. From 2019 to 2023, OpenAI purchased all of its computing power from Microsoft as part of a complex partnership between the two companies. Their contract stipulated that OpenAI must buy solely from Microsoft, its primary investor, unless Microsoft gave approval for computing deals with other companies. Over the past 18 months, as OpenAI complained that it was not able to get all the computing power it needed from Microsoft, the tech giant allowed the start-up to sign separate deals with two other cloud companies. Then, last week, Microsoft and OpenAI renegotiated their contract, so that OpenAI was free to purchase services from any cloud computing company without Microsoft's approval. Coming just days later, OpenAI's $38 billion agreement with Amazon is part of the A.I. company's rapid effort to expand the already enormous pool of computing power that fuels its ambitions. OpenAI is also working to build new computer data centers with Oracle, the Japanese conglomerate SoftBank, the United Arab Emirates and others. To help fund the construction of the data centers, the company has also inked a complex set of deals with chipmakers Nvidia, AMD and Broadcom. As OpenAI secures hundreds of billions of dollars in additional computing power, it is racing to keep pace with many of the tech industry's largest companies. Over the last year, Amazon, Google, Meta and Microsoft committed a total of more than $360 billion in capital expenditures. As both OpenAI and these tech giants increase their spending, some financial analysts and tech historians are concerned that the tech industry is heading toward a dangerous bubble. Artificial intelligence remains an unproven and expensive technology that could take years to mature. Though OpenAI, the market leader, is pulling in billions of dollars in yearly revenue, it is not profitable. (The Times has sued OpenAI and Microsoft, claiming copyright infringement of news content related to A.I. systems. The two companies have denied the suit's claims.)
[11]
ChatGPT owner OpenAI signs $38bn cloud computing deal with Amazon
OpenAI has signed a $38bn (£29bn) contract with Amazon to access its cloud computing infrastructure as it continues its run of major partnerships. In 2025, the ChatGPT maker has signed deals worth more than $1tn with Oracle, Broadcom, AMD and chip-making giant Nvidia. As part of the seven-year agreement, OpenAI will gain access to Nvidia graphics processors to train its artificial intelligence models. The deal follows a sweeping restructure of OpenAI last week which saw it convert away from being a non-profit and changed its relationship with Microsoft to give OpenAI more operational and financial freedom.
[12]
OpenAI's $38B cloud deal with Amazon takes ChatGPT maker further beyond Microsoft
ChatGPT maker OpenAI, exercising newfound freedom under its renegotiated Microsoft partnership, will expand its cloud footprint for training and running AI models to Amazon's infrastructure under a new seven-year, $38 billion agreement. The deal, announced Monday, positions Amazon as a major infrastructure provider for Microsoft's flagship AI partner, highlighting seemingly insatiable demand for computing power and increasingly complex alliances among big companies seeking to capitalize on AI. It comes as Microsoft, Amazon, and other big tech companies attempt to reassure investors who've grown concerned about a possible bubble in AI spending and infrastructure investment. Under its new Amazon deal, OpenAI is slated to begin running AI workloads on Amazon Web Services' new EC2 UltraServers, which use hundreds of thousands of Nvidia GPUs. Amazon says the infrastructure will help to run ChatGPT and train future OpenAI models. Amazon shares rose nearly 5% in early trading after the announcement. "Scaling frontier AI requires massive, reliable compute," said OpenAI CEO Sam Altman in the press release announcing the deal. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." Matt Garman, the AWS CEO, said in the release that Amazon's cloud infrastructure will serve as "a backbone" for OpenAI's ambitions. In an interview with CNBC, Dave Brown, Amazon's vice president of compute and machine learning services, said the new agreement represents "completely separate capacity" that AWS is building out for OpenAI. "Some of that capacity is already available, and OpenAI is making use of that," Brown told CNBC. Amazon has also been deepening its investment in AI infrastructure for Anthropic, the rival startup behind the Claude chatbot. Amazon has invested and committed a total of $8 billion in Anthropic and recently opened Project Rainier, an $11 billion data center complex for Anthropic's workloads, running on hundreds of thousands of its custom Trainium 2 chips. Microsoft has been expanding its own relationship with Anthropic, adding the startup's Claude models to Microsoft 365 Copilot, GitHub Copilot, and its Azure AI Foundry platform. Up to this point, OpenAI has relied almost exclusively on Microsoft Azure for the computing infrastructure behind its large language models. The new deal announced by Microsoft and OpenAI last week revised that relationship, giving OpenAI more flexibility to use other cloud providers -- removing Microsoft's right of first refusal on new OpenAI workloads. At the same time, OpenAI committed to purchase an additional $250 billion in Microsoft services. Microsoft still holds specific IP rights to OpenAI's models and products through 2032, including the exclusive ability among major cloud platforms to offer OpenAI's technology through its Azure OpenAI Service. OpenAI's new $38 billion deal with Amazon builds on a relationship that began earlier this year, when Amazon added OpenAI's first open-weight models in five years to its Bedrock and SageMaker services. Released under an open-source license, those models weren't bound by OpenAI's exclusive API agreement with Microsoft, letting Amazon offer them on its platforms. The latest announcement is part of a series of deals by OpenAI in recent months with companies including Oracle and Google -- committing hundreds of billions of dollars overall for AI computing capacity, and raising questions about the long-term economics of the AI boom.
[13]
OpenAI Inks First Multi-Billion Dollar Deal With Amazon
OpenAI just inked another multibillion-dollar deal. The AI giant will pay Amazon $38 billion over the next seven years to gain access to the company's AI infrastructure, the two companies announced on Monday. OpenAI will use Amazon's data centers (which themselves rely on Nvidia chips) to train new AI models and serve inference for ChatGPT. The deal might also expand to include millions of central processing units to power OpenAI's agentic AI ambitions, Amazon said in a press release. All of the targeted capacity will be deployed by the end of next year and could expand further after 2027. The computing deal, which is the first partnership between OpenAI and Amazon's cloud infrastructure arm AWS, comes on the heels of OpenAI's recapitalization last week, which saw the AI giant finally become for-profit. Under the new structure, Microsoft gave up its right of first refusal to be OpenAI's computing supplier, but will continue to have intellectual property rights to OpenAI models through 2032 and will hold a 27% stake. Following the restructuring announcement, Reuters reported that OpenAI was planning for an IPO as early as next year. The newly minted AWS-OpenAI partnership is also just the latest in a substantial dealmaking spree surrounding Sam Altman's company. All within the past few months, the AI giant has signed a $300 billion data center partnership with Oracle, a $10 billion custom in-house AI chips deal with Broadcom, and expanded its CoreWeave deal to $22 billion. On top of that, OpenAI scored a $100 billion investment by Nvidia, grabbed a 10% stake in chipmaker AMD, and agreed to purchase $250 billion of Microsoft's Azure services. All in all, OpenAI's computing deals so far have topped $1 trillion, according to a report by the Financial Times. The AI industry has become one infinitely expanding and tangled web of multibillion-dollar investments, made by a handful of giant tech companies with overlapping interests. At the heart of this web are the two companies that have become synonymous with what industry analysts call the "AI revolution": Nvidia and OpenAI. With each deal, the tech giants inject more money into the system, enjoy a boost in share prices and hit market capitalizations that have never even been dreamed of before. Just last week, less than a day after announcing a series of partnerships worth billions of dollars, Nvidia became the only company in history to be worth $5 trillion. If all goes according to plan, these circular investments could continue to build up the market and U.S. economic growth. But if AI breakthroughs slow down, or demand for any reason doesn't pan out the way these companies have imagined it would, it creates huge risks for the global financial system. The system is so tightly wound right now that no mistake will be isolated: if one company goes down, then they all go down together. That has only supercharged fears of an AI bubble for skeptics. If AI adoption is limited, as skeptics fear might be the case, then these circular deals could be considered "round-tripping," analyst Rishi Jaluria told Gizmodo last month, aka making trades to artificially prop up an asset and make it seem more valuable and in demand than it actually is. For now, there is no consensus on how AI demand is going to scale, so we will have to wait and see what the true impact of this trillions-worth dealmaking spree is going to be.
[14]
Inside the $38 billion OpenAI-AWS partnership
OpenAI can expand its compute usage further into 2027 under this agreement
The AI industry is advancing faster than any other technology in history, and its demand for computing power is immense. To meet this demand, OpenAI and Amazon Web Services (AWS) have entered a multi-year partnership which could reshape how AI tools are built and deployed. The collaboration, valued at $38 billion, gives OpenAI access to AWS's vast infrastructure to run and scale its most advanced artificial intelligence workloads. The deal grants OpenAI immediate access to AWS compute systems powered by Nvidia GPUs and Amazon EC2 UltraServers. These systems are designed to deliver high performance and low latency for demanding AI operations, including ChatGPT model training and inference. "Scaling frontier AI requires massive, reliable compute," said OpenAI co-founder and CEO Sam Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." AWS says the new architecture will cluster GPUs like the GB200 and GB300 within interconnected systems to ensure seamless processing efficiency across workloads. The infrastructure is expected to be fully deployed before the end of 2026, with room to expand further into 2027. "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for its AI ambitions," said Matt Garman, CEO of AWS. "The breadth and immediate availability of optimized compute show why AWS is uniquely positioned to support OpenAI's vast AI workloads." AWS's infrastructure, already known for its scalability in cloud hosting and web hosting, is expected to play a central role in the partnership's success. The data centers handling OpenAI workloads will use tightly connected clusters capable of managing hundreds of thousands of processing units. Everyday users may soon notice faster, more responsive AI tools powered by stronger infrastructure behind ChatGPT and similar services. Developers and businesses could gain simpler, more direct access to OpenAI's models through AWS, making it easier to embed AI into apps and data systems. However, the ability to expand this to tens of millions of CPUs raises both technical possibilities and logistical questions about cost, sustainability, and long-term efficiency. This rapid scale-up of compute resources could lead to rising energy use and higher costs for maintaining such vast systems. Also, concentrating AI development under major cloud providers could heighten concerns about dependency, control, and reduced competition. OpenAI and AWS have been in business together for a while. Earlier in the year, OpenAI made its foundation models available through Amazon Bedrock, allowing AWS users to integrate them into their existing systems. The availability of these models on a major cloud hosting platform meant more developers could experiment with generative AI tools for data analysis, coding, and automation. Companies such as Peloton, Thomson Reuters, and Verana Health are already using OpenAI models within the AWS environment to enhance their business workflows.
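This entry, like several others in this collection, notes that OpenAI's open-weight models are available to AWS customers through Amazon Bedrock. As a rough illustration of what that integration looks like in practice, here is a minimal sketch using boto3's Bedrock Runtime Converse API. The model ID and region below are placeholders rather than confirmed values, access to the model must be enabled in the calling account, and this is an illustrative sketch, not an official example from either company.

```python
# Minimal sketch: calling an OpenAI open-weight model hosted on Amazon Bedrock
# via the Converse API. Assumes AWS credentials are configured and the model is
# enabled in your account/region; the modelId and region below are placeholders,
# so check the Bedrock model catalog for the exact identifier.

import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")  # region is an assumption

response = client.converse(
    modelId="openai.gpt-oss-120b-1:0",  # placeholder ID for the gpt-oss-120b open-weight model
    messages=[
        {"role": "user", "content": [{"text": "Summarize the OpenAI-AWS deal in one sentence."}]}
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

# The Converse API returns a message with a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```

The same call shape works for any Bedrock-hosted chat model, which is part of the appeal for developers described above: swapping models is largely a matter of changing the modelId.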
[15]
OpenAI agrees to spend $38 billion on AWS
Why it matters: The deal with Amazon shows OpenAI is eager to boost its computing capacity from anyone who can provide it.
Driving the news: The deal starts immediately and "will have continued growth over the next seven years," the companies said in a statement.
* OpenAI will have access to hundreds of thousands of Nvidia GPUs with the option to also expand to tens of millions of other types of chips, known as CPUs -- the processor type most commonly powering PCs and phones.
* The move comes just days after OpenAI and Microsoft announced a revised partnership that allows OpenAI to obtain additional computing resources without first offering the business to Microsoft.
Reality check: OpenAI has committed to spending more than $1.4 trillion with partners over the next five years.
* The company has not explained how it intends to raise that money.
* Investors are growing increasingly nervous about the giant sums AI-related companies are pledging to spend -- often with each other -- without the revenue or path to profitability to match.
The intrigue: AWS has also invested $4 billion in OpenAI competitor Anthropic, signaling Amazon's push to anchor major AI startups on its infrastructure.
[16]
OpenAI signs $38bn cloud computing deal with Amazon
Agreement to use AWS datacentres, and Nvidia chips inside them, part of $1.4tn spending spree on AI infrastructure
OpenAI has signed a $38bn (£29bn) deal to use Amazon infrastructure to operate its artificial intelligence products, as part of a more than $1tn spending spree on computing power. The agreement with Amazon Web Services means OpenAI will be able to use AWS datacentres, and the Nvidia chips inside them, immediately. Last week, OpenAI's chief executive, Sam Altman, said his company had committed to spending $1.4tn on AI infrastructure, amid concerns over the sustainability of the boom in using and building datacentres. These are the central nervous systems of AI tools such as ChatGPT. "Scaling frontier AI requires massive, reliable compute," said Altman on Monday. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." OpenAI said the deal would give it access to hundreds of thousands of Nvidia graphics processors to train and run its AI models. Amazon plans to use the chips in data clusters that will power ChatGPT's responses and train OpenAI's next wave of models, the companies said. Matt Garman, the chief executive of AWS, said OpenAI continued to push the boundaries of what was possible and that Amazon's infrastructure would serve as a backbone for its ambitions. OpenAI is committed to developing 30 gigawatts of computing resources - roughly enough to power 25 million US homes. Last week, OpenAI said it had converted its main business into a for-profit corporation as part of a reorganisation that valued the startup at $500bn. Longtime backer Microsoft will have a roughly 27% stake in OpenAI's new for-profit corporation. However, the race for computing power by AI companies has raised concerns among some market watchers about how it will be paid for. OpenAI's annual revenue is about $13bn, according to the Financial Times, a figure dwarfed by its $1.4tn infrastructure commitment. Other datacentre deals signed by OpenAI include a $300bn agreement with US company Oracle. Altman hit back at the spending concerns during a podcast appearance with the Microsoft chief executive, Satya Nadella, saying "enough" to a question from the host, the US investor Brad Gerstner, about the gap between OpenAI's revenue and its infrastructure commitments. Altman said OpenAI made "well more" revenue than the reported $13bn, without specifying a number. He added: "I just - enough ... I think there are a lot of people who would love to buy OpenAI shares." Analysts at the US investment bank Morgan Stanley estimate that global spending on datacentres will reach nearly $3tn between now and 2028. It said half of that spending would be covered by the big US tech companies and the rest would come from sources such as the private credit market, a growing part of the shadow banking sector that is raising concerns at the Bank of England and elsewhere.
[18]
OpenAI signs $38 billion deal to power AI tools with 'hundreds of thousands' of Nvidia chips via Amazon Web Services | Fortune
OpenAI and Amazon have signed a $38 billion deal that enables the ChatGPT maker to run its artificial intelligence systems on Amazon's data centers in the U.S. OpenAI will be able to power its AI tools using "hundreds of thousands" of Nvidia's specialized AI chips through Amazon Web Services as part of the deal announced Monday. Amazon shares increased 4% after the announcement. The agreement comes less than a week after OpenAI altered its partnership with its longtime backer Microsoft, which until early this year was the startup's exclusive cloud computing provider. California and Delaware regulators also last week allowed San Francisco-based OpenAI, which was founded as a nonprofit, to move forward on its plan to form a new business structure to more easily raise capital and make a profit. "The rapid advancement of AI technology has created unprecedented demand for computing power," Amazon said in a statement Monday. It said OpenAI "will immediately start utilizing AWS compute as part of this partnership, with all capacity targeted to be deployed before the end of 2026, and the ability to expand further into 2027 and beyond." AI requires huge amounts of energy and computing power and OpenAI has long signaled that it needs more capacity, both to develop new AI systems and keep existing products like ChatGPT answering the questions of its hundreds of millions of users. It's recently made more than $1 trillion worth of financial obligations in spending for AI infrastructure, including data center projects with Oracle and SoftBank and semiconductor supply deals with chipmakers Nvidia, AMD and Broadcom. Some of the deals have raised investor concerns about their "circular" nature, since OpenAI doesn't make a profit and can't yet afford to pay for the infrastructure that its cloud backers are providing on the expectations of future returns on their investments. OpenAI CEO Sam Altman last week dismissed doubters he says have aired "breathless concern" about the deals. "Revenue is growing steeply. We are taking a forward bet that it's going to continue to grow," Altman said on a podcast where he appeared with Microsoft CEO Satya Nadella. Amazon is already the primary cloud provider to AI startup Anthropic, an OpenAI rival that makes the Claude chatbot.
[19]
OpenAI and Amazon sign $38 billion deal for AI computing power
OpenAI and Amazon have signed a $38 billion (€33bn) deal that enables the ChatGPT maker to run its artificial intelligence systems on Amazon's data centres in the US. As part of the deal, announced on Monday, OpenAI will be able to power its AI tools using "hundreds of thousands" of Nvidia's specialised AI chips through Amazon Web Services (AWS). Amazon shares rose 4% following the announcement. The agreement comes less than a week after OpenAI altered its partnership with longtime backer Microsoft, which until earlier this year had been the start-up's exclusive cloud computing provider. California and Delaware regulators also last week approved San Francisco-based OpenAI, originally founded as a non-profit, to move forward with plans to form a new business structure designed to raise capital more easily and operate for profit. "The rapid advancement of AI technology has created unprecedented demand for computing power," Amazon said in a statement on Monday. It added that OpenAI "will immediately start utilising AWS compute as part of this partnership, with all capacity expected to be deployed before the end of 2026, and the potential to expand further into 2027 and beyond." AI development requires enormous amounts of energy and computing capacity. OpenAI has long indicated that it needs more infrastructure both to develop new AI systems and to keep existing products, such as ChatGPT, running for its hundreds of millions of users. The company has recently committed over $1 trillion (€870bn) to AI-related spending, including data centre projects with Oracle and SoftBank and semiconductor supply agreements with Nvidia, AMD and Broadcom. Some of these deals have raised investor concerns about their "circular" nature, since OpenAI is not yet profitable and cannot currently pay for all the infrastructure its cloud partners provide, relying instead on expectations of future returns. OpenAI CEO Sam Altman dismissed these concerns last week, describing them as "breathless" speculation. "Revenue is growing steeply. We are taking a forward bet that it's going to continue to grow," Altman said on a podcast appearance alongside Microsoft CEO Satya Nadella. Amazon is already the primary cloud provider for AI start-up Anthropic, a rival of OpenAI that is developing the Claude chatbot.
[20]
OpenAI inks $38bn cloud deal with AWS after corporate restructure
OpenAI now has access to AWS' compute, made up of 'hundreds of thousands' of Nvidia chips. OpenAI has signed a $38bn deal with Amazon's cloud subsidiary Amazon Web Services (AWS) to run and scale its AI workloads. The partnership, effective immediately, is expected to run for seven years. Under the agreement, OpenAI will access AWS' compute, made up of "hundreds of thousands" of Nvidia GPUs, with the option of expanding to "tens of millions" of CPUs to rapidly scale its agentic workloads. "Scaling frontier AI requires massive, reliable compute," said OpenAI co-founder and CEO Sam Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." AWS' clusters, consisting of Nvidia GB200 and GB300 GPUs, will be used to train ChatGPT's next-generation models, the cloud giant said. OpenAI also uses Microsoft, Google, Oracle and CoreWeave for its cloud needs. In addition, the company has ongoing chip deals with Nvidia, Broadcom and AMD. The tech juggernaut has expensive plans to remain at the forefront of commercial AI. Last week, Altman said that his company has spent around $1trn on infrastructure so far. Meanwhile, Reuters exclusively reported that OpenAI is laying the groundwork for an IPO that could value it at $1trn. Moreover, the AWS partnership comes just after OpenAI reorganised its corporate structure, giving its nonprofit parent a $130bn stake and Microsoft a $135bn stake, and confirming its valuation at $500bn. "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions," said Matt Garman, CEO of AWS. "The breadth and immediate availability of optimised compute demonstrates why AWS is uniquely positioned to support OpenAI's vast AI workloads." The company said that OpenAI has become one of the most popular publicly available AI model providers in Amazon Bedrock, used by the likes of Bystreet, Comscore, Peloton and Thomson Reuters. Meanwhile, an AWS outage last month caused by a bug took down several dozen websites, disrupting major services, including banks, government websites, and many businesses.
[21]
OpenAI Chooses AWS in $38 Billion Deal to Run Core AI Workloads | AIM
OpenAI will begin using AWS compute immediately, with all capacity targeted for deployment before the end of 2026 and additional expansion planned through 2027 and beyond. Amazon Web Services (AWS) and OpenAI have announced a multi-year strategic partnership worth $38 billion to run and scale OpenAI's core artificial intelligence (AI) workloads on AWS infrastructure. The deal, which will expand over the next seven years, gives OpenAI access to AWS compute resources comprising hundreds of thousands of NVIDIA GPUs, with the potential to scale up to tens of millions of CPUs. Under the agreement, OpenAI will begin using AWS compute immediately, with all capacity targeted for deployment before the end of 2026 and additional expansion planned through 2027 and beyond. The partnership comes as demand for AI compute continues to surge globally. "Scaling frontier AI requires massive, reliable compute," said Sam Altman, co-founder and CEO of OpenAI. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." Matt Garman, CEO of AWS, said the company's infrastructure would serve as a foundation for OpenAI's growing operations. "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions," Garman said. "The breadth and immediate availability of optimised compute demonstrates why AWS is uniquely positioned to support OpenAI's vast AI workloads." The new infrastructure, featuring NVIDIA GB200 and GB300 GPUs connected via Amazon EC2 UltraServers, is designed to enable low-latency performance across large-scale clusters. These systems will support both inference for ChatGPT and training for next-generation models, with flexibility to adjust to OpenAI's evolving requirements. The collaboration extends the companies' earlier efforts to bring AI capabilities to enterprise customers. OpenAI's open-weight foundation models were recently made available on Amazon Bedrock, allowing AWS customers to integrate them into applications. Companies such as Bystreet, Comscore, Peloton, Thomson Reuters, Triomics, and Verana Health are already using OpenAI's models on AWS for agentic workflows, coding, scientific research, and data analysis. The deal adds to OpenAI's broader effort to secure massive compute resources, following recent partnerships with NVIDIA, AMD, and Broadcom that together represent more than 26 gigawatts of capacity and potential commitments surpassing $1 trillion.
[22]
OpenAI inks $38B AI infrastructure deal with AWS - SiliconANGLE
OpenAI Group PBC will rent $38 billion worth of cloud infrastructure from Amazon Web Services Inc. as part of a seven-year partnership announced today. The news comes less than a week after the ChatGPT developer revised the terms of its relationship with Microsoft Corp. The tech giant was OpenAI's sole cloud provider for about two years. Under the modified partnership, Microsoft no longer has the right of first refusal on OpenAI cloud contracts. The new deal with AWS will give the ChatGPT developer access to hundreds of thousands of Nvidia Corp. graphics processing units. According to the companies, OpenAI will use the chipmaker's latest GB200 and GB300 chips. Both processors combine two graphics processing units with one central processing unit. The GB200 is also available in a supersized version, the GB200 NVL4, that includes four Blackwell GPUs and two CPUs. The GB300, in turn, is based on Nvidia's newer Blackwell Ultra graphics card. A single Blackwell Ultra can provide about 15 petaflops of performance. The chips that AWS plans to put at the disposal of OpenAI will be deployed in Amazon EC2 UltraServers. The machines are based on a set of custom components called the AWS Nitro System. One of those components is the Nitro Security Chip, which offloads certain cybersecurity tasks from an UltraServer's main processors. OpenAI will start using AWS infrastructure immediately. The companies plan to deploy all the computing capacity included in the contract before the end of 2026. From 2027 and onwards, OpenAI will have the ability to further expand its AWS environment. "Scaling frontier AI requires massive, reliable compute," said OpenAI Chief Executive Officer Sam Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." OpenAI's largest cloud provider is Oracle Corp., which recently won a $300 billion infrastructure contract from the ChatGPT developer. The companies are building a network of U.S. data centers that will have 4.5 gigawatts of computing capacity. One gigawatt is enough to power about 750,000 homes. The Oracle-built facilities are headlined by a campus in Abilene, Texas that started coming online earlier this year. According to the company, the campus will contain 450,000 Nvidia GPUs at full capacity. OpenAI also continues to use infrastructure from Microsoft, its former exclusive cloud provider. The ChatGPT developer agreed to purchase $250 billion worth of Azure compute capacity as part of its recent reorganization. OpenAI also has infrastructure deals with CoreWeave Inc. and Google LLC. OpenAI's best-funded startup rival, Anthropic PBC, likewise uses AWS infrastructure to power its AI models. Last week, the Amazon.com Inc. unit opened an $11 billion data center campus dedicated to running Anthropic's training and inference workloads. The campus contains about 500,000 of AWS' custom Trainium2 chips, a number that is expected to double by year's end.
[23]
OpenAI Signs 38 Billion USD Nvidia GPU Cloud Deal with Amazon Web Services
OpenAI is teaming up with Amazon Web Services (AWS) in a massive 38 billion USD deal to expand its computing power for AI training and model deployment. Instead of building new data centers from scratch, OpenAI will rent space in AWS facilities and use Nvidia GPUs to handle large-scale AI workloads. The plan is to partly run ChatGPT and train future models directly on AWS infrastructure. AWS will supply its advanced EC2 UltraServers for this project. These systems link together several compute clusters to cut down on latency and keep memory usage efficient, which is essential when training or running massive AI models. OpenAI isn't using Amazon's in-house Trainium2 chips here; instead, it's going with Nvidia's newer GB200 and GB300 hardware. The GB300 setup is especially powerful -- it can run up to 72 Blackwell GPUs, offering about 360 petaflops of performance and 13.4 terabytes of HBM3e memory. Neither company gave exact details about how many GPU clusters will be deployed, but OpenAI mentioned "hundreds of thousands of GPUs" with the option to scale up to "tens of millions of CPUs." That scale will give OpenAI a major performance boost, making it easier to train bigger, more complex models and to keep ChatGPT running smoothly for millions of users worldwide. The rollout is expected to wrap up by the end of 2026, though both companies said the partnership could continue into 2027. AWS will manage the physical hardware, while OpenAI runs its AI workloads through AWS's EC2 system. This setup gives OpenAI a lot of flexibility without having to build or maintain its own global data center network. For AWS, this partnership strengthens its position in the competitive AI cloud market, going head-to-head with Microsoft Azure and Google Cloud, both of which also rely on Nvidia GPUs for AI workloads. For OpenAI, it secures long-term access to high-end GPU resources -- a crucial advantage given the ongoing global shortage of AI-grade hardware. Source: OpenAI
[24]
OpenAI inks $38 billion AWS contract after cutting Microsoft approval ties
OpenAI will immediately begin using Amazon Web Services compute capacity under the pact. OpenAI announced a $38 billion agreement with Amazon on Monday to purchase cloud computing services over seven years, enabling the company to scale its AI infrastructure following a recent restructuring that removed the need for Microsoft's approval on such deals. The deal allows OpenAI to begin utilizing Amazon Web Services compute resources immediately. All purchased capacity under the agreement is scheduled for deployment before the end of 2026. OpenAI has the option to extend its usage of AWS into 2027 and subsequent years, supporting ongoing expansion of its computational capabilities. This partnership with Amazon forms a key component of OpenAI's overarching strategy to significantly increase its computing power. The company anticipates spending more than $1 trillion on such infrastructure over the next decade to meet the demands of developing advanced AI systems, including agentic workloads that power tools like ChatGPT. In parallel efforts, OpenAI has initiated new data center construction projects with multiple partners, including Oracle, SoftBank, and the United Arab Emirates, along with additional collaborators. These buildouts aim to bolster the physical infrastructure necessary for large-scale AI operations. OpenAI has also finalized agreements with leading chip manufacturers Nvidia, AMD, and Broadcom to secure essential hardware components. These deals ensure access to high-performance processors critical for training and running complex AI models.
[25]
Amazon secures $38B deal to provide OpenAI access to NVIDIA GB200, GB300 AI servers
TL;DR: Amazon Web Services (AWS) and OpenAI have formed a multi-year, $38 billion partnership, providing OpenAI access to NVIDIA GB200 and GB300 AI servers hosted on AWS EC2 UltraServers. This collaboration supports OpenAI's goal to expand AI infrastructure, enhancing large-scale model training and advanced AI deployment by 2026. Amazon and OpenAI have just announced a new partnership, where Amazon Web Services (AWS) will be one of the primary compute providers to the AI startup, hosting NVIDIA's GB200 and GB300 AI servers. The multi-year partnership between AWS and OpenAI will see OpenAI getting access to a pool of Amazon's NVIDIA GB200 and GB300 AI servers, in a deal worth around $38 billion, and will span 7 years. OpenAI gets access to NVIDIA GB200 and GB300 AI servers, with all planned capacity expected to be deployed for use by the end of 2026. This means that OpenAI will have even more access to the best NVIDIA AI GPU hardware by next year, helping it continue to expand its AI operations and services. OpenAI has been signing new deals left, right, and center, including with US tech giants NVIDIA, AMD, Microsoft, Broadcom, and Oracle, and now adding Amazon to that growing list. These new AI servers that OpenAI will have access to are set up across Amazon EC2 UltraServers, which are linked together by a fast, high-capacity network that was built to handle both inference and large-scale model training. AWS is currently hosting massive clusters with over 500,000 chips, one of the key factors in OpenAI's decision. OpenAI CEO Sam Altman has said that the ChatGPT creator plans to invest $1.4 trillion -- yes, with a "T" -- to build 30 gigawatts of computing capacity, which works out to the same power consumption as around 25 million US households. OpenAI has so far inked deals with NVIDIA for 10 GW, AMD for 6 GW, and Oracle for 4.5 GW of compute capacity, as it builds toward its goal of a 30 GW AI infrastructure network. OpenAI co-founder and CEO Sam Altman said: "Scaling frontier AI requires massive, reliable compute. Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone". Matt Garman, CEO of AWS, added: "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions. The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI's vast AI workloads".
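The gigawatt figures quoted above lend themselves to a quick sanity check. The sketch below simply totals the per-vendor gigawatt commitments named in the article against the stated 30 GW target and derives the implied cost per gigawatt and homes-per-gigawatt equivalences from the $1.4 trillion and 25-million-household figures; the AWS deal itself is not quoted in gigawatts, so it is not in this ledger.

```python
# Arithmetic check using only the figures cited in the article above.
gw_deals = {"NVIDIA": 10.0, "AMD": 6.0, "Oracle": 4.5}  # gigawatts per reported deal
target_gw = 30.0                                         # stated buildout goal
planned_spend_usd = 1.4e12                               # Altman's $1.4 trillion figure
homes_for_target = 25e6                                  # ~25 million US households for 30 GW

committed_gw = sum(gw_deals.values())
print(f"Committed so far: {committed_gw} GW of {target_gw} GW "
      f"({committed_gw / target_gw:.0%} of the target)")          # 20.5 GW, ~68%

print(f"Implied capital cost: ~${planned_spend_usd / target_gw / 1e9:.0f}B per GW")  # ~$47B
print(f"Implied equivalence: ~{homes_for_target / target_gw:,.0f} homes per GW")     # ~833,333
```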
[26]
OpenAI and Amazon sign $38B deal for AI computing power
SEATTLE (AP) -- OpenAI and Amazon have signed a $38 billion deal that enables the ChatGPT maker to run its artificial intelligence systems on Amazon's cloud computing services. OpenAI will get access to "hundreds of thousands" of Nvidia's specialized AI chips through Amazon Web Services as part of the deal announced Monday. The agreement comes less than a week after OpenAI altered its partnership with its longtime backer Microsoft, which is no longer the startup's exclusive cloud provider. California and Delaware regulators also last week allowed OpenAI, which was founded as a nonprofit, to move forward on its plan to form a new business structure to more easily raise capital and make a profit. "The rapid advancement of AI technology has created unprecedented demand for computing power," Amazon said in a statement Monday. It said OpenAI "will immediately start utilizing AWS compute as part of this partnership, with all capacity targeted to be deployed before the end of 2026, and the ability to expand further into 2027 and beyond."
[27]
OpenAI signs A$58b (US$38b) cloud deal with Amazon
San Francisco | After signing deals to buy hundreds of billions of dollars in computing power from chipmakers Nvidia and AMD and cloud computing company Oracle, OpenAI has reached a multibillion-dollar agreement with Amazon. OpenAI said Monday that it had agreed to purchase $US38 billion (A$58 billion) in cloud computing services from Amazon over the next seven years. The additional computing power will help OpenAI build and deploy its artificial intelligence technologies, including the ChatGPT chatbot.
[28]
OpenAI and Amazon Sign $38B Deal for AI Computing Power
SEATTLE (AP) -- OpenAI and Amazon have signed a $38 billion deal that enables the ChatGPT maker to run its artificial intelligence systems on Amazon's cloud computing services. OpenAI will get access to "hundreds of thousands" of Nvidia's specialized AI chips through Amazon Web Services as part of the deal announced Monday. The agreement comes less than a week after OpenAI altered its partnership with its longtime backer Microsoft, which is no longer the startup's exclusive cloud provider. California and Delaware regulators also last week allowed OpenAI, which was founded as a nonprofit, to move forward on its plan to form a new business structure to more easily raise capital and make a profit. "The rapid advancement of AI technology has created unprecedented demand for computing power," Amazon said in a statement Monday. It said OpenAI "will immediately start utilizing AWS compute as part of this partnership, with all capacity targeted to be deployed before the end of 2026, and the ability to expand further into 2027 and beyond."
[29]
OpenAI, Amazon sign $38 billion computing deal
OpenAI has struck a $38 billion agreement with Amazon Web Services (AWS) to secure cloud computing power from the tech giant, the ChatGPT maker announced Monday. The multiyear deal will give the AI firm access to "hundreds of thousands of state-of-the-art" Nvidia chips, as the company seeks vast computing power to continue developing its AI models. "Scaling frontier AI requires massive, reliable compute," OpenAI CEO Sam Altman said in a statement. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." OpenAI is set to begin using AWS computing power immediately, with the Amazon cloud computing arm aiming to have all of the targeted capacity deployed before the end of 2026, it noted in a blog post. "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions," AWS CEO Matt Garman said in a statement. The AWS deal is the latest in a series of agreements for OpenAI, which has also recently unveiled partnerships with companies like Nvidia, AMD and Broadcom. The ChatGPT maker announced last month that it was partnering with Nvidia to deploy 10 gigawatts, or about 4 million to 5 million chips. Several weeks later, it signed a deal with AMD to obtain 6 gigawatts' worth of the company's AI chips. This was followed by an agreement with Broadcom to deploy 10 gigawatts of custom AI accelerators for OpenAI. It has also reportedly signed a $300 billion deal with Oracle for 4.5 gigawatts of power. The company's recent dealmaking spree has raised questions about its revenue stream as it seeks to make good on about $1 trillion in commitments, according to the Financial Times. Amid a push for more resources, OpenAI completed a restructuring of the company last week, converting its for-profit arm into a public benefit corporation and giving the nonprofit a controlling stake. It initially sought to remove the nonprofit's control over the for-profit but backed away after facing pushback.
[30]
Does OpenAI's $38 Billion Deal With Amazon Signal a Breakup With Microsoft?
On Monday, tech giants OpenAI and Amazon Web Services (AWS) announced a multi-year partnership, marking the artificial intelligence company's next step towards massive scaling and a step away from its long-time partner Microsoft. Effective immediately, OpenAI now has access to Amazon's infrastructure as part of a seven-year, $38 billion deal. The agreement provides the AI company with access to hundreds of thousands of NVIDIA graphics processing units as it begins running its workload on AWS's infrastructure. "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions," said Matt Garman, CEO of AWS, in a press release. At the time of publishing, Amazon's stock had jumped 5 percent following news of the announcement. In addition to paving the path towards rapid expansion and growth of its ChatGPT large language model and other AI initiatives, the deal also marks the AI company moving on from Microsoft, its longtime cloud services provider. However, they're still presenting a united front. "As we step into this next chapter of our partnership, both companies are better positioned than ever to continue building great products that meet real-world needs, and create new opportunity for everyone and every business," the companies said in a joint press release on October 28. The companies are weathering a complicated relationship, with Microsoft investing up to $13 billion in OpenAI since an initial $1 billion in 2019 that came with an exclusivity agreement to use Microsoft cloud services. Last week, both tech companies renegotiated an agreement which allowed OpenAI to buy cloud services from any provider. "Scaling frontier AI requires massive, reliable compute," said OpenAI co-founder and CEO Sam Altman in a press release. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." The multibillion dollar deal follows a series of OpenAI investments to boost the company's scalability and computing power, including a data center deal with Oracle, a cloud deal with Google, and data center projects in the United Arab Emirates.
[31]
OpenAI Turns to Amazon in $38 Billion Cloud Services Deal After Restructuring
* OpenAI and Amazon have inked a seven-year deal * The move comes shortly after the AI firm was restructured * OpenAI will begin using Amazon Web Services immediately OpenAI has signed a seven-year, $38 billion deal to buy cloud services from Amazon.com, in its first big push to power its AI ambitions after a restructuring last week that gave the ChatGPT maker greater operational and financial freedom. The agreement, announced on Monday, will give OpenAI access to hundreds of thousands of Nvidia graphics processors to train and run its artificial intelligence models. The deal underscores the AI industry's insatiable appetite for computing power as companies race to build systems that can rival or surpass human intelligence. OpenAI CEO Sam Altman has said the startup is committed to spending $1.4 trillion to develop 30 gigawatts of computing resources - enough to roughly power 25 million U.S. homes. The deal is also a major vote of confidence for the e-commerce giant's cloud unit, Amazon Web Services, which some investors feared had fallen behind rivals Microsoft and Google in the artificial intelligence race. Those fears were somewhat eased by the strong growth the business reported in the September quarter. Amazon shares hit an all-time high on Monday, with the company set to add nearly $140 billion to its market value. The stock was last up 5%, following a near-10% jump on Friday. Microsoft shares had briefly dipped on the news. "This is a hugely significant deal (and is) clearly a strong endorsement of AWS compute capabilities to deliver the scale needed to support OpenAI," said PP Foresight analyst Paolo Pescatore. Reliable Compute "Scaling frontier AI requires massive, reliable compute," said Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." OpenAI will begin using Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. Amazon plans to roll out hundreds of thousands of chips, including Nvidia's GB200 and GB300 AI accelerators, in data clusters built to power ChatGPT's responses and train OpenAI's next wave of models, the companies said. Amazon already offers OpenAI open weight models on Amazon Bedrock, which hosts multiple AI models for businesses using AWS. Altman has said that eventually, he would like OpenAI to add 1 gigawatt of compute every week - an astronomical sum as each gigawatt currently comes with a capital cost of over $40 billion. OpenAI's sweeping restructuring last week moved it further away from its non-profit roots and removed Microsoft's right of first refusal to supply compute services in the new arrangement. Reuters has reported that OpenAI was laying the groundwork for an initial public offering that could value the company at up to $1 trillion. But surging valuations of AI companies and their massive spending commitments, which total more than $1 trillion for OpenAI, have raised fears that the AI boom is turning into a bubble. While OpenAI's relationship with Microsoft, which the two forged in 2019, has helped push Microsoft to the top spot among its Big Tech peers in the AI race, both companies have been trying to reduce reliance on each other. OpenAI has already tapped Alphabet's Google to supply it with cloud services, as Reuters first reported in June. It also reportedly struck a deal with Oracle to buy $300 billion in computing power for about five years.
The hefty commitments from OpenAI have raised some eyebrows on Wall Street, with analysts and investors questioning how the loss-making company will fund all those deals. Last week, OpenAI also agreed to purchase $250 billion of Microsoft's Azure cloud services, as part of the restructuring deal. While OpenAI's annualized revenue run rate is expected to reach about $20 billion by year's end, losses in the company are also mounting, sources have told Reuters.
[32]
Amazon's Stock Climbs to Fresh High on $38B Cloud Computing Deal With OpenAI
The agreement allows OpenAI to use Amazon's bank of hundreds of thousands of Nvidia chips. Amazon shares rose to a record high Monday after the tech giant said it struck a $38 billion deal with OpenAI. Shares of Amazon (AMZN) were up close to 5% around $255 in recent trading, leaving them on track to close at a record high. Amazon said the agreement, which begins immediately and runs over seven years, gives OpenAI access to Amazon Web Services' bank of hundreds of thousands of Nvidia (NVDA) chips, with the option to expand. The company said that "AWS's leadership in cloud infrastructure combined with OpenAI's pioneering advancements in generative AI will help millions of users continue to get value from ChatGPT," OpenAI's popular AI chatbot. OpenAI CEO Sam Altman said the partnership "strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." AWS CEO Matt Garman said that OpenAI's use of AWS "will serve as a backbone for their AI ambitions." With Monday's surge, Amazon shares have added close to 17% in 2025, with most of those gains coming in the past week since the tech giant posted better-than-expected earnings on strong growth in its cloud business.
[33]
Amazon's $38 billion OpenAI deal shows it is no longer an AI laggard
Amazon's $38 billion cloud deal with OpenAI marks a major endorsement for the e-commerce giant's cloud business after recent setbacks, including ceding market share to rivals and an outage that disrupted large parts of the internet. After years of leading the cloud computing industry with its highly profitable Amazon Web Services business, Amazon has watched Microsoft and Alphabet's Google snatch big-ticket contracts with their AI-steeped clouds. Its lead in the cloud market slipped to 29% as of September, from 34% a few months before ChatGPT was launched in 2022, according to data from Synergy Research Group. Amazon was considered a laggard in the AI race by many investors because it was late to launch a flagship large language model and failed to offer a consumer-facing chatbot like OpenAI's ChatGPT. Recently, though, the company has ramped up spending on its AI efforts, and last month opened an $11 billion AI data center in Indiana called Project Rainier, where startup Anthropic's models are being trained using Amazon's own Trainium chips. Monday's deal with OpenAI, a marquee customer, coupled with strong quarterly results last week, suggests AWS is regaining momentum, analysts and investors said. "While it is small relative to other deals OpenAI has made with other cloud providers, it represents a key first step in Amazon's effort to partner with a company that is spending over a trillion dollars on computing power in the coming years," said Mamta Valechha, analyst at Quilter Cheviot. Amazon's stock rose 5% after the deal to a record high after it traded little changed for most of the year, lagging the gains seen in other Big Tech stocks that have surged on cloud-computing deals worth hundreds of billions of dollars with AI startups. Microsoft last week disclosed a $250 billion OpenAI commitment for its Azure cloud services under a new arrangement that allowed OpenAI to restructure itself, while Oracle has signed a $300 billion deal with the startup. Google has a chip agreement worth tens of billions with Anthropic among other AI tie-ups. Higher AI spending to drive gains Amazon's efforts have in part been hampered by executive losses. A key vice president helping oversee generative AI development left for another company, Reuters reported in June. To stay competitive and fund the costly data centers needed to support the technology, CEO Andy Jassy has tried to cut through management layers and even installed an anonymous complaint line for identifying inefficiencies. The company said last week it would reduce its corporate workforce by about 14,000 in one of its biggest layoffs. It is also spending more on AI, with its capital expenditure expected to total around $125 billion this year and more the year after. That is more than Alphabet's planned outlay of up to $93 billion, and roughly in line with what Wall Street expects Microsoft to spend this year. Analysts said the OpenAI deal offers a credible path for Amazon to recoup its spending. Brian Pitz, an analyst at BMO Capital Markets, estimates that this may boost AWS's backlog by about 20% in the fourth quarter ending December, from $200 billion as of September end.
"It clearly seems like they (Amazon) are finally in the flow of what is happening with these large language models versus before," said William Lee, an investor at SuRo Capital that holds equity in OpenAI.
[34]
AWS' $38 Billion OpenAI Deal 'Exposes How Dependent AI' Is On Centralized Infrastructure, AI Startup CEO Says
"The industry's biggest players are now effectively trading power to control intelligence the way others trade energy -- concentrating enormous financial and operational power in a few providers," said Eric Yang, CEO of AI startup Gradient. The world's largest cloud company signed a massive $38 billion deal with AI superstar OpenAI this week that sent shockwaves through the tech industry due to the eye-popping deal size. OpenAI has immediately started running large amounts of workloads on AWS' cloud infrastructure with huge capacity expansions planned over the next several years. Gradient CEO Eric Yang said the scale of the new AWS and OpenAI cloud deal shows how quickly AI has become a capital market in its own right. "The industry's biggest players are now effectively trading power to control intelligence the way others trade energy -- concentrating enormous financial and operational power in a few providers," said Yang, CEO and President of AI startup Gradient. [Related: Amazon CEO Explains Layoffs, Buying 'A Lot Of Nvidia,' AI Chip Innovation And Doubling AWS Capacity] "That model can deliver short-term efficiency but also exposes how dependent AI has become on centralized infrastructure," Yang said in an email to CRN. With the $38 billion deal, AWS will provide OpenAI with the computing power it needs for its popular ChatGPT AI and help scale AI agents faster. OpenAI will access AWS' compute resources that are comprised of hundreds of thousands of Nvidia GPUs, with the ability to expand to tens of millions of CPUs to rapidly scale agentic AI workloads. Yang said having the majority of AI intelligence and workloads concentrated on centralized infrastructure, like the public cloud, is concerning. "The next challenge is ensuring that intelligence itself doesn't stay trapped there," the CEO said. "The real opportunity lies in building systems that can operate independently of those central clouds -- where people and organizations can run and evolve AI on their own terms," he said. Founded in 2024, Yang's AI startup Gradient is developing ways to build and run AI without needing to plug into the cloud infrastructure market led by AWS, Microsoft and Google Cloud. "That's the foundation of a more sovereign approach to intelligence, one that grows from the edge rather than the center," Yang said. This week Gradient unveiled it has open-sourced Parallax, the company's new operating system for AI that lets people and teams build, host, and access AI applications on their own devices and networks. "Parallax is the easiest way to host your own agents and applications privately, efficiently, and without relying on the cloud," said Yang. The infrastructure deployment that AWS is building for OpenAI features a sophisticated architectural design optimized for maximum AI processing efficiency and performance. AWS said clustering Nvidia GPUs -- both its GB200s and GB300s -- via Amazon EC2 UltraServers on the same network enables low-latency performance across interconnected systems, allowing OpenAI to efficiently run workloads with optimal performance. These clusters are designed to support various workloads, from serving inference for ChatGPT to training next-generation models. "With this new $38B agreement, OpenAI will immediately start using our world-class infrastructure -- including Amazon EC2 UltraServers packed with hundreds of thousands of state-of-the-art Nvidia GPUs and the ability to scale to tens of millions of CPUs," said AWS CEO Matt Garman in a LinkedIn post this week. 
OpenAI CEO and co-founder Sam Altman said his company needs massive amounts of compute capability to scale AI to the masses. "Scaling frontier AI requires massive, reliable compute," said Altman in a statement. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." AWS recently reported total sales of $33 billion during its third quarter of 2025, up 20 percent year over year compared to Q3 2024. The company now has a $132 billion annual run rate.
[35]
Amazon Enters a Mega-Deal With OpenAI to Provide NVIDIA's GB200 & GB300 AI Servers in a 'Surprising' Seven-Year Agreement
OpenAI has once again entered into a partnership with a major player in Big Tech, and this time it is none other than Amazon, as AWS will now become one of the primary compute providers. OpenAI-Amazon Deal Will Span Over Seven Years With a Partnership Worth $38 Billion; No Mention of Trainium ASICs Well, Sam Altman's OpenAI has managed to secure several deals in the past few weeks, involving almost everyone in the AI frenzy, and these deals are targeted at gaining access to additional compute power. Now, in an announcement by Amazon, it is disclosed that AWS and OpenAI have signed a 'multi-year' partnership, where OpenAI will get access to a pool of Amazon's NVIDIA-based AI servers, and the collaboration is expected to be worth $38 billion, spanning seven years, which is simply amazing. AWS has unusual experience running large-scale AI infrastructure securely, reliably, and at scale, with clusters topping 500K chips. AWS's leadership in cloud infrastructure combined with OpenAI's pioneering advancements in generative AI will help millions of users continue to get value from ChatGPT. The deal is expected to provide OpenAI with access to NVIDIA's GB200 and GB300 AI servers. More importantly, all planned capacity is expected to be deployed for use by the end of 2026, which means that OpenAI will have access to a vast pool of AI chips by next year, fueling the company's ambitions to go a step beyond generative AI. Interestingly, the deal doesn't mention the use of Amazon's Trainium AI chips, which could've been a possibility, but it seems that the attention is focused on Team Green's technology stack. Within the past few weeks, OpenAI has entered into deals with NVIDIA, AMD, Microsoft, Broadcom, and Oracle, which means the AI giant is expected to have access to a significant portion of the available compute resources. With eyes on a potential IPO at a valuation of over $1 trillion, it appears that the partnerships are laying the groundwork for OpenAI to enter mainstream markets. On the other hand, entering into deals with OpenAI is viewed as a privilege. OpenAI has evolved into a leading entity in the AI race, and it shows no signs of stopping.
[36]
Amazon shares hit all-time high after OpenAI deal worth $38 billion for cloud services
OpenAI has signed a seven-year, $38 billion deal to buy cloud services from Amazon.com, in its first big push to power its AI ambitions after a restructuring last week that gave the ChatGPT maker greater operational and financial freedom. The agreement, announced on Monday, will give OpenAI access to hundreds of thousands of Nvidia graphics processors to train and run its artificial intelligence models. It underscores the AI industry's insatiable appetite for computing power as companies race to build systems that can rival or surpass human intelligence. OpenAI CEO Sam Altman has said the startup is committed to spending $1.4 trillion to develop 30 gigawatts of computing resources - enough to roughly power 25 million U.S. homes. The deal is also a major vote of confidence for the e-commerce giant's cloud unit, Amazon Web Services, which some investors feared had fallen behind rivals Microsoft and Google in the artificial intelligence race. Those fears were somewhat eased by the strong growth the business reported in the September quarter. Amazon shares hit an all-time high on Monday, with the company set to add nearly $140 billion to its market value. Microsoft shares had briefly dipped on the news. RELIABLE COMPUTE "Scaling frontier AI requires massive, reliable compute," said OpenAI co-founder and CEO Sam Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." OpenAI will begin using Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. Amazon plans to roll out hundreds of thousands of chips, including Nvidia's GB200 and GB300 AI accelerators, in data clusters built to power ChatGPT's responses and train OpenAI's next wave of models, the companies said. Amazon already offers OpenAI models on Amazon Bedrock, which hosts multiple AI models for businesses using AWS. OpenAI's sweeping restructuring last week moved it further away from its non-profit roots and also removed Microsoft's right of first refusal to supply compute services in the new arrangement. Reuters has reported that OpenAI was laying the groundwork for an initial public offering that could value the company at up to $1 trillion. But surging valuations of AI companies and their massive spending commitments, which total more than $1 trillion for OpenAI, have raised fears that the AI boom is turning into a bubble. While OpenAI's relationship with Microsoft, which the two forged in 2019, has helped push Microsoft to the top spot among its Big Tech peers in the AI race, both companies have been making moves to reduce reliance on each other. OpenAI has already tapped Alphabet's Google to supply it with cloud services, as Reuters first reported in June. It also reportedly struck a deal with Oracle to buy $300 billion in computing power for about five years.
[37]
Amazon, OpenAI ink massive $38B deal to supply 'hundreds of thousands' of Nvidia GPUs (AMZN:NASDAQ)
OpenAI (OPENAI) announced on Monday that it has signed a deal with Amazon (AMZN) Web Services worth $38B to help supply it with "hundreds of thousands" of Nvidia's (NVDA) GPUs. Amazon shares rose 5.4% in premarket trading, while Nvidia shares also rose. Amazon's AWS will benefit from a $38B deal supplying compute resources and infrastructure to OpenAI. OpenAI selected AWS after amending its deal with Microsoft, signaling a diversification in its cloud compute ecosystem and access to large-scale infrastructure. AWS's ability to provide large-scale, reliable computing will support OpenAI's efforts to scale agentic AI workloads and deliver advanced AI services to millions of users.
[38]
OpenAI Strikes $38 Billion Cloud Deal with Amazon to Boost AI Development | PYMNTS.com
Under the agreement, OpenAI is accessing AWS compute comprising hundreds of thousands of Nvidia GPUs and can expand to tens of millions of CPUs to scale agentic workloads, the companies said in a Monday (Nov. 3) press release. OpenAI will start using AWS compute immediately, plans to deploy all the capacity included in this agreement before the end of 2026, and will be able to expand further in 2027 and beyond, according to the release. "Scaling frontier AI requires massive, reliable compute," OpenAI Co-founder and CEO Sam Altman said in the release. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." AWS CEO Matt Garman said in the release that AWS infrastructure will serve as a backbone for OpenAI's AI ambitions as that company "continues to push the boundaries of what's possible." "The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI's vast AI workloads," Garman said. In a collaboration launched earlier, the companies said in August that OpenAI open weight foundation models had become available on Amazon Bedrock. PYMNTS reported on Aug. 6 that this collaboration marked the first time OpenAI's AI models became available on a cloud computing platform outside of Microsoft, its largest investor to date. Up to that point, OpenAI's models had only been available in the cloud on Microsoft Azure. According to the Monday press release, this made those models available to millions of customers on AWS, and OpenAI "has quickly become one of the most popular publicly available model providers in Amazon Bedrock with thousands of customers [...] working with their models for agentic workflows, coding, scientific analysis, mathematical problem-solving and more." When OpenAI announced on Tuesday (Oct. 28) that it completed a restructuring that transformed its for-profit arm into a public benefit corporation, the company said that it is purchasing another $250 billion worth of Microsoft's Azure cloud computing services but that Microsoft will no longer have a right of first refusal to be OpenAI's compute partner. Microsoft has a 27% stake worth about $135 billion in the public benefit corporation.
[39]
AWS CEO On $38B OpenAI Deal, ChatGPT, Nvidia GPUs And 'Powerful Reminder' Of AWS' 'Trust'
"With this new $38B agreement, OpenAI will immediately start using our world-class infrastructure -- including Amazon EC2 UltraServers packed with hundreds of thousands of state-of-the-art Nvidia GPUs and the ability to scale to tens of millions of CPUs," said AWS CEO Matt Garman today. Amazon Web Services and OpenAI have signed a blockbuster deal for the AI superstar to buy a whopping $38 billion worth of cloud capacity from the world's largest cloud company. OpenAI will immediately start running a large number of workloads on AWS infrastructure with huge capacity expansions planned over the next seven years. AWS CEO Matt Garman said the multi-year partnership and deal between AWS and OpenAI will "fuel the next wave of AI innovation." "With this new $38B agreement, OpenAI will immediately start using our world-class infrastructure -- including Amazon EC2 UltraServers packed with hundreds of thousands of state-of-the-art Nvidia GPUs and the ability to scale to tens of millions of CPUs," said Garman in a LinkedIn post Monday. [Related: Amazon Layoffs Hit Managers, Software Engineers, Scientists And Recruiters] Garman said the deal will provide OpenAI with the computing power it needs for its popular ChatGPT AI offering and to scale AI agents faster. "This combination of cutting-edge hardware will give them the compute they need, from serving inference for ChatGPT to training next-generation models to scaling agentic AI workloads," Garman said. AWS' cloud infrastructure, combined with OpenAI's advancements in generative AI, will help millions of users continue to get value from ChatGPT, the AWS CEO said. AWS will provide the capacity OpenAI needs with deployment expected before the end of 2026, with the ability "to expand further into 2027 and beyond," Garman said. He said the deal is "a powerful reminder of why organizations trust AWS when they need serious scale, security, and performance for their most demanding workloads." AWS And OpenAI's Deal Details AWS is the world's largest cloud computing company and currently has a $132 billion annual run rate. Under the new $38 billion deal, OpenAI will access AWS compute resources that are comprised of hundreds of thousands of Nvidia GPUs, with the ability to expand to tens of millions of CPUs to rapidly scale agentic AI workloads. The infrastructure deployment that AWS is building for OpenAI features a sophisticated architectural design optimized for maximum AI processing efficiency and performance. AWS said clustering Nvidia GPUs -- both its GB200s and GB300s -- via Amazon EC2 UltraServers on the same network enables low-latency performance across interconnected systems, allowing OpenAI to efficiently run workloads with optimal performance. These clusters are designed to support various workloads, from serving inference for ChatGPT to training next-generation models. OpenAI CEO Sam Altman OpenAI CEO and co-founder Sam Altman said his company needs massive amounts of compute capability to scale AI to the masses. "Scaling frontier AI requires massive, reliable compute," said Altman in a statement. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." This isn't AWS' first partnership with OpenAI. Earlier this year, OpenAI's open weight foundation models became available on Amazon Bedrock. 
The startup has quickly become one of the most popular AI model providers in Amazon Bedrock with thousands of customers working with their models for agentic workflows, coding, scientific analysis and mathematical problem-solving, according to AWS' CEO. "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions," said Garman. "The breadth and immediate availability of optimized compute demonstrate why AWS is uniquely positioned to support OpenAI's vast AI workloads." AWS Third Quarter Earnings Results The OpenAI news comes on the heels of AWS third quarter financial earnings results. The Seattle-based cloud company generated $33 billion in total sales during the quarter, up 20 percent year over year compared to Q3 2024. AWS' operating income increased to $11.4 billion in the third quarter 2025, up 10 percent compared with $10.4 billion year over year.
[40]
AWS and OpenAI sign $38 billion multi-year strategic AI partnership
Amazon Web Services (AWS) and OpenAI have announced a multi-year, strategic partnership under which AWS will provide its infrastructure to support and scale OpenAI's core artificial intelligence (AI) workloads. This agreement, valued at $38 billion with planned growth over the next seven years, grants OpenAI immediate access to AWS compute resources, initially comprising hundreds of thousands of state-of-the-art NVIDIA GPUs. This capacity includes the ability to expand to tens of millions of CPUs to rapidly scale agentic workloads. AWS is providing large-scale AI infrastructure, leveraging its experience in securely and reliably operating clusters of up to 500,000 chips. This collaboration aims to combine AWS's leadership in cloud infrastructure with OpenAI's advancements in generative AI, supporting the continued value delivered to millions of users through products like ChatGPT. The rapid evolution of AI technology has created significant demand for powerful computing resources. Frontier model providers, in their pursuit of more advanced models, are increasingly utilizing AWS for its performance, scale, and security. OpenAI will begin using AWS compute capacity immediately as part of this partnership. The goal is to fully deploy this capacity before the end of 2026, with flexibility for further expansion into 2027 and beyond. The infrastructure AWS is building for OpenAI features a sophisticated architectural design focused on maximizing AI processing efficiency. This design includes clustering NVIDIA GPUs (both GB200s and GB300s) via Amazon EC2 UltraServers on a unified network. This setup is intended to ensure low-latency performance across interconnected systems, enabling OpenAI to run diverse workloads with optimal efficiency. The clusters are designed to support various operations, from serving inference for ChatGPT to training next-generation models, offering the flexibility required for OpenAI's evolving needs. This announcement builds upon existing collaboration between the two companies aimed at delivering cutting-edge AI technology to global organizations. Earlier this year, OpenAI's open-weight foundation models were made available on Amazon Bedrock, offering additional model options to AWS customers.
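Since the Bedrock collaboration noted above is the part of this arrangement that outside developers can already touch, a minimal sketch of what calling one of OpenAI's open-weight models through Bedrock's Converse API might look like is included below. This is illustrative only and not taken from the announcement: the model identifier and region shown are assumptions to verify against the current Amazon Bedrock model catalog, and the call requires an AWS account with Bedrock model access enabled.

```python
# Illustrative sketch: invoking an OpenAI open-weight model on Amazon Bedrock
# via the Converse API. The modelId and region below are assumptions; check the
# Bedrock model catalog in your account for the identifiers actually available.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")  # assumed region

response = bedrock.converse(
    modelId="openai.gpt-oss-120b-1:0",  # hypothetical/illustrative model identifier
    messages=[
        {"role": "user", "content": [{"text": "Summarize the AWS-OpenAI deal in one sentence."}]}
    ],
    inferenceConfig={"maxTokens": 200, "temperature": 0.3},
)

# The Converse API returns the assistant message under output.message.content
print(response["output"]["message"]["content"][0]["text"])
```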
[41]
Amazon share price: AMZN stocks are trading high after ChatGPT maker OpenAI uses AWS
Amazon stocks: OpenAI will begin using Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. Amazon share price on Wednesday jumped close to five per cent. AMZN stocks were trading at $256.41. The boost comes after OpenAI and Amazon inked a $38 billion agreement for the ChatGPT maker to use AWS. Amazon.com will supply OpenAI with cloud computing services under a multi-year $38 billion deal, giving the ChatGPT maker access to hundreds of thousands of Nvidia graphics processors to train and run its artificial intelligence models. The agreement announced on Monday underscores the AI industry's insatiable appetite for computing power, driven by the pursuit of technology capable of matching or surpassing human intelligence. It sent Amazon shares up 5 per cent in premarket trading. Sam Altman, CEO of OpenAI, said the company is committed to spending $1.4 trillion to develop 30 gigawatts of computing resources. OpenAI will begin using Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. OpenAI turning to Amazon - the larger rival of Microsoft and its biggest backer - signifies a pivotal point in the AI juggernaut's relationship with the Windows maker. It also comes as a big vote of confidence for AWS, following strong quarterly growth reported by the segment last week. The deal is among the first major moves by OpenAI since it completed a restructuring last week that frees the ChatGPT maker to move away from its nonprofit roots. Reuters has reported it was laying the groundwork for an initial public offering that could value the company at up to $1 trillion. But surging valuations of AI companies and their massive spending commitments, which total more than $1 trillion for OpenAI, have raised fears that the AI boom is inflating into a bubble. While OpenAI's relationship with Microsoft, which the two forged in 2019, has helped push Microsoft to the top spot among its Big Tech peers in the AI race, both companies have been making moves to reduce reliance on each other. OpenAI has already tapped Alphabet's Google to supply it with cloud services, as Reuters first reported in June. It also reportedly struck a deal with Oracle to buy $300 billion in computing power for about five years.
[42]
OpenAI Signs $38B Cloud Deal With AWS to Scale AI Workloads
AI sector giant OpenAI has signed a seven-year, $38 billion agreement with Amazon Web Services (AWS), granting Sam Altman's company access to AWS' cloud computing infrastructure, a press release revealed. The deal gives OpenAI immediate access to AWS's data centers and the Nvidia chips inside them to run and scale its core artificial intelligence (AI) workloads. The Altman-led company claims that its pioneering advancements in AI, in tandem with AWS's leadership in cloud infrastructure, will help millions of people derive greater "value from ChatGPT." "Scaling frontier AI requires massive, reliable compute," said Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone," he added. Importantly, OpenAI's agreement with AWS comes in the wake of a major restructuring move last week that saw Altman and his company shift away from the non-profit model, and even opened the door for a potential initial public offering (IPO) in the near future. According to OpenAI's press release, the company will gain access to "hundreds of thousands" of Nvidia GPUs (graphics processing units), with the ability to expand to "tens of millions" of CPUs (central processing units) to scale its agentic AI workloads. "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions," said Matt Garman, Chief Executive Officer (CEO) of AWS. "The breadth and immediate availability of optimized compute demonstrate why AWS is uniquely positioned to support OpenAI's vast AI workloads," Garman added. OpenAI plans to deploy all of the AWS compute capacity covered by the agreement before the end of next year and "expand further into 2027 and beyond." Notably, Altman's company revealed that AWS' GPUs will be used both to serve inference for OpenAI's ChatGPT model and to train its next-generation systems. For years, Altman and OpenAI had relied on Microsoft for computing power, with the two companies even maintaining an exclusive cloud agreement. However, that arrangement came to an end in January 2025. The latest deal with AWS clearly signals OpenAI's intent to avoid dependence on a single entity for computing resources. This move also follows OpenAI's recent recapitalisation, which is significant for several reasons. The AI company has described the shift as one that makes the OpenAI Foundation, its non-profit arm, "one of the best-resourced philanthropic organizations ever," while simplifying its overall corporate structure. However, in practice, the move represents a departure from the non-profit model and opens the door to a potential IPO. Interestingly, Altman has previously suggested that an IPO is the most likely path forward for OpenAI, given the enormous capital needed to build next-generation data centers and train advanced AI systems. He also stated that OpenAI has financial obligations amounting to $1.4 trillion to help construct roughly 30 gigawatts of data center infrastructure in the coming years. While a potential public listing would give OpenAI access to vast capital markets and greater operational freedom, it also raises questions about how its non-profit entity will function once shareholder pressures enter the equation.
[43]
OpenAI, AWS forge $38bn partnership to power the next frontier of AI
OpenAI has signed a multi-year strategic partnership with Amazon Web Services (AWS) valued at $38bn to run and scale its advanced artificial intelligence workloads on AWS infrastructure. Under the agreement, OpenAI will immediately begin using AWS compute capacity, including hundreds of thousands of NVIDIA GPUs connected through Amazon EC2 UltraServers, with the ability to scale to tens of millions of CPUs. All capacity is targeted to be deployed by the end of 2026, with expansion expected into 2027 and beyond. Highlights of the OpenAI-AWS tieup The deal will give OpenAI access to AWS's large-scale AI infrastructure, which includes clusters of more than 500,000 chips. The setup will support both inference for ChatGPT and the training of next-generation models, with systems designed for low-latency performance and scalable efficiency. "Scaling frontier AI requires massive, reliable compute," said OpenAI co-founder and CEO Sam Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions," AWS CEO Matt Garman said. The partnership follows earlier collaborations between the two firms. OpenAI's foundation models are already available on Amazon Bedrock, AWS's AI service, where they are used by customers such as Bystreet, Comscore, Peloton, Thomson Reuters, Triomics, and Verana Health for applications including coding, data analysis, and scientific research. AWS said the new infrastructure will use NVIDIA GB200 and GB300 chips to handle OpenAI's growing compute requirements, reinforcing the company's position as a preferred provider for large-scale AI workloads.
[44]
OpenAI Signs $38 Billion Mega-Deal with Amazon: ChatGPT to Get AWS Support for Next-Gen AI
With AWS support, users can expect smoother performance, faster responses, and even more advanced AI features in the future. The multi-year deal will enable OpenAI to use AWS's cloud infrastructure to run and expand its core AI workloads. The partnership is expected to continue growing over the next seven years, according to the new agreement. "OpenAI will be accessing AWS compute comprising hundreds of thousands of state-of-the-art NVIDIA GPUs, with the ability to expand to tens of millions of CPUs to rapidly scale agentic workloads," the companies noted in the announcement. OpenAI will soon start utilizing AWS compute as part of this partnership, with all capacity targeted for deployment by the end of 2026 and the ability to expand further into 2027 and beyond.
[45]
OpenAI and Amazon sign US$38B deal for AI computing power
SEATTLE -- OpenAI and Amazon have signed a US$38 billion deal that enables the ChatGPT maker to run its artificial intelligence systems on Amazon's cloud computing services. OpenAI will get access to "hundreds of thousands" of Nvidia's specialized AI chips through Amazon Web Services as part of the deal announced Monday. The agreement comes less than a week after OpenAI altered its partnership with its longtime backer Microsoft, which is no longer the startup's exclusive cloud provider. California and Delaware regulators also last week allowed OpenAI, which was founded as a nonprofit, to move forward on its plan to form a new business structure to more easily raise capital and make a profit. "The rapid advancement of AI technology has created unprecedented demand for computing power," Amazon said in a statement Monday. It said OpenAI "will immediately start utilizing AWS compute as part of this partnership, with all capacity targeted to be deployed before the end of 2026, and the ability to expand further into 2027 and beyond."
[46]
OpenAI strikes 7-year, $38B cloud computing deal with Amazon Web Services
OpenAI has come to terms on a massive 7-year, $38 billion deal with Amazon Web Services to secure cloud computing capabilities needed to power its suite of advanced AI tools such as ChatGPT and Sora. The deal, which was announced on Monday, gives OpenAI access to hundreds of thousands of Nvidia chips housed in Amazon's global data centers, with full capacity slated to come online by the end of 2026. AWS said the agreement will allow OpenAI to scale rapidly while tapping the "price, performance, scale, and security" of its cloud network. "Scaling frontier AI requires massive, reliable compute," OpenAI CEO Sam Altman said in a statement. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." The $38 billion pact marks the first time the ChatGPT maker has turned to Amazon for infrastructure, breaking years of exclusive reliance on Microsoft's Azure cloud. It comes just one week after OpenAI restructured its ownership to gain more freedom in financing and operations -- a shift that removed Microsoft's right of first refusal to supply cloud services. AWS Chief Executive Matt Garman called the deal proof that Amazon's infrastructure can handle "the vast AI workloads" of frontier model builders like OpenAI. Amazon's shares jumped roughly 5% Monday after the announcement, hitting an all-time high. Under the terms, OpenAI will use Amazon's UltraServer clusters -- racks of Nvidia GB200 and GB300 processors -- to train and run its models, process ChatGPT queries, and expand its so-called "agentic AI" systems, where software can complete tasks autonomously. The partnership will also let OpenAI tap into millions of CPUs for specialized workloads, giving it a way to handle soaring user demand as AI adoption widens. Amazon said all planned capacity will be online by the end of next year, with expansion continuing through 2027 and beyond. The partnership represents Amazon's bid to reclaim ground from faster-growing rivals Microsoft and Google, whose cloud divisions have seen stronger revenue growth on the back of surging AI demand. While AWS remains the world's largest cloud provider, analysts said it has lagged behind in attracting high-profile AI customers. The company is already investing heavily to change that. Amazon last week reported 20% quarterly growth in cloud revenue -- its fastest since 2022 -- and opened an $11 billion data-center campus in Indiana dedicated to training and running models from OpenAI rival Anthropic. Amazon has invested $8 billion in Anthropic, which uses its proprietary Trainium chips but also struck a major deal last month to tap as many as one million of Google's TPU chips. Until recently, Amazon had been locked out of the OpenAI ecosystem. The startup's exclusive cloud agreement with Microsoft barred it from buying capacity from other providers. That arrangement was rewritten last month as part of OpenAI's sweeping reorganization, giving Altman latitude to pursue deals across the industry. OpenAI has now committed nearly $600 billion in new cloud contracts across Amazon, Microsoft, Oracle, and Google -- a staggering figure for a company generating roughly $13 billion in annual revenue. The deals are intended to solve what Altman has described as "severe computing shortages" that have constrained model training and product launches.
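The "nearly $600 billion" figure above can be reconciled against the individual contract values reported across these articles; a quick tally, under the assumption that it refers to the disclosed Amazon, Microsoft, and Oracle commitments (the Google contract value has not been made public), is sketched below.

```python
# Rough reconciliation of the "nearly $600 billion in new cloud contracts" figure,
# using only deal sizes disclosed in the coverage above (Google's is undisclosed).
disclosed_commitments_usd_bn = {
    "Amazon (AWS)": 38,
    "Microsoft (Azure)": 250,
    "Oracle": 300,
}

total_bn = sum(disclosed_commitments_usd_bn.values())
print(f"Disclosed cloud commitments: ~${total_bn}B")  # 588 -> consistent with "nearly $600 billion"
```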
[47]
Amazon jumps 5% after AWS and OpenAI announce $38B partnership By Investing.com
Investing.com -- Amazon (NASDAQ:AMZN) stock rose 5.4% Monday after the company's cloud division AWS announced a multi-year strategic partnership with OpenAI valued at $38 billion. The agreement will enable OpenAI to run its advanced AI workloads on AWS's infrastructure starting immediately. Under the terms of the deal, which will grow over the next seven years, OpenAI will access AWS compute power comprising hundreds of thousands of NVIDIA GPUs, with the ability to expand to tens of millions of CPUs to scale AI operations. AWS will provide OpenAI with Amazon EC2 UltraServers featuring both GB200 and GB300 NVIDIA chips in a sophisticated architectural design optimized for AI processing efficiency. The infrastructure is designed to support various workloads, from serving inference for ChatGPT to training next-generation models. "Scaling frontier AI requires massive, reliable compute," said OpenAI co-founder and CEO Sam Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." The partnership builds on the companies' existing collaboration. Earlier this year, OpenAI's open weight foundation models became available on Amazon Bedrock, bringing additional model options to AWS customers. According to the announcement, OpenAI has quickly become one of the most popular model providers on Amazon Bedrock, with thousands of customers using their models. All capacity under the new agreement is targeted to be deployed before the end of 2026, with the ability to expand further into 2027 and beyond.
[48]
OpenAI turns to Amazon in $38 billion cloud services deal after restructuring - The Economic Times
OpenAI will begin using Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. OpenAI has signed a seven-year, $38 billion deal to buy cloud services from Amazon.com, in its first big push to power its AI ambitions after a restructuring last week that gave the ChatGPT maker greater operational and financial freedom. The agreement, announced on Monday, will give OpenAI access to hundreds of thousands of Nvidia graphics processors to train and run its artificial intelligence models. It underscores the AI industry's insatiable appetite for computing power as companies race to build systems that can rival or surpass human intelligence. OpenAI CEO Sam Altman has said the startup is committed to spending $1.4 trillion to develop 30 gigawatts of computing resources - enough to roughly power 25 million US homes. The deal is also a major vote of confidence for the e-commerce giant's cloud unit, Amazon Web Services, which some investors feared had fallen behind rivals Microsoft and Google in the artificial intelligence race. Those fears were somewhat eased by the strong growth the business reported in the September quarter. Amazon shares hit an all-time high on Monday, with the company set to add nearly $140 billion to its market value. Microsoft shares had briefly dipped on the news. RELIABLE COMPUTE "Scaling frontier AI requires massive, reliable compute," said OpenAI cofounder and CEO Sam Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." OpenAI will begin using Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. Amazon plans to roll out hundreds of thousands of chips, including Nvidia's GB200 and GB300 AI accelerators, in data clusters built to power ChatGPT's responses and train OpenAI's next wave of models, the companies said. Amazon already offers OpenAI models on Amazon Bedrock, which hosts multiple AI models for businesses using AWS. OpenAI's sweeping restructuring last week moved it further away from its non-profit roots and also removed Microsoft's right of first refusal to supply compute services in the new arrangement. Reuters has reported that OpenAI was laying the groundwork for an initial public offering that could value the company at up to $1 trillion. But surging valuations of AI companies and their massive spending commitments, which total more than $1 trillion for OpenAI, have raised fears that the AI boom is turning into a bubble. While OpenAI's relationship with Microsoft, which the two forged in 2019, has helped push Microsoft to the top spot among its Big Tech peers in the AI race, both companies have been making moves to reduce reliance on each other. OpenAI has already tapped Alphabet's Google to supply it with cloud services, as Reuters first reported in June. It also reportedly struck a deal with Oracle to buy $300 billion in computing power for about five years.
[49]
OpenAI and Amazon sign 38 billion-dollar deal for AI computing power | BreakingNews
OpenAI and Amazon have signed a 38 billion-dollar (£28.9 billion) deal that enables the ChatGPT maker to run its artificial intelligence systems on Amazon's cloud computing services. OpenAI will get access to "hundreds of thousands" of Nvidia's specialised AI chips through Amazon Web Services as part of the deal announced on Monday. The agreement comes less than a week after OpenAI altered its partnership with its long-time backer Microsoft, which is no longer the start-up's exclusive cloud provider. California and Delaware regulators last week also allowed OpenAI, which was founded as a nonprofit, to move forward on its plan to form a new business structure to more easily raise capital and make a profit. "The rapid advancement of AI technology has created unprecedented demand for computing power," Amazon said in a statement on Monday. It said OpenAI "will immediately start utilising AWS compute as part of this partnership, with all capacity targeted to be deployed before the end of 2026, and the ability to expand further into 2027 and beyond."
[50]
Amazon's $38 billion OpenAI deal shows it is no longer an AI laggard
(Reuters) - Amazon's $38 billion cloud deal with OpenAI marks a major endorsement for the e-commerce giant's cloud business after recent setbacks, including ceding market share to rivals and an outage that disrupted large parts of the internet. After years of leading the cloud computing industry with its highly profitable Amazon Web Services (AWS) business, Amazon has watched Microsoft and Alphabet's Google snatch big-ticket contracts with their AI-steeped clouds. Its lead in the cloud market slipped to 29% as of September, from 34% a few months before ChatGPT was launched in 2022, according to data from Synergy Research Group. Amazon was considered a laggard in the AI race by many investors because it was late to launch a flagship large language model and failed to offer a consumer-facing chatbot like OpenAI's ChatGPT. Recently, though, the company has ramped up spending on its AI efforts, and last month opened an $11 billion AI data center in Indiana called Project Rainier, where startup Anthropic's models are being trained using Amazon's own Trainium chips. Monday's deal with OpenAI, a marquee customer, coupled with strong quarterly results last week, suggests AWS is regaining momentum, analysts and investors said. "While it is small relative to other deals OpenAI has made with other cloud providers, it represents a key first step in Amazon's effort to partner with a company that is spending over a trillion dollars on computing power in the coming years," said Mamta Valechha, analyst at Quilter Cheviot. Amazon's stock rose 5% to a record high after the deal; it had traded little changed for most of the year, lagging the gains seen in other Big Tech stocks that have surged on cloud-computing deals worth hundreds of billions of dollars with AI startups. Microsoft last week disclosed a $250 billion OpenAI commitment for its Azure cloud services under a new arrangement that allowed OpenAI to restructure itself, while Oracle has signed a $300 billion deal with the startup. Google has a chip agreement worth tens of billions with Anthropic among other AI tie-ups. HIGHER AI SPENDING TO DRIVE GAINS Amazon's efforts have in part been hampered by executive losses. A key vice president helping oversee generative AI development left for another company, Reuters reported in June. To stay competitive and fund the costly data centers needed to support the technology, CEO Andy Jassy has tried to cut through management layers and even installed an anonymous complaint line for identifying inefficiencies. The company said last week it would reduce its corporate workforce by about 14,000 in one of its biggest layoffs. It is also spending more on AI, with its capital expenditure expected to total around $125 billion this year and more the year after. That is more than Alphabet's planned outlay of up to $93 billion, and roughly in line with what Wall Street expects Microsoft to spend this year. Analysts said the OpenAI deal offers a credible path for Amazon to recoup its spending. Brian Pitz, an analyst at BMO Capital Markets, estimates that this may boost AWS's backlog by about 20% in the fourth quarter ending December, from $200 billion as of September end. "It clearly seems like they (Amazon) are finally in the flow of what is happening with these large language models versus before," said William Lee, an investor at SuRo Capital, which holds equity in OpenAI. (Reporting by Harshita Mary Varghese and Kritika Lamba in Bengaluru; Writing by Aditya Soni; Editing by Sayantani Ghosh and Saumyadeb Chakrabarty)
[51]
Amazon backed by major partnership with OpenAI
Amazon shares are up 4% on Wall Street today, benefiting from the announcement by its cloud subsidiary Amazon Web Services (AWS) of a multi-year strategic partnership with OpenAI, representing a $38bn commitment from the latter. "This multi-year strategic partnership provides OpenAI with immediate and growing access to AWS's world-class infrastructure for its advanced artificial intelligence workloads," the tech giant explains. AWS will provide the ChatGPT creator with Amazon EC2 UltraServers, equipped with hundreds of thousands of chips, and the ability to scale up to tens of millions of processors for its advanced generative AI workloads. Under this new agreement, which will see continued growth over the next seven years, OpenAI will rapidly increase its computing capacity while benefiting from the price, performance, scalability, and security of AWS. AWS' leadership in cloud infrastructure, combined with OpenAI's pioneering advances in generative AI, will help millions of users continue to benefit from ChatGPT, Amazon said.
[52]
OpenAI signs a $38bn strategic agreement with AWS
OpenAI has signed a major contract with Amazon and its AWS cloud service for the purchase of computing capacity worth $38bn. This partnership marks a significant strategic shift for the artificial intelligence startup, which has historically been linked to Microsoft via Azure. Thanks to this agreement, OpenAI will benefit from immediate access to Nvidia GPUs via AWS data centers, ahead of the construction of new dedicated infrastructure. These resources will fuel both the inference of existing models and the training of future generations of AI. This partnership comes as Microsoft has just lost its exclusivity over OpenAI's compute needs, although the company remains a key Azure customer with an additional $250bn commitment. The contract with AWS, which comes on top of those recently signed with Oracle and Alphabet, illustrates a clear desire for diversification. This strategy is part of a plan to go public, with internal restructuring already underway to facilitate this operation. For AWS, this contract represents a significant step forward in the face of competition in the cloud sector. Already a partner of Anthropic, Amazon is investing heavily in its capabilities, including an $11bn campus dedicated to AI in Indiana. The agreement with OpenAI remains open to the integration of future technologies, including AWS's Trainium chips. By consolidating its ties with several hyperscale providers, OpenAI is positioning itself as an independent player capable of sustaining its growth in a rapidly expanding AI market.
[53]
OpenAI, Amazon strike $38 billion agreement for ChatGPT maker to use AWS
(Reuters) - Amazon.com will supply OpenAI with cloud computing services under a multi-year $38 billion deal, giving the ChatGPT maker access to hundreds of thousands of Nvidia graphics processors to train and run its artificial intelligence models. The agreement announced on Monday underscores the AI industry's surging appetite for computing power, driven by the pursuit of technology capable of matching or surpassing human intelligence. It sent Amazon shares up 5% in premarket trading. OpenAI will begin using Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. The deal is among the first major moves by OpenAI since it completed a restructuring last week that frees the ChatGPT maker to move away from its nonprofit roots. Reuters has reported it was laying the groundwork for an initial public offering that could value the company at up to $1 trillion. But surging valuations of AI companies and their massive spending commitments, which total more than $1 trillion for OpenAI, have raised fears that the AI boom is inflating into a bubble. (Reporting by Deborah Sophia in Bengaluru; Editing by Shinjini Ganguli)
[54]
OpenAI and Amazon sign $38 bn deal for ChatGPT-maker to use AWS: Here's how the partnership works
OpenAI, the company behind ChatGPT, has signed a major $38 billion deal with Amazon Web Services (AWS) to power and scale its artificial intelligence (AI) systems. The multi-year deal will see OpenAI using AWS's cloud infrastructure to run and expand its core AI workloads, starting immediately. Under the agreement, AWS will provide hundreds of thousands of NVIDIA GPUs and the ability to scale to tens of millions of CPUs over the next seven years. All the capacity is expected to be deployed before the end of 2026, with the ability to expand further into 2027 and beyond. The infrastructure being built by AWS for OpenAI features a sophisticated design optimised for maximum performance and efficiency. By clustering NVIDIA GB200 and GB300 GPUs using Amazon EC2 UltraServers, AWS will enable low-latency performance across interconnected systems. This setup will allow OpenAI to handle a variety of workloads with the flexibility to adjust as its needs evolve. "Scaling frontier AI requires massive, reliable compute," said OpenAI co-founder and CEO Sam Altman. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone." Matt Garman, CEO of AWS, added, "As OpenAI continues to push the boundaries of what's possible, AWS's best-in-class infrastructure will serve as a backbone for their AI ambitions." This partnership builds on previous collaborations between the two companies. Earlier this year, OpenAI's open weight foundation models became available on Amazon Bedrock, making them accessible to millions of AWS customers.
OpenAI has entered a seven-year, $38 billion partnership with Amazon Web Services to access hundreds of thousands of Nvidia GPUs for AI model training and inference. This major deal follows OpenAI's recent restructuring that freed it from Microsoft's exclusive cloud provider status.
OpenAI announced on Monday a seven-year, $38 billion partnership with Amazon Web Services (AWS) to power its artificial intelligence operations, including ChatGPT and Sora video generation [1]. The deal is OpenAI's first major cloud computing agreement since the recent corporate restructuring that freed the company from Microsoft's exclusive-provider status [2].

The agreement provides OpenAI with immediate access to hundreds of thousands of Nvidia graphics processors, specifically GB200 and GB300 AI accelerators, housed in Amazon's data clusters [3]. All planned capacity is targeted for deployment by the end of 2026, with expansion capabilities extending into 2027 and beyond [4].

This partnership marks a fundamental shift in OpenAI's infrastructure strategy. Previously, Microsoft had served as OpenAI's exclusive cloud provider under a partnership forged in 2019, but the company's recent restructuring eliminated Microsoft's right of first refusal for compute services [1]. The new arrangement allows OpenAI to diversify its cloud partnerships while maintaining its relationship with Microsoft through a separate $250 billion Azure services commitment [5].

OpenAI CEO Sam Altman emphasized the importance of reliable computing infrastructure, stating, "Scaling frontier AI requires massive, reliable compute. Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone" [1].

The announcement had immediate market impact: Amazon shares hit an all-time high on Monday morning, while Microsoft shares briefly declined [1]. The deal positions Amazon as a major player in AI infrastructure, challenging perceptions that the company had been lagging in artificial intelligence development [3].

Amazon's announcement highlighted its experience "running large-scale AI infrastructure securely, reliably, and at scale -- with clusters topping 500K chips," framing the partnership as pairing AWS's cloud leadership with OpenAI's generative AI advancements [4].

The AWS deal is part of OpenAI's ambitious computing expansion plans. Altman has previously outlined goals to spend $1.4 trillion developing 30 gigawatts of computing resources and eventually adding one gigawatt of compute capacity weekly [1]. Beyond Amazon, OpenAI has secured partnerships with Google for cloud services and a reported $300 billion agreement with Oracle [2]. The company is also developing proprietary hardware, reportedly working with Broadcom on its own chips to reduce dependence on Nvidia [4].

The massive spending commitments have raised concerns about potential market overheating. With companies projected to spend upwards of $500 billion on AI infrastructure between 2026 and 2027, analysts worry about the sustainability of current investment levels [3]. OpenAI's financial position adds to these concerns: the company is expected to reach about $20 billion in annualized revenue by year's end, while its losses continue to mount [1].