100 Sources
[1]
OpenAI and Nvidia's $100B AI plan will require power equal to 10 nuclear reactors
On Monday, OpenAI and Nvidia jointly announced a letter of intent for a strategic partnership to deploy at least 10 gigawatts of Nvidia systems for OpenAI's AI infrastructure, with Nvidia planning to invest up to $100 billion as the systems roll out. The companies said the first gigawatt of Nvidia systems will come online in the second half of 2026 using Nvidia's Vera Rubin platform. "Everything starts with compute," said Sam Altman, CEO of OpenAI, in the announcement. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale." The 10-gigawatt project represents an astoundingly ambitious and as-yet-unproven scale for AI infrastructure. Nvidia CEO Jensen Huang told CNBC that the planned 10 gigawatts equals the power consumption of between 4 million and 5 million graphics processing units, which matches the company's total GPU shipments for this year and doubles last year's volume. "This is a giant project," Huang said in an interview alongside Altman and OpenAI President Greg Brockman. To put that power demand in perspective, 10 gigawatts equals the output of roughly 10 nuclear reactors, which typically output about 1 gigawatt per facility. Current data center energy consumption ranges from 10 megawatts to 1 gigawatt, with most large facilities consuming between 50 and 100 megawatts. OpenAI's planned infrastructure would dwarf existing installations, requiring as much electricity as multiple major cities. The partnership follows OpenAI's rapid user growth to 700 million weekly active users. Nvidia's stock rose nearly 4 percent on Monday following the announcement, adding roughly $170 billion to its market capitalization. The partnership establishes Nvidia as OpenAI's preferred strategic compute and networking partner, alongside OpenAI's existing relationships with Microsoft, Oracle, SoftBank, and the recently announced Stargate project partners. The partnership announcement comes a week after Nvidia disclosed a $5 billion investment in Intel, taking a 4 percent stake in its longtime competitor as the two companies plan to co-develop custom data center and PC products.
[2]
Nvidia plans to invest up to $100B in OpenAI | TechCrunch
Nvidia announced Monday it plans to invest up to $100 billion in OpenAI as part of a deal to build out massive data centers for training and running AI models. The companies say they signed a letter of intent to deploy 10 gigawatts -- enough to power millions of homes -- worth of Nvidia systems to power OpenAI's next generation of AI infrastructure. The deal may help OpenAI as it reduces its reliance on Microsoft, its largest investor and supplier of cloud computing resources. In January, Microsoft announced changes to its partnership with OpenAI, allowing the ChatGPT-maker to build additional AI infrastructure with other partners. Since then, OpenAI has teamed up with various partners on AI data center projects, such as Stargate. Nvidia says the deal will complement existing partnerships OpenAI has, including agreements with Microsoft, Oracle, and SoftBank. OpenAI says it will work with Nvidia as a "preferred strategic compute and networking partner" for its AI factory growth. It's unclear whether Nvidia's investment will be paid out in chips, cloud credits, cash, or otherwise.
[3]
Nvidia Invests in OpenAI With $100 Billion AI Infrastructure Deal
OpenAI and Nvidia have struck one of the biggest partnerships in AI, with Nvidia pledging to invest up to $100 billion in OpenAI while supplying the compute power needed to build the company's next generation of models. The deal, announced Monday in a letter of intent, calls for OpenAI to deploy at least 10 gigawatts of Nvidia systems over the coming years. The first phase, one gigawatt, is scheduled for the second half of 2026 on Nvidia's upcoming Vera Rubin platform, named for the late dark matter astronomer. Nvidia's investment will grow in scale as each new system is deployed. (Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against ChatGPT maker OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.) "Everything starts with compute," said OpenAI CEO Sam Altman in a statement. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." This partnership is notable as AI research is increasingly constrained by access to massive computing resources. By securing a long-term pipeline of Nvidia hardware, OpenAI seeks to guarantee its ability to keep pace with rivals like Google, Anthropic, Microsoft and Meta. For Nvidia, the deal makes it more than just a supplier. By gradually taking a large stake in OpenAI, it positions itself at the center of the AI boom, buying into the biggest AI company. The completion of this deal will take years, but if the partnership holds, it could define how quickly AI advances, what kinds of models OpenAI can deliver in the future and how accessible those models will be to the global population.
[4]
How Nvidia and OpenAI's staggering $100 billion deal could fuel a new age of AI
This deal is bigger than all of NVIDIA's other AI deals combined. Nvidia's partnership with OpenAI to deploy at least 10 gigawatts of Nvidia systems in the next few years is a jaw-dropper. Backing this massive investment in OpenAI's data centers is a staggering $100 billion from Nvidia as the new facilities come online. (Disclosure: Ziff Davis, ZDNET's parent company, filed an April 2025 lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.) The collaboration, formalized through a letter of intent, marks the most significant single infrastructure commitment we've ever seen in the rapidly growing AI industry. "This is the biggest AI infrastructure project in history," said Nvidia founder and CEO Jensen Huang in a blog post. "This partnership is about building an AI infrastructure that enables AI to go from the labs into the world." OpenAI, the pioneering force behind ChatGPT, will use millions of Nvidia GPUs to fuel its next generation of AI research and deployment. The first phase of datacenter rollouts is slated to begin in the second half of 2026, built atop NVIDIA's Vera Rubin platform, and is expected to push the boundaries of both AI model training and real-time inference at unprecedented scales. Leaders from both companies have emphasized the transformative potential of their alliance. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence," added Huang. OpenAI CEO Sam Altman echoed the sentiment, calling the forthcoming hardware platform "the fuel that we need to drive improvement, drive better models, drive revenue, drive everything." While Nvidia has made significant investments in AI software companies before, there's been nothing on the scale of its $100-billion commitment to OpenAI. Previously, NVIDIA participated in funding rounds for prominent AI software startups, including earlier investments in OpenAI's 2024 $6.6-billion round, and backing companies such as Cohere, Mistral, Perplexity, CoreWeave, and Scale AI. In total, Nvidia invested around $1 billion across 50 AI startups in 2024. OpenAI's growth has far outpaced virtually every other company in the AI space, both in terms of user adoption and financial scale. By mid-2025, OpenAI's annualized revenue had soared to $10-$13 billion -- up from $3.7 billion in 2024 -- and its projected 2025 revenue of $12.7 billion easily surpasses that of all its major competitors. OpenAI's user growth has also outpaced its rivals. The company said it recently surpassed 700 million weekly active users. This deal with NVIDIA is seen as key to supporting both a surging user base and the resource-intensive workloads required to pursue artificial general intelligence (AGI). As OpenAI's preferred compute and networking partner, NVIDIA will work closely with OpenAI to optimize hardware, software, and model development roadmaps for maximal efficiency and innovation. The partnership arrives against the backdrop of fierce global competition to build massive AI datacenter capacity. While OpenAI continues to collaborate with cloud giants like Microsoft, Oracle, and SoftBank, the scale of the NVIDIA deployment sets a new benchmark in the industry.
With leadership on both sides touting this as only the beginning of a global AI buildout, the OpenAI-NVIDIA alliance is poised to leave a profound mark on AI's future. It's also a move that must concern other AI software companies, since the no-question-about-it leading AI hardware company is now working hand-in-glove with the early AI software leader.
[5]
Nvidia is partnering up with OpenAI to offer compute and cash
OpenAI is teaming up with Nvidia via a "strategic partnership" that will get the ChatGPT-maker more compute and more cash to develop new models on the road to superintelligence. The partnership, announced Monday, will allow OpenAI to "build and deploy at least 10 gigawatts of AI datacenters with NVIDIA systems," which translates to millions of GPUs that can help power OpenAI's new models. One of the most important points here, besides more data centers and compute -- which are always in high demand for companies like OpenAI -- is that as part of the deal, NVIDIA "intends to invest up to $100 billion in OpenAI progressively as each gigawatt is deployed," per the release. The details will be finalized in the next few weeks, according to the companies.
[6]
Nvidia's $100 billion investment in OpenAI raises big antitrust concerns -- legal experts and policymakers raise eyebrows over potential for market imbalance
Invest in a major customer, then get money from a major customer? Nvidia plans to invest up to $100 billion in OpenAI, marking an unprecedented financial and strategic alignment between the leading AI hardware provider and one of the best-known developers of artificial intelligence models. However, the deal raises major antitrust concerns among legal experts and policymakers over potential market imbalance, since the investment could affect competitors of both companies, reports Reuters. The planned investment raises questions about how the cash infusion could affect Nvidia's other customers and the overall AI hardware market. Nvidia already commands the lion's share of the market for hardware used for AI training and inference, as virtually all AI companies use its Hopper and Blackwell GPUs, so they rely on access to Nvidia's GPUs to scale their own models. Legal experts note that Nvidia's investment may create incentives to prioritize OpenAI over others, potentially offering better terms or faster access to a limited supply of leading-edge GPUs, such as Rubin. In response, a representative for Nvidia stated that the company's commitment to all of its clients remains unchanged. The spokesperson emphasized that having a financial interest in any one partner would not affect how the company serves others, assuring that every customer will continue to receive the same level of attention and service. Additionally, if OpenAI prefers hardware from Nvidia over hardware from its rivals, such as AMD, or even its own processor developed in collaboration with Broadcom, then Nvidia would gain an unfair advantage. OpenAI is believed to have acquired $10 billion worth of custom-built AI processors from Broadcom and is unlikely to leave them undeployed, but with Nvidia supplying hardware worth tens of billions of dollars, the AI company is likely to keep doing most of its work on Nvidia's hardware rather than on competing processors. OpenAI currently operates as a non-profit but is pursuing a transition to a for-profit public benefit corporation. This structural change is meant to facilitate investment while maintaining oversight by the original non-profit entity. The arrangement with Nvidia does not provide governance rights -- only financial participation -- and may depend on regulatory approvals in states like Delaware and California, where OpenAI is registered. Nvidia and OpenAI finalized the $100 billion deal after weeks of private negotiations between chief executives Jensen Huang and Sam Altman, reports CNBC. The companies came to final terms just before a major infrastructure reveal in Texas on Monday, which is surprising given the scale of the deal. While the agreement is inked between Nvidia and OpenAI, the data centers that the latter will build will be a part of the Stargate project, according to CNBC. The initial phase of the agreement involves Nvidia investing $10 billion to help OpenAI deploy a 1 GW data center using Nvidia's upcoming Vera Rubin chips, with construction slated to begin in the second half of 2026. According to Nvidia CEO Jensen Huang, the total cost per gigawatt is around $50 billion: $35 billion is for Nvidia hardware and $15 billion is for facilities and other infrastructure, which implies that Nvidia is not financing the entire build-out. OpenAI would still need to secure roughly $40 billion per gigawatt from other sources.
The subsequent phases of the plan include scaling compute capacity to 10 GW, which would require the remaining $90 billion of Nvidia's investment plus roughly $400 billion from OpenAI. OpenAI has not disclosed how it plans to obtain the remaining capital, nor has it confirmed whether it shares Nvidia's cost assumptions. Interestingly, Nvidia's initial $10 billion input is based on a $500 billion valuation for OpenAI, but CNBC claims future investments will be based on OpenAI's valuation at the time, so the terms could fluctuate. There is also no specific timeline for when the full 10 GW of capacity will be operational or when the remaining $90 billion will be deployed. U.S. regulators have previously flagged the risk of major technology firms leveraging their existing dominance to control emerging AI markets. Officials from the Department of Justice have emphasized the importance of averting exclusionary practices in the AI supply chain, including restricted access to processors and compute infrastructure. The potential effects extend beyond hardware. Oracle recently disclosed that it had signed large-scale cloud contracts with OpenAI and other clients, boosting its valuation. With Nvidia's investment potentially strengthening OpenAI's financial position, Oracle's revenue projections may appear more credible, which could help address investor concerns about OpenAI's ability to fund such commitments, according to Reuters.
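A minimal back-of-envelope sketch of the build-out math quoted above, using only the figures attributed to Huang and CNBC in this source (about $50 billion per gigawatt all-in, $35 billion of it Nvidia hardware, and roughly $10 billion of Nvidia investment per gigawatt). The split and the totals are the source's assumptions, not confirmed figures, so treat the output as illustrative arithmetic only.

```python
# Illustrative arithmetic only: reproduces the per-gigawatt cost split and
# funding gap described in the article (assumed figures, not confirmed).

cost_per_gw_total = 50e9        # Huang: ~$50B per gigawatt, all-in
cost_per_gw_hardware = 35e9     # ...of which ~$35B is Nvidia hardware
cost_per_gw_facilities = cost_per_gw_total - cost_per_gw_hardware  # ~$15B facilities/other

nvidia_invest_per_gw = 10e9     # Nvidia's staged investment per gigawatt deployed
gigawatts_planned = 10

openai_gap_per_gw = cost_per_gw_total - nvidia_invest_per_gw       # ~$40B per GW
total_cost = cost_per_gw_total * gigawatts_planned                 # ~$500B for 10 GW
nvidia_total = nvidia_invest_per_gw * gigawatts_planned            # ~$100B from Nvidia
openai_gap_total = total_cost - nvidia_total                       # ~$400B from elsewhere

print(f"OpenAI's shortfall per gigawatt: ${openai_gap_per_gw / 1e9:.0f}B")
print(f"10 GW total: ${total_cost / 1e9:.0f}B, Nvidia: ${nvidia_total / 1e9:.0f}B, "
      f"remaining: ${openai_gap_total / 1e9:.0f}B")
```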
[7]
Oh the joy: OpenNvidia may be the AI generation's WinTel
Duo could dominate in the same way Microsoft and Intel ruled PCs for decades Opinion The OpenAI and Nvidia $100 billion partnership sure sounds impressive. $100 billion isn't chicken feed, even as more and more tech companies cross the trillion-dollar mark. But what does it really mean? As two of my Register colleagues noted, "The announcement has enough wiggle room to drive an AI-powered self-driving semi through." True, but it may be the start of something huge that will define the AI movement for the foreseeable future. Let's step into the Wayback Machine with Mr. Peabody and Sherman to the early 1980s, when PCs from companies most of you have never heard of, such as Osborne, Kaypro, and Sinclair Research, landed on desktops. IBM decided to get into the personal computer business, and the company needed chips. So Big Blue teamed with a relatively obscure CPU company called Intel. That took care of the hardware, but IBM needed an operating system urgently. Initially, like everyone else, except for those guys named Steve with some company called Apple, IBM wanted to use CP/M from Digital Research. That didn't work out. So, IBM called Microsoft, and Bill Gates and crew acquired Quick and Dirty Operating System (QDOS) from Seattle Computer Products, and slapped the names MS-DOS and IBM PC-DOS on it. Microsoft also, and this is the critical bit, kept the right to sell MS-DOS to other companies. Intel, of course, had always retained the right to sell its chips to anyone. It quickly became clear that IBM was onto something. So other new companies, Compaq specifically, sprang up to develop their own PC clones, starting with the Compaq Portable in 1983. It, and all the many other clones from companies like Dell, HP, and Packard Bell, were, of course, powered by Intel chips and ran Microsoft operating systems. The two companies started working hand-in-glove with each other. By the late '80s, their pairing, WinTel, would rule the PC world. Decades later, while not nearly as dominant as they once were, chances are the computer in front of you is WinTel. What does that have to do with OpenNvidia? Everything. This deal promises to create the world's largest AI infrastructure project to date. It gives OpenAI access to millions of Nvidia GPUs and the capital needed for a massive wave of next-generation data centers. I mean, seriously, where exactly will all the other AI software companies get the Nvidia GPUs they so desperately need when Nvidia has promised so many of its newest Vera Rubin processors to OpenAI? If the deal comes to completion, that 10 gigawatts of Nvidia systems that OpenAI gets is roughly equal to four to five million GPUs. According to analysts, Nvidia's total AI GPU run will amount to only 6.5 to 7 million chips in 2025. That doesn't leave many chips left over for everyone else, does it? An Nvidia spokesman told Reuters, "Our investments will not change our focus or impact supply to our other customers - we will continue to make every customer a top priority, with or without any equity stake." But what else are they going to say? Sucks to be you, Anthropic? Bite me, Oracle? Now, where have I seen this combination of chips and software before? Oh, right. WinTel. It worked pretty well for them, didn't it? As for their rivals back in the early days, I recall them because I was already in the tech industry then. If you're under 40, have you even heard of North Star Computers, Cromemco, or Vector Graphics?
Yeah, I didn't think so. Of course, it's possible that the US Federal Trade Commission (FTC) might have something to say about this de facto move to a monopoly. After all, today, NVIDIA has about 92 percent of the data center AI chip market. As for OpenAI, it's been growing faster than essentially all other AI companies, whether you measure it in user adoption or financial scale. By mid-2025, OpenAI's annualized revenue had soared to $10-$13 billion - up from $3.7 billion in 2024 - and its projected 2025 revenue was $12.7 billion. Simultaneously, TechGaged and StatCounter both report OpenAI ChatGPT's AI chatbot market share ranges from 80.9 percent to 82.7 percent in recent months. Oh wait. I forgot. The US is governed by Donald "Anarchy in AI" Trump. The FTC won't be stopping this deal. It's possible the UK's Department for Science, Innovation and Technology (DSIT) and the EU's European Commission may have something to say. Whether NVIDIA and OpenAI will pay either any attention is another matter. True, the deal's details are still messy. As Scott Raynovich, Founder and Chief Technology Analyst of the technology analysis firm Futuriom, noted in a LinkedIn comment, "All of these deals are the same... to me they read like... 'I promise to spend a bunch of money with you if you kick a bunch back to me... but there is no guarantee... and it's all contingent on things going exactly as they are going right now, but we could always bail.'" Far be it from me to disagree. This deal could go sideways. After all, I'm one of those who won't be surprised if AI goes bust. But, if it doesn't, Nvidia is the one AI company I see surviving. Any business that's aligned closely with Nvidia may do quite well. After all, just like with the dot-com crash, after all the crying, the internet grew and grew. I expect the same will happen with AI, no matter what happens to it in the short term. So, yes, in the long run, I can see OpenNvidia dominating AI in the 2040s the way Wintel did in the 2000s. ®
[8]
Nvidia's Massive OpenAI Deal Fuels 'Circular' Financing Concerns
Three years after OpenAI and Nvidia Corp. helped kick off the global artificial intelligence frenzy, the two firms are joining forces to pave the way for a more costly phase of development with a deal that's quickly revived fears of an AI bubble. Nvidia on Monday said it will invest as much as $100 billion in OpenAI to help the ChatGPT maker support a massive build-out of data centers equipped with Nvidia's chips - a deal that some analysts say raises questions about whether Nvidia is investing heavily to prop up the market and keep companies spending on its products.
[9]
Shareholders should have more say over the AI rush
Everyone loved Monday's announcement by OpenAI and Nvidia. Of course they did. You only have to carve the words artificial intelligence on to a lamp post these days and it'll be valued at millions. But the more I think about OpenAI and Nvidia's plan to build 10 gigawatts of computing power, the more I worry that investors are losing their grip. Massive numbers don't seem to register. Being left in the dark is considered normal. Checks and balances barely exist. One criticism of the deal is that it kind of sounds like a perpetual motion machine with Nvidia investing up to $100bn in OpenAI so the latter can buy more Nvidia chips. Forking out for growth is common, though -- however ick. Samsung, Intel and TSMC invested in ASML to accelerate lithography development in order to boost demand for their own products. Netflix and Amazon fund studios which make content for them. Miners support refineries to guarantee offtake. My main issue with the deal is the scale of money involved given that Nvidia shareholders learnt about it through a press release presented as a fait accompli. Sure, the cash-for-shares swap happens in dribs and drabs as capacity is built. But for reference, $100bn would rank as a mega-deal by anyone's standards. I don't care that Nvidia's market cap is more than 40 times that. The sum is equivalent to the common equity on its balance sheet. If paid out, it's akin to a dividend yield of 2.3 per cent. Shouldn't equity holders have an explicit vote on such an outlay, let alone a deal we're told is so strategic? Most rule-setters around the world reckon not. While shareholders must be asked if management teams want to do things such as make a large acquisition or cede control, no vote is needed on capex and investment, no matter how gargantuan. Only the board needs to sign off. Crucially, there are exceptions. If a significant issuance of shares is required, for example. Or when an investment is in effect a reverse merger. A third reason shareholders are sometimes given a look in -- and the one which rang my bell with Nvidia and OpenAI -- is if a deal is with a related party. Laws vary by jurisdiction, but accounting standards (IAS 24 under IFRS as well as ASC 850 under GAAP for American companies) agree that a related-party transaction is between any parties where there is a personal, financial or control relationship. In other words, two companies that share the same directors, executives or major shareholders (or close family members). They could also be related if one controls the other in some way. Joint ventures can also fit the definition. Due to potential conflicts of interest, it would seem obvious that such deals should be put to shareholders if they are substantial enough. A boardroom green light doesn't cut it. Therefore in countries such as Britain and Australia, related party transactions over a certain size require shareholder approval. Not so in the US. For companies listed on the New York Stock Exchange and Nasdaq, all that's required is for their audit committees or another independent group of directors to review related party transactions "on an ongoing basis". But surely shareholders should have a say in the largest computing project in history, as Nvidia chief executive Jensen Huang called it this week. It sure as hell seems like it should to me. The world's largest public and private companies do not control each other, as such. Boy, are they related, though.
"We've been working closely with Nvidia since the early days," said Greg Brockman, the co-founder and president of OpenAI. Indeed, Huang hand-delivered the first DGX system to OpenAI almost a decade ago. The two companies even call the deal a "landmark" partnership, although we don't know yet what the structure will be. But given related-party rules, you can bet it won't be a formal joint-venture. Shareholders, of course, do have the right to sue the board if they feel aggrieved or fight a proxy battle to replace directors. But so long as the stock prices keep rising -- and Nvidia's rose 4 per cent on the announcement -- I suspect investors won't complain even if this cosy deal worries them to some degree. More broadly, I question whether US related-party rules remain fit for purpose given the interconnectedness of the new AI ecosystem. At least OpenAI is private. But what about Microsoft's shareholders, who stand to own about 30 per cent of the maker of ChatGPT when it eventually goes public? Or those of SoftBank and Oracle? The Stargate project they are backing is a $400bn investment. Almost half a trillion dollars! That's too much for investors to take on trust.
[10]
Nvidia's $100 billion OpenAI play raises big antitrust issues
Sept 23 (Reuters) - The $100 billion partnership between dominant AI chipmaker Nvidia (NVDA.O) and leading artificial intelligence company OpenAI could give both companies an unfair advantage over their competitors, experts say. The move underscores the increasingly overlapping financial interests of the various tech giants developing advanced AI systems, and the potential for a dwindling number of key players to stave off smaller rivals. It "raises significant antitrust concerns," said Andre Barlow, an antitrust lawyer with Doyle, Barlow & Mazard, who also noted that the Trump administration has taken a pro-business approach to regulations, removing hurdles that would slow AI growth. And while unleashing U.S. dominance in artificial intelligence by clearing away regulations and creating incentives for growth is a top priority for President Donald Trump, a Department of Justice official said last week that spurring innovation by protecting AI competition through antitrust enforcement is also part of Trump's AI plan. "The question is whether the agencies see this investment as pro-growth or something that could slow AI growth," Barlow said. Nvidia holds more than half of the market for the GPU chips that run the data centers powering artificial intelligence models and applications, such as OpenAI's ChatGPT. That dominant market position raises concerns that Nvidia would favor OpenAI over other customers with better pricing or faster delivery times, said Rebecca Haw Allensworth, an antitrust professor at Vanderbilt Law School. "They're financially interested in each other's success. That creates an incentive for Nvidia to not sell chips to, or not sell chips on the same terms to, other competitors of OpenAI," Allensworth said. An Nvidia spokesperson said that its investment in OpenAI would not change its focus. "We will continue to make every customer a top priority, with or without any equity stake," the spokesperson said. OpenAI did not immediately respond to a request for comment. Nvidia's customer base is already relatively concentrated, with the two largest buyers accounting for 23% and 16% of its revenue in the second quarter of this year, according to its financial filings, which do not name the buyers. The scope of Monday's deal -- in which Nvidia would invest up to $100 billion in OpenAI, and the latter would buy millions of chips from Nvidia -- goes to show "just how expensive frontier AI has become," said Sarah Kreps, director of the Tech Policy Institute at Cornell University. "The cost of chips, data centers and power has pushed the industry toward a handful of firms able to finance projects on that scale," Kreps said. During Joe Biden's presidency, the DOJ and U.S. Federal Trade Commission were on guard against anticompetitive actions by Big Tech companies in the AI space, warning that such companies could use their existing scale to dominate the nascent field. Under Trump, both agencies have continued other cases against Big Tech companies, and DOJ antitrust division head Gail Slater said on Thursday that enforcement "must focus on preventing exclusionary conduct over the resources that are needed to build competitive AI systems and products." "The competitive dynamics of each layer of the AI stack and how they interrelate, with a particular eye towards exclusionary behavior that forecloses access to key inputs and distribution channels, are legitimate areas for antitrust inquiry," she said.
[11]
Nvidia and OpenAI forge $100 billion alliance to deliver 10 gigawatts of Nvidia hardware for AI datacenters
All of a sudden, the 1.21 gigawatts in Back To The Future aren't impressive anymore. Tech industry giants OpenAI and Nvidia have announced a pivotal partnership, which will deploy 10 gigawatts worth of AI datacenters and $100 billion in investments. OpenAI has committed to creating multiple datacenters with Nvidia as its "preferred strategic compute and networking partner," with the first one expected to deploy in the second half of 2026. The partnership will see OpenAI build at a furious pace until the total combined power budget of those datacenters reaches "at least" 10 gigawatts. For its part, Nvidia dove into its war chest to secure $100 billion, returning the favor by progressively investing in OpenAI, presumably via share purchases. Additionally, and perhaps most interestingly, both companies commit to "co-optimize" their respective roadmaps. It's not hard to imagine that the hands of Nvidia's AI clients already guide the chipmaker's designs, but this statement could imply that OpenAI will have a bigger say in Nvidia's plans than before. The companies also point out that the new collaboration dovetails nicely with the existing agreements with the likes of Microsoft, Oracle, and SoftBank. OpenAI is already the exclusive AI partner for Microsoft, which promised in January to invest $80 billion in the technology. Meanwhile, OpenAI's Sam Altman remarks that "compute infrastructure will be the basis for the economy of the future", a statement that would have seemed more like hyperbole a mere two or three years ago. OpenAI's next datacenters will use Nvidia's Vera Rubin platform (and presumably Rubin Ultra), powerful accelerators packing 76 TB of HBM4 memory that should be capable of performing FP4 inference at 3.6 exaflops and FP8 training at 1.2 exaflops. The fact that the "exa" prefix is becoming commonplace is exciting and scary in equal measures. The Rubin GPU and Vera CPUs taped out in late August and are now being manufactured in TSMC facilities. Meanwhile, Rubin Ultra is expected to deliver 15 exaflops of FP4 operations for inference, and 5 exaflops of FP8 for training. These figures come by way of 365 TB of HBM4e memory. To put the 10-gigawatt figure into perspective, a contemporary U.S. nuclear power plant reactor produces around 1 gigawatt, meaning these new datacenters will gobble up 10 reactors' worth of juice to do their thing. That's a concept that's hard to wrap one's head around. While the technological advancement is definitely impressive, it also raises hard questions about its environmental costs.
[12]
A look at OpenAI's tangled web of dealmaking
While OpenAI says that scaling is key to driving innovation and future AI breakthroughs, investors and analysts are beginning to raise their eyebrows over the mindboggling sums, as well as OpenAI's reliance on an increasingly interconnected web of infrastructure partners. OpenAI took a $350 million stake in CoreWeave ahead of its IPO in March, for instance. Nvidia formalized its financial stake in OpenAI by participating in a $6.6 billion funding round in October. Oracle is spending about $40 billion on Nvidia chips to power one of OpenAI's Stargate data centers, according to a May report from the Financial Times. Earlier this month, CoreWeave disclosed an order worth at least $6.3 billion from Nvidia. And through its $100 billion investment in OpenAI, Nvidia will get equity in the startup and earn revenue at the same time. OpenAI is only expected to generate $13 billion in revenue this year, according to the company's CFO Sarah Friar. She told CNBC that technology booms require bold bets on infrastructure. "When the internet was getting started, people kept feeling like, 'Oh, we're over-building, there's too much,'" Friar said. "Look where we are today, right?" Altman told CNBC in August that he's willing to run the company at a loss in order to prioritize growth and its investments.
[13]
Nvidia to invest $100 billion in OpenAI to help expand the ChatGPT maker's computing power
Chipmaker Nvidia will invest $100 billion in OpenAI as part of a partnership announced Monday that will add at least 10 gigawatts of Nvidia AI data centers to ramp up the computing power for the owner of the artificial intelligence chatbot ChatGPT. Per the letter of intent signed by the companies, the first gigawatt of Nvidia systems will be deployed in the second half of 2026. Nvidia and OpenAI said they would be finalizing the details of the arrangement in the coming weeks. "This partnership complements the deep work OpenAI and Nvidia are already doing with a broad network of collaborators, including Microsoft, Oracle, SoftBank and Stargate partners, focused on building the world's most advanced AI infrastructure," the companies said in a release. The Nvidia-OpenAI partnership comes about 10 days after OpenAI said it had reached a new tentative agreement that will give Microsoft a $100 billion equity stake in its for-profit corporation. OpenAI is technically controlled by its nonprofit. OpenAI was founded as a nonprofit in 2015 and its nonprofit board has continued to control the for-profit subsidiary that now develops and sells its AI products. OpenAI's corporate structure and nonprofit mission are the subject of a lawsuit brought by Elon Musk, who helped found the nonprofit research lab and provided initial funding. Musk's suit seeks to stop OpenAI from taking control of the company away from its nonprofit and alleges it has betrayed its promise to develop AI for the benefit of humanity. Earlier this month, the attorneys general of California and Delaware warned OpenAI that they have "serious concerns" about the safety of ChatGPT, especially for children and teens. The two state officials, who have unique powers to regulate nonprofits such as OpenAI, noted "deeply troubling reports of dangerous interactions between" chatbots and their users, including the suicide of one young Californian after he had prolonged interactions with an OpenAI chatbot. The parents of the 16-year-old California boy, who died in April, sued OpenAI and its CEO, Sam Altman, last month. OpenAI says it has 700 million weekly active users. Also, just last week Nvidia announced that it was investing $5 billion in fellow chipmaker Intel, which has struggled to keep up with the frenzied demand for artificial intelligence.
[14]
NVIDIA is investing up to $100 billion in OpenAI to build 10 gigawatts of AI data centers
NVIDIA will invest up to $100 billion in OpenAI as the ChatGPT maker sets out to build at least 10 gigawatts of AI data centers using NVIDIA chips and systems. The strategic partnership is gargantuan in scale. The 10-gigawatt buildout will require millions of NVIDIA GPUs to run OpenAI's next-generation models. NVIDIA's investment will be doled out progressively as each gigawatt comes online. The first phase of this plan is expected to come online in the second half of 2026, and will be built on NVIDIA's Vera Rubin platform, which NVIDIA CEO Jensen Huang said will be a "big, big, huge step up" over the current-gen Blackwell chips. "NVIDIA and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," said Jensen Huang in a statement announcing the letter of intent for the partnership. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale," said Sam Altman, CEO of OpenAI. NVIDIA has made a number of strategic investments lately, including in Intel, shortly after the US government took a 10 percent stake in the American chipmaker. The company also recently agreed to license AI technology from startup Enfabrica and hire its CEO and other key employees. OpenAI has also formed other strategic partnerships over the last few years, including a somewhat complicated partnership with Microsoft. This summer it struck a deal with Oracle to build out 4.5 gigawatts of data center capacity using more than 2 million chips. That deal was part of Stargate, the strategic partnership between SoftBank, OpenAI, NVIDIA, Oracle, Arm and Microsoft with a promise to spend $500 billion in the US on AI infrastructure.
[15]
Nvidia to invest $100 billion in OpenAI for 10 gigawatts of AI computing power
What we know so far: Even with uncertainties around timing and valuation, the potential scale of Nvidia's investment signals an escalation in the capital intensity of the AI industry and cements the chip designer's central role in shaping the technology's future infrastructure. Executives from both companies are describing the effort as foundational for the next era of the industry. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale," OpenAI CEO Sam Altman said. Nvidia is preparing to make one of the largest corporate investments in history, committing as much as $100 billion to OpenAI as part of a sweeping agreement to expand the infrastructure underpinning artificial intelligence. The deal involves OpenAI purchasing millions of Nvidia's high-performance processors to support the build-out of up to 10 gigawatts of computing capacity, equivalent to the output of 10 nuclear power plants. At its full scale, the investment would eclipse any previous private-company financing round and exceed landmark acquisitions in the technology sector, such as Microsoft's $75 billion purchase of Activision Blizzard. Nvidia's commitment is structured as a staged equity purchase in OpenAI, people familiar with the matter told The Financial Times. The first tranche, totaling $10 billion, would be deployed when OpenAI brings its initial gigawatt of computing power online. At OpenAI's current $500 billion valuation, that payment would give Nvidia about a 2 percent stake in the startup. Further equity purchases would be tied to subsequent deployments and executed at prevailing valuations. Nvidia plans to fund the investment entirely in cash. While Nvidia could ultimately invest $100 billion, the overall build-out of the planned infrastructure is expected to cost as much as $400 billion, incorporating the expense of chips, land, and supporting facilities. OpenAI intends to spend over $100 billion on Nvidia processors alone, a commitment that one person close to the talks said would be financed in part by Nvidia's own equity investment as well as OpenAI's future revenues and additional funding sources. The investment would give OpenAI access to 4 million to 5 million of Nvidia's GPUs. The computing power, spread among previously unannounced projects, is expected to be largely concentrated in the United States. The first operational phase is scheduled for the second half of 2026, when OpenAI will begin deploying Nvidia's forthcoming Vera Rubin chip system. OpenAI, which remains unprofitable, has sought outsized commitments from multiple technology partners to fund its growth ambitions. Earlier this year, it signed a $300 billion, five-year agreement with Oracle for access to computing power and announced plans with Broadcom to develop its own line of AI chips. The company is also in negotiations with Microsoft, its earliest strategic investor, to shift to a new corporate structure that would allow outside investors to hold equity. Nvidia, meanwhile, has expanded beyond chipmaking into direct investments across the AI ecosystem. In recent years it has taken stakes in cloud providers such as CoreWeave and Nebius, application developers like Elon Musk's xAI and Perplexity, and robotics start-ups Figure and Wayve.
Analysts view the OpenAI deal as an extension of that strategy on a much larger scale. "Nvidia is consolidating control over the AI stack and reinforcing its position as the [sector's] indispensable enabler," said Dmitri Zabelin, an analyst at PitchBook. Some observers note, however, that the arrangement may be less transformative than the headline figure suggests. Michael Cusumano, professor at MIT's Sloan School of Management, said the deal was "kind of a wash" given that "Nvidia is investing $100 billion in OpenAI stock and OpenAI is saying they are going to buy $100 billion or more of Nvidia chips."
[16]
Nvidia to invest $100bn in OpenAI, firm behind ChatGPT
US tech giant Nvidia will invest up to $100bn (£73bn) in OpenAI, the firm behind ChatGPT, the companies announced. Nvidia said it will supply the high-performance chips needed for the processing power required by artificial intelligence (AI), a field in which OpenAI is a specialist. It is the latest move in an escalating race between global tech firms trying to get ahead in the AI sector, where China is an emerging rival. The announcement comes after a series of high-profile investments by Nvidia, including a $5bn investment in Intel and a £2bn investment in the UK's AI sector.
[17]
Nvidia adds more air to the AI bubble with $100B OpenAI pact
analysis OpenAI and Nvidia have signed a letter of intent wherein OpenAI agrees to buy at least 10 gigawatts of Nvidia systems for its datacenters, while the AI arms dealer returns the favor with an investment of up to $100 billion in the house that Altman built. The first phase of the deal, which Nvidia and OpenAI jointly announced on Monday, will see OpenAI deploying Team Green's Vera Rubin platform in its datacenters starting in H2 2026. To help defray the cost and to own a bigger piece of the AI pie, Nvidia will start buying into OpenAI "progressively as each gigawatt is deployed," the press release explains. As part of the deal, Nvidia will become the "preferred strategic compute and networking partner" of OpenAI, whatever that means. The deal is non-exclusive on OpenAI's part, we're told, meaning it can use competing chips from AMD or others, should it care to do so. The announcement has enough wiggle room to drive an AI-powered self-driving semi through - it's a letter of intent for a strategic partnership, which is a non-binding kinda-sorta contract, and the deal calls only for Nvidia to invest "up to" $100 billion for as long as OpenAI keeps buying its chips. If the whole thing sounds a lot like the company whose stock price has benefited most from the AI bubble keeping said bubble inflated by helping to fund its own most important customer, well, you might be savvier than the average stock market participant, who collectively bid Nvidia's stock up about 4% on the news. (Then again, the pop may be years away, in which case investors sitting on the sidelines are the real fools. Nobody knows!) On a more mundane level, Nvidia has chips that OpenAI needs, and the AI leader gets some of the funding it will need to carry on burning through cash until it builds a market position that comes close to profitability, which Sam Altman recently admitted wouldn't happen for "quite a while," even as the firm is on track to hit about $20 billion in annualized revenue this year. If OpenAI does in fact turn the corner to become a self-sustaining business, then Nvidia owns part of what could become the most valuable software company in history. Oracle is also part of this back-scratching arrangement. Just a couple of weeks ago, OpenAI reportedly committed $300 billion over five years to pay Oracle for about five gigawatts of compute capacity that likely also comes from Nvidia GPUs. It's not clear where OpenAI will find that money, but that didn't stop Oracle from boasting about a huge projected increase in demand for AI infrastructure, which inspired stockholders to send shares up 40% at one point, briefly making Oracle founder Larry Ellison the richest human in the world. The two companies are already closely intermingled - some might say co-dependent. Last week, Nvidia said it was supplying the hardware for the British arm of OpenAI's Stargate datacenters, albeit using Grace Blackwell Ultra GPUs rather than Rubin silicon. And in July, it announced a similar deal in Norway for another Stargate bit barn. Meanwhile, it remains unclear how big and lucrative the demand for AI tools actually is, with reports increasingly suggesting that top-down company-mandated AI pilots are not offering much of a return on investment, even as individual workers are finding ways to use AI to make their own jobs easier or more productive. No matter! The train keeps rolling... ®
[18]
NVIDIA, OpenAI Announce 'the Biggest AI Infrastructure Deployment in History'
NVIDIA CEO Jensen Huang, OpenAI CEO Sam Altman and OpenAI President Greg Brockman described a new strategic partnership to fuel OpenAI's growth -- and enable AI at scale for virtually every industry and user. OpenAI and NVIDIA just announced a landmark AI infrastructure partnership -- an initiative that will scale OpenAI's compute with multi-gigawatt data centers powered by millions of NVIDIA GPUs. To discuss what this means for the next generation of AI development and deployment, the two companies' CEOs, and the president of OpenAI, spoke this morning with CNBC's Jon Fortt. "This is the biggest AI infrastructure project in history," said NVIDIA founder and CEO Jensen Huang in the interview. "This partnership is about building an AI infrastructure that enables AI to go from the labs into the world." Through the partnership, OpenAI will deploy at least 10 gigawatts of NVIDIA systems for OpenAI's next-generation AI infrastructure, including the NVIDIA Vera Rubin platform. NVIDIA also intends to invest up to $100 billion in OpenAI progressively as each gigawatt is deployed. "There's no partner but NVIDIA that can do this at this kind of scale, at this kind of speed," said Sam Altman, CEO of OpenAI. The million-GPU AI factories built through this agreement will help OpenAI meet the training and inference demands of its next frontier of AI models. "Building this infrastructure is critical to everything we want to do," Altman said. "This is the fuel that we need to drive improvement, drive better models, drive revenue, drive everything." Since the launch of OpenAI's ChatGPT -- which in 2022 became the fastest application in history to reach 100 million users -- the company has grown its user base to more than 700 million weekly active users and delivered increasingly advanced capabilities, including support for agentic AI, AI reasoning, multimodal data and longer context windows. To support its next phase of growth, the company's AI infrastructure must scale up to meet not only training but inference demands of the most advanced models for agentic and reasoning AI users worldwide. "The cost per unit of intelligence will keep falling and falling and falling, and we think that's great," said Altman. "But on the other side, the frontier of AI, maximum intellectual capability, is going up and up. And that enables more and more use -- and a lot of it." Without enough computational resources, Altman explained, people would have to choose between impactful use cases, for example either researching a cancer cure or offering free education. "No one wants to make that choice," he said. "And so increasingly, as we see this, the answer is just much more capacity so that we can serve the massive need and opportunity." The first gigawatt of NVIDIA systems built with NVIDIA Vera Rubin GPUs will generate their first tokens in the second half of 2026. The partnership expands on a long-standing collaboration between NVIDIA and OpenAI, which began with Huang hand-delivering the first NVIDIA DGX system to the company in 2016. "This is a billion times more computational power than that initial server," said Greg Brockman, president of OpenAI. "We're able to actually create new breakthroughs, new models...to empower every individual and business because we'll be able to reach the next level of scale." Huang emphasized that though this is the start of a massive buildout of AI infrastructure around the world, it's just the beginning. 
"We're literally going to connect intelligence to every application, to every use case, to every device -- and we're just at the beginning," Huang said. "This is the first 10 gigawatts, I assure you of that."
[19]
Nvidia's $100bn bet on 'gigantic AI factories' to power ChatGPT
Even for the biggest ever public company and the most valuable start-up in history, this week's artificial intelligence data centres deal between Nvidia and OpenAI was a blockbuster. Nvidia, valued at $4.3tn, pledged to make the tech industry's largest private investment into OpenAI, spending up to $100bn to fund new computing power. As the last remaining founder-chief executive of a major tech company from before the dotcom era, Nvidia's Jensen Huang is leveraging his commanding position in Silicon Valley like never before to ensure the AI boom endures -- and his chipmaker remains at its centre. "[$100bn] is a huge number but we are talking about a company with a market value of nearly $4.5tn," said Michael Cusumano, professor of technological innovation and entrepreneurship at MIT's Sloan School of Management. "That's also unprecedented." It comes after a whirlwind of huge deals from Nvidia, including investing $5bn in its rival Intel last week. Despite all the superlatives, Nvidia and OpenAI's announcement left big uncertainties around the proposed $100bn investment. It's unclear how quickly such huge facilities could be built and where the companies can source enough energy to run them. OpenAI plans to lease chips from Nvidia as part of the deal, according to people with knowledge of the matter, but details of the arrangement have not been announced. Nvidia's decision to pump money into OpenAI to fund its need for the chipmaker's hardware has raised concerns over the agreement's circular structure. Still, analysts concede Nvidia's investment can be comfortably funded from the chipmaker's rapidly growing cash flows -- and if fully consummated the deal could drive hundreds of billions of dollars in revenue for the company. It will also help fortify Nvidia's position as an indispensable player in the infrastructure underpinning AI models such as OpenAI's ChatGPT. Nvidia's share price has surged roughly 1,000 per cent since the chatbot launched in late 2022. However, OpenAI has recently moved to diversify its semiconductor supply chain, striking a deal with Broadcom to produce custom chips. One senior Big Tech executive said the deal highlights Nvidia's "reliance" on OpenAI, and Huang's desire to "head off the threat of his biggest customer building its own chip with Broadcom". Nvidia pushed back against any such suggestion, saying its AI infrastructure provided "an unparalleled combination of performance, versatility and value, and is available to every AI lab, cloud and enterprise". The relationship between Nvidia and OpenAI dates back to 2016, when Huang delivered a device he has dubbed "the first AI supercomputer the world ever made" to the AI lab when it was barely a year old. Nine years later, Huang negotiated this week's deal directly with Sam Altman, OpenAI's co-founder and chief executive, said a person familiar with the matter. The two founders worked largely without formal advice from the bankers who would normally act as intermediaries in such deals, putting the finishing touches to the agreement last week in the UK during President Donald Trump's state visit. The data centres -- Huang has called them "AI factories" -- allow OpenAI to train its AI systems and produce answers, charging customers for the output.
Huang has said that for every 1 gigawatt of AI infrastructure deployed, as much as $50bn is spent on the computing hardware, including Nvidia's specialised processors and its own networking technology, as well as the server racks that are produced by the likes of Foxconn, HP, Dell and Super Micro. "These are gigantic factory investments," Huang said at an event in Taiwan in May. Nvidia's OpenAI deal calls for "at least" 10GW of computing power to be built, over an unspecified period. The International Energy Agency estimates 10GW of AI data centres would consume as much energy in a year as 10mn typical US households. OpenAI said the deal is separate from the extravagant plans for Stargate, its global infrastructure project with Japan's SoftBank and US tech group Oracle, which includes a recent $300bn contract with Oracle. Morgan Stanley has estimated deploying 10GW of AI computing power could cost as much as $600bn, of which $350bn "potentially" goes to Nvidia. Morgan Stanley's analysts wrote in a note to clients: "That's a very large number so we are not viewing this level of investment as a certainty but part of the framing of the longer-term bull case [for Nvidia stock]." Still, the investment will only fuel the "arms race" to develop advanced AI, the analysts added. "The scale and scope of OpenAI investment is starting to dwarf all peers, but the desire to build intelligence compute remains intense." OpenAI on Tuesday said it has struck agreements to develop five new US data centres, pushing the cost of Stargate to about $400bn. "Our vision is simple: we want to create a factory that can produce a gigawatt of new AI infrastructure every week," Altman said in a blog post on Tuesday. OpenAI is competing with Google, Meta, Elon Musk's xAI and Anthropic in the US, as well as with Chinese rivals including DeepSeek and Alibaba. Their infrastructure arms race continues despite persistent warning signs that the industry's vast capital outlay is far outpacing the revenue that AI is delivering. A report by consultancy Bain, released just hours after the Nvidia-OpenAI deal was announced, estimates AI companies need to spend $500bn on capital investment each year to meet anticipated demand by 2030. Funding that huge outlay sustainably would require $2tn in annual revenues, Bain projected, but the industry is on pace to miss that target by some $800bn. With such uncertainty over AI's returns, OpenAI's future has been clouded by questions over which groups it would get to fund its vast infrastructure projects. Nvidia's $100bn pledge goes some way to answering those questions, providing capital that it will receive in increments as its data centre construction progresses and making it cheaper to finance the hundreds of billions more it needs to execute on its plans. Altman on Tuesday said he would "talk about how we are financing" OpenAI's infrastructure ambitions "later this year". The deal marks a "new financing model . . . where we can pay over time, instead of buying them up front", he added. "The chips and the systems are a humungous [percentage] of the cost and its hard to pay that upfront." With more than 700mn people using ChatGPT every week, OpenAI executives have said privately for months that they are already starved of the compute capacity they need to deliver such a complex product on a massive scale. Altman is betting that "innovation is increasingly gated by access to infrastructure rather than ideas", said Dimitri Zabelin, AI analyst at PitchBook, which tracks venture capital deals. 
Huang's move to ensure Nvidia is OpenAI's "preferred strategic compute and networking partner", as Monday's announcement put it, will make it harder for AI developers to move away to rival processors. Nvidia's Cuda software platform, which has become the default way to write the AI software that runs on its chips, adds to the company's grip on the industry. The deal comes at a time when many of the chipmaker's biggest customers -- including Google, Meta, Amazon and Microsoft -- are racing to develop their own custom processors as an alternative to Nvidia. Cusumano likens Nvidia's use of Cuda to extend its dominance to the way Microsoft and Apple gave away the tools needed to build apps for their Windows and iOS operating systems, allowing their platforms to dominate the personal computer and smartphone eras. "The difference with Nvidia is it's like combining Microsoft and Intel at their peak into one company," he said. "It's like a drug -- software developers will use Nvidia's tools and they have to use [Nvidia's] hardware." Huang has continued a strategy that can be traced back directly to that first supercomputer delivery to a fledgling AI lab in 2016. By keeping AI developers hooked on its product, Huang's investment into OpenAI -- as well as dozens of other start-ups involved in AI applications, cloud computing, robotics and healthcare -- "will pay off multifold in the future for Nvidia", Cusumano added.
[20]
More questions than answers in Nvidia's $100 billion OpenAI deal
SAN FRANCISCO, Sept 23 (Reuters) - Nvidia's (NVDA.O) move to invest up to $100 billion into OpenAI at the same time it plans to supply millions of its market-leading artificial intelligence chips to the ChatGPT creator has little precedent in the tech industry. Under the deal, Nvidia will be taking a financial stake in one of its largest customers, but without receiving any voting power in return, according to a person close to OpenAI. The ChatGPT maker will receive some - but not nearly all - of the capital it needs for its ambitious plans to build the sprawling supercomputers required to develop new generations of AI. Nvidia's initial $10 billion investment would go toward a gigawatt of capacity using its next-generation Vera Rubin chips, with a build-out starting in the second half of 2026. The deal raises many questions. Here are five of the biggest ones: WHERE DOES THE REST OF THE MONEY COME FROM? In an earnings call in August, Nvidia CEO Jensen Huang said that AI data centers cost about $50 billion per gigawatt of capacity to build out, with about $35 billion of that money going toward Nvidia's chips and gear. Nvidia has committed to investing in OpenAI to help it build 10 gigawatts of data center capacity, or about $10 billion per gigawatt. That leaves about $40 billion in additional capital required for each gigawatt of capacity OpenAI plans to build. OpenAI has not signaled whether it agrees with Huang's cost estimates or, if it does, where it would procure the additional funds. OpenAI did not return a request for comment about its funding plans. Nvidia declined to comment beyond what it has said publicly. WHAT DOES IT MEAN FOR OPENAI'S EFFORTS TO BECOME A FOR-PROFIT? OpenAI is a non-profit corporation, a structure that dates to its days as an AI research group. It has been looking to change to a more conventional structure that would allow it to more easily raise money and hold a public offering. OpenAI has held extensive discussions with Microsoft (MSFT.O), a major shareholder that funded OpenAI's early computing needs, to change its structure. Earlier this month, the two firms said they had reached a tentative deal on OpenAI converting to a for-profit public benefit corporation that would be overseen by OpenAI's existing non-profit, though that move still needs approval from state officials in Delaware and California. On Monday, a person familiar with the matter told Reuters that Nvidia would be making a cash investment into OpenAI similar to other OpenAI investors. Moreover, Nvidia's initial $10 billion investment will not begin until OpenAI and Nvidia reach a definitive agreement in the coming months. It was not immediately clear whether Nvidia planned to invest in OpenAI's non-profit entity or whether its plans depend on OpenAI's conversion to a public benefit corporation overseen by a non-profit. WHAT DOES IT MEAN FOR OPENAI'S VALUATION? OpenAI is currently valued at $500 billion, and a person familiar with the matter told Reuters that Nvidia's initial $10 billion investment for one gigawatt of capacity would be at that valuation. But neither Nvidia nor OpenAI gave a timeframe for the entire 10 gigawatts of capacity coming online or for the $100 billion of investment to take place. Also unanswered is whether subsequent Nvidia investments in OpenAI would take place at OpenAI's current valuation, or at the valuation of the company at the time Nvidia makes each investment. WHAT DOES IT MEAN FOR COMPETITION? 
The deal between Nvidia and OpenAI could see Nvidia earmarking a significant number of its chips, which remain in high demand several years into the AI boom and access to which can determine success or failure in the field, to a single customer in which it is also a shareholder. An important question is whether OpenAI's rivals such as Anthropic, or even Microsoft, which competes with OpenAI to sell AI technology to businesses, will retain access to Nvidia's chips. The deal also raises questions about whether AMD (AMD.O), which is aiming to compete with Nvidia in selling chips to OpenAI and others, will have a viable chance of selling chips to AI companies. WHAT DOES IT MEAN FOR ORACLE? Oracle (ORCL.N) said earlier this month that it has signed hundreds of billions of dollars in contracts to provide cloud computing services to OpenAI and a handful of other large customers, which sent its stock soaring and made co-founder Larry Ellison one of the world's richest people. But one of the key questions lingering after that forecast - and a question that debt-rating firm Moody's raised - is whether OpenAI has the cash to pay for the contracts. On Monday, shortly before Nvidia's announcement, Oracle re-affirmed its forecast as it named two new CEOs. It is possible that Nvidia's investment plans could put Oracle's revenue forecast on a firmer footing because a key customer, OpenAI, has fresh capital commitments. Reporting by Stephen Nellis in San Francisco; Editing by Muralikumar Anantharaman.
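A quick worked version of the funding-gap arithmetic in the first question above may help: it simply combines the $50 billion-per-gigawatt and $35 billion-of-Nvidia-gear figures from Huang's August earnings call with the roughly $10 billion per gigawatt Nvidia has committed. It is a sketch of the article's own numbers, not a statement of actual deal terms.

```python
# Funding-gap arithmetic using the round figures quoted in the Reuters piece above.
cost_per_gw_bn = 50           # Huang: ~$50bn to build out one gigawatt of AI data center
nvidia_gear_per_gw_bn = 35    # ...of which ~$35bn goes to Nvidia chips and gear
nvidia_invest_per_gw_bn = 10  # Nvidia's commitment works out to ~$10bn per gigawatt
gigawatts = 10

gap_per_gw_bn = cost_per_gw_bn - nvidia_invest_per_gw_bn
print(f"Extra capital OpenAI must source: ~${gap_per_gw_bn}bn per GW, "
      f"~${gap_per_gw_bn * gigawatts}bn across {gigawatts} GW")
```

That is the roughly $40 billion per gigawatt, or about $400 billion across the full build-out, that the article notes OpenAI has not yet explained how it will raise.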
[21]
NVIDIA investing $100B in OpenAI data centers for next-gen AI
NVIDIA CEO Jensen Huang called the collaboration the "next leap forward" for both companies. "NVIDIA and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," Huang said. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence." Huang told CNBC that the 10 gigawatts translates to between 4 million and 5 million GPUs. That equals the number of chips NVIDIA expects to ship this year and represents twice as much as last year. "This is a giant project," he said. NVIDIA shares rose 3% on Monday after the announcement. Analysts described the investment as both "monumental in size" and a clear sign of how closely the companies are tied. OpenAI CEO Sam Altman underscored how compute remains central to the company's mission. "Everything starts with compute," Altman said. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale."
[22]
Nvidia's investment in OpenAI will be in cash, and most will be used to lease Nvidia chips
Nvidia's massive investment in OpenAI, announced earlier this week, will put billions of dollars into the coffers of the artificial intelligence startup to use as it sees fit. But most of the money will go towards use of Nvidia's cutting-edge chips. The agreement between the two companies was big on numbers but thin on specifics. They said the investment would reach up to $100 billion, paid out as AI supercomputing facilities open in the coming years, with the first one coming online in the second half of 2026. The timing of the buildouts and the cost of each data center remains up in the air. However, what's become clear is that OpenAI plans to pay for Nvidia's graphics processing units (GPUs) through lease arrangements, rather than upfront purchases, according to people familiar with the matter who asked not to be named because the details are private. Nvidia CEO Jensen Huang, who described this week's deal as "monumental in size," has estimated that an AI data center with a gigawatt of capacity costs roughly $50 billion, with $35 billion of that used to pay for Nvidia's GPUs. By leasing the processors, OpenAI can spread its costs out over the useful life of the GPUs, which could be up to five years, a person said, leaving Nvidia to bear more of the risk.
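To see why the lease structure matters for cash flow, here is a minimal sketch using the article's round numbers: roughly $35 billion of GPUs per gigawatt and a useful life of up to five years. The even annual payment is a simplifying assumption of ours; a real lease would add a financing charge, and the actual terms have not been disclosed.

```python
# Minimal lease-versus-purchase cash-flow sketch, per gigawatt of capacity.
# Figures are the article's round estimates; the flat annual payment is an assumption.

gpu_cost_per_gw_bn = 35     # ~$35bn of GPUs in a ~$50bn, 1 GW data center
useful_life_years = 5       # article: useful life of the GPUs could be up to five years

upfront_purchase_bn = gpu_cost_per_gw_bn
straight_line_lease_bn = gpu_cost_per_gw_bn / useful_life_years

print(f"Buying outright: ~${upfront_purchase_bn}bn in year one per GW")
print(f"Leasing (straight-line, no financing charge): ~${straight_line_lease_bn:.0f}bn per year")
```

Spreading the payments out roughly fivefold reduces OpenAI's first-year outlay per gigawatt, which is precisely why, as the article notes, more of the residual risk sits with Nvidia.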
[23]
Nvidia invests $100 billion in OpenAI, sparking antitrust concerns
Following its recent surprise $5 billion Intel deal, Nvidia is spending big again, this time committing up to $100 billion to OpenAI alongside supplying millions of its chips. The move fits a broader pattern in which Nvidia channels money into businesses that rely on its own hardware, from $6.3 billion in CoreWeave to $700 million in nScale, effectively reinforcing demand for its products while bypassing hyperscalers like Google and Microsoft, which are racing to reduce their dependence on Nvidia's hardware. This latest investment into the world's best-known AI firm immediately lifted Nvidia's market value by more than $220 billion. The deal involves a circular structure and will see Nvidia buy non-voting shares in OpenAI, with OpenAI then spending most of the proceeds on Nvidia systems. Citing people familiar with the matter, Reuters says the partnership will begin with a $10 billion investment and scale as OpenAI deploys more computing power. "This is the biggest AI infrastructure project in history," Nvidia founder and CEO Jensen Huang said in an interview with CNBC's Jon Fortt. "This partnership is about building an AI infrastructure that enables AI to go from the labs into the world." He said the companies will build data centers capable of running next-generation AI models, powered by Nvidia's new Vera Rubin platform. The first data centers are due online in 2026 and require 10 gigawatts of power, roughly equal to the needs of 8 million US households. OpenAI chief executive Sam Altman said the capacity was essential for the company's ambitions. "Building this infrastructure is critical to everything we want to do," Altman said. "This is the fuel that we need to drive improvement, drive better models, drive revenue, drive everything." Analysts welcomed the long-term demand for Nvidia's products but warned about the structure of the deal. "On the one hand this helps OpenAI deliver on some very aspirational goals for compute infrastructure," said Stacy Rasgon of Bernstein. "On the other hand the 'circular' concerns have been raised in the past, and this will fuel them further." Kim Forrest, chief investment officer at Bokeh Capital, also sounded a note of caution. "This sounds like Nvidia is investing in its largest customer. These arrangements can be beneficial for both parties. But there can be dangers as well. Being totally linked with each other can cause for short-sightedness and can make an entry point for other chip competitors to come into other AI companies and woo them," she said. MarketScreener quotes Rebecca Haw Allensworth, an antitrust professor at Vanderbilt Law School, who says there are concerns that Nvidia could favor OpenAI with better pricing or faster delivery times. "They're financially interested in each other's success," she said. "That creates an incentive for Nvidia to not sell chips to, or not sell chips on the same terms to, other competitors of OpenAI." An Nvidia spokesperson denied this would be the case, saying, "We will continue to make every customer a top priority, with or without any equity stake."
[24]
OpenAI and Nvidia's AI spending spree could be a risky bet
Why it matters: The U.S. is betting its economic fortunes on the belief that OpenAI's Sam Altman, Nvidia's Jensen Huang and other AI leaders are wizardly innovators dreaming up novel financing vehicles to drive a golden future -- rather than salesmen juggling billions and praying the music never stops. Friction point: The trillion-dollar question neither Silicon Valley nor Wall Street can answer is whether the AI building spree will end up looking more like Google's epochal long-term value creation or Enron's catastrophically faulty financial engineering. Driving the news: Nvidia announced Monday it would invest up to $100 billion in OpenAI in stages, with OpenAI using the money to "build and deploy at least 10 gigawatts of AI data centers with Nvidia systems." * OpenAI also announced Tuesday additional commitments for its Stargate alliance with Oracle and SoftBank to build five new U.S. data centers. This brings the sum it has lined up for the effort to roughly $400 billion of its $500 billion goal. * Google, Microsoft, Meta and Amazon have all projected a total of hundreds of billions more in capital expenditures on data centers for AI. Yes, but: Some market observers are asking questions about the closed-loop appearance of the OpenAI-Nvidia deal. * "Nvidia to pay openai so they can get paid by softbank so they can pay oracle to pay nvidia," CNBC's Steve Kovach joked on Bluesky. * The deal's structure "will clearly fuel 'circular' concerns," one Wall Street analyst wrote. Flashback: At the raging peak of the dotcom-era bubble in the late '90s and early 2000s, online ad giants like AOL Time Warner and high-flying telecoms like Qwest were accused of inflating revenue figures by essentially flooding customers with cash that the customers would then spend on ads or services. * Such parallels haunt today's AI boom and reinforce the skeptical view that this investment bonanza might be a shell game and can't last forever. The other side: The big bucks flowing into data centers are coming not from paper gains -- the skyrocketing IPOs, driven by day-trading speculators, that fueled the '90s boom and then evaporated -- but from tech giants' profit-fed cash hoards. * "These companies aren't mortgaging the future; they're spending current winnings," Semafor's Liz Hoffman notes. * And in the case of Nvidia's OpenAI deal, the chipmaker is getting a stake in the ChatGPT maker -- an investment opportunity many other companies are also making right now (or wish they could make). Big Tech has a long history of cyclical over-investment, most famously with the turn-of-the-millennium fiber buildout. * As this wheel turns, fortunes are made and lost in a flash, depending on investors' timing. But even if today's AI market goes south, data centers are durable goods -- hardware and buildings. * The companies that own them can still use them to bootstrap whatever technology wave comes next. Our thought bubble: The real unknown in this gigantic equation is AI consumption.
[25]
Nvidia to invest $100bn in OpenAI, bringing the two AI firms together
Deal will involve two transactions - OpenAI will pay Nvidia for chips, and the chipmaker will invest in the AI start-up. Nvidia, the chipmaking company, will invest up to $100bn in OpenAI and provide it with data center chips, the companies said on Monday, a tie-up between two of the highest-profile leaders in the global artificial intelligence race. The deal, which will see Nvidia start delivering chips as soon as late 2026, will involve two separate but intertwined transactions, according to a person close to OpenAI. The startup will pay Nvidia in cash for chips, and Nvidia will invest in OpenAI for non-controlling shares, the person said. The first $10bn of Nvidia's investment in OpenAI, which was most recently valued at $500bn, will begin when the two companies reach a definitive agreement for OpenAI to purchase Nvidia chips. Nvidia previously funded OpenAI with a $6.6bn investment. The ChatGPT maker has pledged 49% of its profits to Microsoft after a $13bn investment made in 2023. OpenAI is in the midst of a long and much-litigated process to convert to a for-profit entity. The companies unveiled a letter of intent for a landmark strategic partnership to deploy at least 10GW of Nvidia chips for OpenAI's AI infrastructure. "Everything starts with compute," Sam Altman, CEO of OpenAI, said in a release. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." Altman has said in the past that his company is constrained by how much computing power it can access, most often measured in the number of graphics processing units, or GPUs, that allow artificial intelligence products to answer users' queries. They aim to finalize partnership details in the coming weeks, with the first deployment phase targeted to come online in the second half of 2026. Nvidia's investment comes just days after it committed $5bn to struggling chipmaker Intel. Nvidia, the most valuable company in the world at a $4tn market capitalization, is seen as a leader in artificial intelligence by dint of its cutting-edge chips.
[26]
Nvidia to invest up to $100bn in OpenAI
Nvidia on Monday said it planned to invest up to $100bn in OpenAI as part of a "landmark strategic partnership" to support a massive build-out of data centres for artificial intelligence. OpenAI plans to buy millions of Nvidia's AI processors as part of the deal, the two companies said, in a move that could ultimately generate hundreds of billions of dollars in revenue for the US chipmaker. The groups did not specify a long-term timeframe for the deployment. Nvidia plans to buy equity in OpenAI progressively over time as its systems are deployed. The investment will be made in cash, said one person familiar with the deal. Nvidia's shares rose nearly 3 per cent on the announcement. The infrastructure deal, the first phase of which is planned to come online in the second half of 2026, will deploy Nvidia's next-generation "Vera Rubin" chip system, the successor to its current-generation Blackwell technology. "Everything starts with compute," said Sam Altman, chief executive of OpenAI. "Compute infrastructure will be the basis for the economy of the future, and we will utilise what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." Jensen Huang, Nvidia chief executive, called the investment and infrastructure partnership "the next leap forward" in AI. The announcement follows Huang and Altman's trip last week to the UK, where they accompanied US President Donald Trump on his state visit to announce major AI infrastructure plans.
[27]
VIEW Analysts react to Nvidia's $100 billion investment in OpenAI
Sept 22 (Reuters) - Chipmaker Nvidia (NVDA.O) is set to invest up to $100 billion in ChatGPT-parent OpenAI, signing a letter of intent for a strategic partnership to deploy at least 10 gigawatts of compute, the companies said on Monday. Nvidia has used its financial clout to keep its hardware central to the buildout of artificial intelligence systems. Keeping OpenAI, which is also exploring its own chip designs, as a key customer could help the company reinforce its dominance as the industry considers rival suppliers. Here are some analyst reactions to the partnership: MATT BRITZMAN, SENIOR EQUITY ANALYST, HARGREAVES LANSDOWN "For Nvidia, the prize is huge -- every gigawatt of AI data centre capacity is worth about $50 billion in revenue, meaning this project could be worth as much as $500 billion. "By locking in OpenAI as a strategic partner and co-optimizing hardware and software roadmaps, Nvidia is ensuring its GPUs remain the backbone of next-gen AI infrastructure. "The market is clearly big enough for multiple players, but this deal underscores that, when it comes to scale and ecosystem depth, Nvidia is still setting the pace -- and raising the stakes for everyone else." JACOB BOURNE, TECHNOLOGY ANALYST, EMARKETER "Demand for Nvidia GPUs is effectively baked into the development of frontier AI models, and deals like this should also ease concerns about lost sales in China. "It also throws cold water on the idea that rival chipmakers or in-house silicon from the Big Tech platforms are anywhere close to disrupting Nvidia's lead. "For OpenAI, it signals greater independence as it continues diversifying away from its Microsoft partnership and races to develop its next-generation models." ANSHEL SAG, PRINCIPAL ANALYST, MOOR INSIGHTS & STRATEGY "I think this strengthens the partnership between the two companies that has existed since the beginning of OpenAI's existence. This also validates Nvidia's long-term growth numbers with so much volume and compute capacity, also enabling OpenAI to scale to even bigger customers." BEN BAJARIN, CEO OF TECHNOLOGY CONSULTING FIRM CREATIVE STRATEGIES "Really the point Nvidia was making was that it's just enabling OpenAI to meet surging demand and, at this point, we know there's surging demand for Nvidia GPUs, because that's primarily what OpenAI runs on." Reporting by Juby Babu in Mexico City, Kritika Lamba and Arsheeya Bajwa in Bengaluru; Editing by Alan Barona.
[28]
How much of the AI boom is underpinned by Nvidia's own balance sheet? Investors increasingly are asking. | Fortune
Nvidia's announcement earlier this week that it is investing $100 billion into OpenAI to help fund its massive data center build out has added to a growing sense of unease among investors that there is a dangerous financial bubble around AI, and that the revenues and earnings math underpinning the valuations of both public and private companies in the sector just doesn't add up. While Nvidia's latest announcement is by far the largest example, the AI chipmaker has engaged in a series of "circular" deals in which it invests in, or lends money to, its own customers. Vendor financing exists to some degree in many industries, but in this case, circular transactions may give investors an inflated perception of the true demand for AI. In past technology bubbles, revenue "roundtripping" and tech companies financing their own customers have exacerbated the damage when those bubbles eventually popped. While the share of Nvidia's revenues that are currently being driven by such financing appears to be relatively small, the company's dominance as the world's most valuable publicly-traded company means that its stock is "priced for perfection" and that even minor missteps could have outsized impact on its valuation -- and on financial markets and perhaps even the wider economy. The extent to which the entire AI boom is backstopped by Nvidia's cash isn't easy to answer precisely, which is also one of the unsettling things about it. The company has struck a number of investment and financing deals, many of which are too small individually for the company to consider "material" and report in its financial filings, even though collectively they may be significant. In addition, there are so many interlocking rings of circularity -- where Nvidia has invested in a company, such as OpenAI, that in turn purchases services from a cloud service provider that Nvidia has also invested in, which then also buys or leases GPUs from Nvidia -- that disentangling what money is flowing where is far from easy. Two of the most prominent examples of Nvidia's web of circuitous investments are OpenAI and Coreweave. In addition to the latest investment in OpenAI, Nvidia had previously participated in a $6.6 billion investment round in the fast-growing AI company in October 2024. Nvidia also has invested in CoreWeave, which supplies data center capacity to OpenAI and is also an Nvidia customer. As of the end of June, Nvidia owned about 7% of Coreweave, a stake worth about $3 billion currently. The benefits that companies get from a Nvidia investment extend beyond the cash itself. Nvidia's equity stakes in companies such as OpenAI and Coreweave enable these companies to access debt financing for data center projects at potentially significantly lower interest rates than they would be able to access without such backing. Jay Goldberg, an analyst with Seaport Global Securities, compares such deals to someone asking their parents to be a co-signer on their mortgage. It gives lenders some assurance that they may actually get their money back. Startups financing data centers have often had to borrow money at rates as high as 15%, compared to 6% to 9% that a large, established corporation such as Microsoft might have to pay. With Nvidia's backing, OpenAI and Coreweave have been able to borrow at rates closer to what Microsoft or Google might pay. Nvidia has also signed a $6.3 billion deal to purchase any cloud capacity that CoreWeave can't sell to others. 
The chipmaker had previously agreed to spend $1.3 billion over four years on cloud computing with CoreWeave. CoreWeave, meanwhile, has purchased at least 250,000 Nvidia GPUs so far -- the majority of which it says are H100 Hopper models, which cost about $30,000 each -- which means CoreWeave has spent about $7.5 billion buying these chips from Nvidia. So in essence, all of the money Nvidia has invested in CoreWeave has come back to it in the form of revenue. Nvidia has struck similar cloud computing deals with other so-called "neo-cloud" companies. According to a story in The Information, Nvidia agreed this summer to spend $1.3 billion over four years renting some 10,000 of its own AI chips from Lambda, which, like CoreWeave, runs data centers, as well as a separate $200 million deal to rent some 8,000 more over an unspecified time period. For those who believe there's an AI bubble, the Lambda deal is clear evidence of froth. Those Nvidia chips Lambda is renting back to Nvidia? It bought them with borrowed money collateralized by the value of the GPUs themselves. Besides its large investments in OpenAI and CoreWeave, the AI chipmaker also holds multimillion-dollar stakes in several other publicly traded companies that either purchase its GPUs or work on related chip technology. These include chip design firm Arm, high-performance computing company Applied Digital, cloud services company Nebius Group, and biotech company Recursion Pharmaceuticals. (Nvidia also recently purchased a 4% stake in Intel for $5 billion. Like Arm, Intel makes chips that in some cases are alternatives to Nvidia's GPUs, but which for the most part are complementary to them.) Earlier this month, Nvidia also pledged to invest £2 billion ($2.7 billion) in U.K. AI startups, including at least £500 million in Nscale, a U.K.-based data center operator that will, presumably, be using some of that money to purchase Nvidia GPUs to provision the data centers it is building. Nvidia also said it would invest in a number of British startups, both directly and through local venture capital firms, and some of that money, too, will likely come back to Nvidia in the form of computing purchases, either directly, or through cloud service providers, who in turn will need to buy Nvidia GPUs. In 2024, Nvidia invested about $1 billion in AI startups globally either directly or through its corporate venture capital arm NVentures, according to data from Dealroom and The Financial Times. This amount was up significantly from what Nvidia invested in 2022, the year the generative AI boom kicked off with OpenAI's debut of ChatGPT. How much of this money winds up coming right back to Nvidia in the form of sales is, again, difficult to determine. Wall Street research firm NewStreet Research has estimated that for every $10 billion Nvidia invests in OpenAI, it will see $35 billion worth of GPU purchases or GPU lease payments, an amount equal to about 27% of its annual revenues last fiscal year. That kind of return would certainly make this sort of customer financing worthwhile. But it does raise concerns among analysts about a bubble in AI valuations. These kinds of circular deals have been a hallmark of previous technology bubbles and have often come back to haunt investors. In this case, the lease arrangements that Nvidia is entering into with OpenAI as part of its latest investment could prove problematic. 
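Two of the figures above are easy to reproduce with simple arithmetic, sketched below for reference. The inputs are the article's own round estimates (250,000 GPUs at about $30,000 each, and NewStreet Research's $35 billion of purchases per $10 billion invested, said to equal about 27% of last fiscal year's revenue); nothing here is drawn from Nvidia's filings, and the variable names are ours.

```python
# Reproducing two figures cited above from the article's round numbers.

coreweave_gpus = 250_000
h100_price_usd = 30_000                       # article's ~$30,000-per-H100 estimate
coreweave_gpu_spend_bn = coreweave_gpus * h100_price_usd / 1e9
print(f"CoreWeave's implied GPU spend: ~${coreweave_gpu_spend_bn:.1f}bn")      # ~$7.5bn

purchases_per_10bn_invested_bn = 35           # NewStreet Research estimate
share_of_last_fy_revenue = 0.27               # stated as ~27% of annual revenue
implied_revenue_base_bn = purchases_per_10bn_invested_bn / share_of_last_fy_revenue
print(f"Implied annual revenue base: ~${implied_revenue_base_bn:.0f}bn")       # ~$130bn
```

Both outputs line up with the article's claims: roughly $7.5 billion of CoreWeave chip spending, and an implied revenue base of about $130 billion behind the 27% figure.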
By leasing GPUs to OpenAI, rather than requiring it to buy the chips outright, Nvidia is sparing OpenAI from having to take an accounting charge for the high depreciation rates on the chips, which will ultimately help OpenAI's bottom line. But it means that instead Nvidia will have to bear these depreciation costs. What's more, Nvidia will also take on the risk of being stuck with an inventory of GPUs no one wants if demand for AI workloads doesn't match Nvidia CEO Jensen Huang's rosy predictions. To some market watchers, Nvidia's latest deals feel all too similar to the excesses of past technology booms. During the dotcom bubble at the turn of the 21st century, telecom equipment makers such as Nortel, Lucent, and Cisco lent money to startups and telecom companies to purchase their equipment. Just before the bubble burst in 2001, the amount of financing Cisco and Nortel had extended to their customers exceeded 10% of annual revenues, and the amount of financing the top five telecom equipment makers had provided to customers exceeded 123% of their combined earnings. Ultimately, the amount of fiber-optic cabling and switching equipment installed far exceeded demand, and when the bubble burst and many of those customers went bust, the telecom equipment makers were left holding the bad debt on their balance sheets. This contributed to a greater loss of value when the bubble burst than would have otherwise been the case, with networking equipment businesses losing more than 90% of their value over the ensuing decade. Worse yet were companies such as fiber-optic giant Global Crossing that engaged in direct "revenue roundtripping." These companies cut deals -- often at the end of a quarter in order to hit topline forecasts -- in which they paid money to another company for services, and then that company agreed to purchase equipment of exactly equal value. When the bubble burst, Global Crossing went bankrupt, and its executives ultimately paid large legal settlements related to revenue roundtripping. It is memories of these kinds of transactions that have caused analysts to at least raise an eyebrow at some of Nvidia's circular investments. Goldberg, the Seaport Global analyst, said the deals had a whiff of circular financing and were emblematic of "bubble-like behavior." "The action will clearly fuel 'circular' concerns," Stacy Rasgon, an analyst with Bernstein Research, wrote in an investor note following Nvidia's announcement of its blockbuster investment in OpenAI. It's a long way from a concern to a crisis, of course, but as AI company valuations get higher, that distance is starting to close.
[29]
The $100bn deal that signals the AI bubble could be about to burst
At the height of the dotcom bubble in 2000, AOL was one of the world's hottest companies. The internet pioneer had brought the web to millions of American households, and its advertising revenue was doubling year over year. In a bullish sign of the web's future, AOL announced a $360bn (£266bn) merger with media firm Time Warner - the biggest deal in American history. It would take years after the bubble burst to discover the truth of AOL's meteoric rise. In 2005, American regulators charged the company with propping up its revenues by using fraudulent "round-trip" transactions in which it secretly paid its customers to buy AOL advertising. "The company effectively funded its own online advertising revenue," prosecutors said. AOL paid a $300m penalty to settle the claims. These circular deals were a common feature of the dotcom bubble. Telecoms and software companies paid each other to finance new networks and boost sales, padding revenue and maintaining the illusion of growth. But when the bubble popped, the house of cards collapsed. A quarter of a century later, sceptics of the artificial intelligence (AI) movement claim to be observing similar patterns and suggest a new bubble could be inflating. Spending spree: On Monday, Nvidia, the semiconductor giant which has become the world's most valuable company on the back of the AI boom, said it would invest $100bn in OpenAI, the San Francisco start-up behind ChatGPT. Much of the cash could ultimately flow back to Nvidia. The deal also includes plans for OpenAI to spend billions on data centres likely to be filled with Nvidia's chips. The investment will come in stages, with Nvidia investing more cash in OpenAI as OpenAI spends more.
[30]
Chip stocks climb after Nvidia-OpenAI deal announcement
Global semiconductor stocks surged on Tuesday after Nvidia announced plans to invest as much as $100 billion in OpenAI - despite warnings of an industry shortfall in the revenue needed to fund future computing power. The Nvidia-OpenAI deal will see the chipmaker put vast amounts of cash into helping build data centers, which it has called "AI factories," through a staged partnership that guarantees OpenAI first dibs on the world's most coveted GPUs. While more details of the deal will be finalized in the coming weeks, global chip stocks were sent higher on the news. In Taiwan, shares of Taiwan Semiconductor Manufacturing Co., which makes chips for Nvidia, rose 3.5%, while South Korea's SK Hynix also saw shares rise more than 2.5%. SK Hynix makes memory chips for Nvidia's systems. SK Hynix's rival Samsung also jumped 1.4% amid expectations that the company will get approval to supply its own memory chips to Nvidia in the near future. The agreement combines a supplier deal with a financing package. Nvidia will fund construction as OpenAI builds data centers powered by millions of its chips. The first facility, using Nvidia's upcoming Vera Rubin platform, is expected to begin operating in the second half of 2026. By the time the network reaches 10 gigawatts, OpenAI will run a grid comparable in scale to national power utilities. Under the deal, Nvidia becomes OpenAI's "preferred strategic compute and networking partner." The companies will align their hardware and software schedules so that new Nvidia platforms launch alongside new OpenAI models. The stock rally came despite a warning from Bain & Co. about the financial pressure building across the AI sector. The consulting firm said companies developing AI are committing vast sums to new data centers but are not on course to generate the income to pay for them. In its annual Global Technology Report, Bain said AI firms will need roughly $2 trillion in yearly revenue by 2030 to cover the computing power needed to meet demand, but will miss that by $800 billion as services such as ChatGPT bring in less money than the infrastructure costs required to support them. OpenAI is losing billions of dollars a year as it focuses on expansion, though the company has said it expects to turn cash-flow positive by 2029. The Bain report said the mismatch could increase scrutiny of how AI companies are valued. "If the current scaling laws hold, AI will increasingly strain supply chains globally," said David Crawford, chairman of Bain's global technology practice. -- Shannon Carroll contributed to this article.
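The Bain shortfall can be restated in one line of arithmetic: if the industry needs roughly $2 trillion in annual revenue by 2030 and is projected to miss by about $800 billion, it is implicitly on pace for around $1.2 trillion. The snippet below simply restates those reported figures; it is not Bain's model.

```python
# Restating the Bain & Co. figures quoted above.
revenue_needed_by_2030_tn = 2.0    # ~$2tn in yearly revenue needed to fund compute demand
projected_shortfall_tn = 0.8       # ~$800bn shortfall projected by Bain

implied_on_pace_revenue_tn = revenue_needed_by_2030_tn - projected_shortfall_tn
print(f"Implied revenue the industry is on pace for: ~${implied_on_pace_revenue_tn:.1f}tn a year")
```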
[31]
Nvidia to invest $100 billion in OpenAI to help expand the ChatGPT maker's computing power
Chipmaker Nvidia will invest $100 billion in OpenAI as part of a partnership announced Monday that will add at least 10 gigawatts of Nvidia AI data centers to ramp up the computing power for the owner of the artificial intelligence chatbot ChatGPT. Per the letter of intent signed by the companies, the first gigawatt of Nvidia systems will be deployed in the second half of 2026. Nvidia and OpenAI said they would be finalizing the details of the arrangement in the coming weeks. "This partnership complements the deep work OpenAI and Nvidia are already doing with a broad network of collaborators, including Microsoft, Oracle, SoftBank and Stargate partners, focused on building the world's most advanced AI infrastructure," the companies said in a release. Those companies pledged to invest at least $100 billion in building data centers for OpenAI in January. The Nvidia-OpenAI partnership also comes about 10 days after OpenAI said it had reached a new tentative agreement that will give Microsoft a $100 billion equity stake in its for-profit corporation. OpenAI is technically controlled by its nonprofit. Speaking on CNBC, OpenAI CEO Sam Altman said the new data centers that Nvidia will build are in addition to the previously announced projects. "Building this infrastructure is critical to everything we want to do," Altman said. "Without doing this, we cannot deliver the services people want. We can't keep making better models." He said both Nvidia and Microsoft are "passive investors," and OpenAI's nonprofit and board controls the company. OpenAI was founded as a nonprofit in 2015 and its nonprofit board has continued to control the for-profit subsidiary that now develops and sells its AI products. OpenAI's corporate structure and nonprofit mission are the subject of a lawsuit brought by Elon Musk, who helped found the nonprofit research lab and provided initial funding. Musk's suit seeks to stop OpenAI from taking control of the company away from its nonprofit and alleges it has betrayed its promise to develop AI for the benefit of humanity. Earlier this month, the attorneys general of California and Delaware warned OpenAI that they have "serious concerns" about the safety of ChatGPT, especially for children and teens. The two state officials, who have unique powers to regulate nonprofits such as OpenAI, noted "deeply troubling reports of dangerous interactions between" chatbots and their users, including the suicide of one young Californian after he had prolonged interactions with an OpenAI chatbot. The parents of the 16-year-old California boy, who died in April, sued OpenAI and its CEO, Sam Altman, last month. OpenAI says it has 700 million weekly active users. Also, just last week Nvidia announced that it was investing $5 billion in fellow chipmaker Intel, which has struggled to keep up with the frenzied demand for artificial intelligence.
[32]
Altman, Huang and the last-minute negotiations that sealed the $100 billion OpenAI-Nvidia deal
OpenAI's ascent to the forefront of generative AI has relied on Nvidia's high-powered graphics processing units (GPUs). Now the companies are more intimately linked than ever, as they plan to carve a path to jointly building the next wave of AI supercomputing facilities. "You should expect a lot from us in the coming months," Altman told CNBC's Jon Fortt in an interview at Nvidia's Silicon Valley headquarters on Monday. "There are three things that OpenAI has to do well: we have to do great AI research, we have to make these products people want to use, and we have to figure out how to do this unprecedented infrastructure challenge." Altman and Huang negotiated their pact largely through a mix of virtual discussions and one-on-one meetings in London, San Francisco, and Washington, D.C., with no bankers involved, according to people close to the talks who declined to be named because they weren't authorized to speak publicly on the matter. The arrangement calls for Nvidia to invest $10 billion at a time in OpenAI, the company behind ChatGPT. As the buildout unfolds, Nvidia will also supply the cutting-edge processors powering a host of new data centers. While OpenAI gets more intimate with Nvidia, it has to maneuver through a number of high-stakes relationships with other key partners. OpenAI only informed Microsoft, its principal shareholder and primary cloud provider, a day before the deal was signed, the people familiar with the matter said. Earlier this year, Microsoft lost its status as OpenAI's exclusive provider of computing capacity. The pact also comes less than two weeks after a disclosure from Oracle indicated that OpenAI agreed to spend $300 billion in computing power with the company over about five years, starting in 2027. At the start of the year, OpenAI joined Stargate, a multibillion-dollar project announced by President Trump and backed by Oracle and SoftBank, to build out next-generation AI infrastructure. Going forward, all of OpenAI's infrastructure projects will fall under the Stargate umbrella. Representatives from Microsoft, Oracle and SoftBank didn't immediately respond to requests for comment. Nvidia and OpenAI provided scant details about where and when the buildout will take place, other than to say that the first of the 10 gigawatt sites will go online in the back half of next year. Executives said they've reviewed between 700 and 800 potential locations since unveiling Stargate in January. In the months that followed, they fielded a flood of proposals from developers across North America offering land, power, and facilities. That list has been narrowed as OpenAI weighs energy availability, permitting timelines, and financing terms, the company said. In Monday's announcement, OpenAI described Nvidia as a "preferred" partner. But executives told CNBC that it's not an exclusive relationship, and the company is continuing to work with large cloud companies and other chipmakers to avoid being locked in to a single vendor.
[33]
Nvidia Commits $100 Billion to OpenAI in Historic AI Infrastructure Deal -- Stock Pops - Decrypt
Experts warn AI's electricity and water demands could strain grids worldwide. Nvidia is betting big on the future of artificial intelligence, committing up to $100 billion to build out OpenAI's computing muscle in what could be the largest infrastructure deal in AI history to date. The chipmaker said Monday it had signed a letter of intent with OpenAI to deploy at least 10 gigawatts of Nvidia-powered AI systems. A gigawatt is equal to one billion watts of power -- enough electricity to supply millions of homes. According to a report by Bloomberg, the money will be provided in stages, with the first $10 billion coming when the deal is signed, citing people familiar with the matter. Nvidia will receive OpenAI equity as part of the deal. Nvidia's stock (NVDA) rose 4% from $177.50 to $184.16 on news of the partnership. In an interview with CNBC, Nvidia CEO Jensen Huang called the deal the next leap forward for artificial intelligence. "The computing demand is going through the roof," Jensen said. "This partnership is about building an AI infrastructure that enables AI to go from the labs into the world. This is about the AI industrial revolution arriving." The plan calls for vast new data centers filled with millions of GPUs, including its upcoming Vera Rubin platform, designed for training and running next-generation models. The first gigawatt of capacity is expected to go online in the second half of 2026, with the rest rolling out as Nvidia ramps its investment. Joining Huang were OpenAI CEO Sam Altman, and OpenAI President Greg Brockman. Altman said the deal with Nvidia will expand the ambitions set by the Stargate project announced by President Donald Trump in January. "This is the fuel we need to drive improvement -- to build better models, generate revenue, everything," Altman said. "This is helping us, along with our partners at Stargate, Microsoft, and Oracle, to build increasing amounts of infrastructure to deliver on what the world is demanding." Deploying 10 gigawatts of computing capacity won't be simple, however. Cooling alone can consume nearly 40% of a data center's power use, according to energy consultancy 174 Power Global. Environmental concerns are also mounting. Deloitte estimates data centers will account for about 2% of global electricity use in 2025, or 536 terawatt-hours, but says demand from power-hungry AI could push that figure to more than 1,000 terawatt-hours by 2030. The United Nations Environment Programme has warned of rising water consumption for cooling, while the Environmental and Energy Study Institute said data centers are already straining electric grids.
[34]
Nvidia plans to splash OpenAI with cash, pouring out $100 billion for ChatGPT's creator and making last week's Intel investment look like a drop in the money bucket
Big tech is capable of throwing around some eye-watering amounts of cash. As you may recall, Nvidia announced $46.7 billion in total revenue during its Q2 2025 earnings call. That's not just a lot of moolah, but serious spending power. As such, this week, Nvidia announced it will be investing $100 billion into OpenAI. Part of this mountain of money will go towards supplying the steward of ChatGPT with data centre chips. Details have yet to be finalised, but a letter of intent signed by the two companies announced plans to deploy 10 gigawatts of Nvidia systems for use in OpenAI's data centres. The two companies were hardly strangers to begin with, but this latest deal gives Nvidia a stake in one of its biggest customers. Nvidia's investment in OpenAI will eventually take the form of non-voting shares in the company. OpenAI will then use the resulting cash flow to buy the aforementioned AI chips. Ultimately, this latest pledge of $100 billion makes last week's surprising news that Nvidia would be putting $5 billion into Intel look like a drop in the bucket. Once this most recent deal is finalised, sources close to the company claim the plan is for Nvidia to invest an initial sum of $10 billion, followed by a hardware rollout sometime towards the end of 2026. The first gigawatt of power will likely run on Nvidia's upcoming Vera Rubin AI compute platform, which was first revealed back in March. OpenAI CEO Sam Altman explained in a statement that it was all about maintaining a competitive edge in an increasingly crowded field, saying, "Everything starts with compute. Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." But Nvidia spending money on OpenAI so OpenAI can then buy Nvidia hardware has raised some concerns; if this flow of cash looks a little circular to you, you're not the only one concerned about the potential shape of things to come. Speaking to Reuters, Bernstein analyst Stacy Rasgon commented, "On the one hand this [deal] helps OpenAI deliver on what are some very aspirational goals for compute infrastructure, and helps Nvidia ensure that that stuff gets built. On the other hand the 'circular' concerns have been raised in the past, and this will fuel them further." Though that said, it's perhaps too early to start throwing around words like 'antitrust,' particularly as the US Trump administration is all in on AI. Still though, the proposed 10 gigawatt data centres will demand power equivalent to the needs of 8 million U.S. households; despite Nvidia CEO Jensen Huang's suggestion that AI customers 'pace themselves' and other major big tech players looking to nuclear to meet AI's power demands, there may come a time when such a power imbalance can no longer be ignored.
[35]
Nvidia invests $100 billion in OpenAI to fuel its computing power
The landmark agreement will boost the ChatGPT maker's computing power with multi-gigawatt data centres, powered by millions of Nvidia's high-speed graphics processing units. Chipmaker Nvidia will invest $100 billion (€85bn) in OpenAI as part of a partnership announced on Monday that will add at least 10 gigawatts of Nvidia AI data centres to ramp up the computing power for the owner of the artificial intelligence chatbot ChatGPT. Per the letter of intent signed by the companies, the first gigawatt of Nvidia systems will be deployed in the second half of 2026. Nvidia and OpenAI said they would be finalising the details of the arrangement in the coming weeks. "This partnership complements the deep work OpenAI and Nvidia are already doing with a broad network of collaborators, including Microsoft, Oracle, SoftBank and Stargate partners, focused on building the world's most advanced AI infrastructure," the companies said in a statement. Those companies pledged to invest at least $100bn in building data centres for OpenAI in January. The Nvidia-OpenAI partnership also comes about 10 days after OpenAI said it had reached a new tentative agreement that will give Microsoft a $100 billion equity stake in its for-profit corporation. OpenAI is technically controlled by its nonprofit. Speaking on CNBC, OpenAI CEO Sam Altman said the new data centres that Nvidia will build are in addition to the previously announced projects. "Building this infrastructure is critical to everything we want to do," Altman said. "Without doing this, we cannot deliver the services people want. We can't keep making better models." He said both Nvidia and Microsoft are "passive investors", and OpenAI's nonprofit and board controls the company. Nvidia also announced that it was investing $5bn in fellow chipmaker Intel, which has struggled to keep up with the frenzied demand for artificial intelligence. OpenAI was founded as a nonprofit in 2015, and its nonprofit board has continued to control the for-profit subsidiary that now develops and sells its AI products. OpenAI's corporate structure and nonprofit mission are the subject of a lawsuit brought by Elon Musk, who helped found the nonprofit research lab and provided initial funding. Musk's suit seeks to stop OpenAI from taking control of the company away from its nonprofit and alleges it has betrayed its promise to develop AI for the benefit of humanity. OpenAI says it has 700 million weekly active users. Earlier this month, the attorneys general of California and Delaware warned OpenAI that they have "serious concerns" about the safety of ChatGPT, especially for children and teens. The two state officials, who have unique powers to regulate nonprofits such as OpenAI, noted "deeply troubling reports of dangerous interactions between" chatbots and their users, including the suicide of one young Californian after he had prolonged interactions with an OpenAI chatbot. The parents of the 16-year-old California boy, who died in April, sued OpenAI and its CEO, Sam Altman, last month.
[36]
Chipmaker Nvidia to invest up to $100 billion in OpenAI
ChatGPT parent OpenAI is set to receive up to $100 billion in investments from chip giant Nvidia, helping cement the two firms as leaders in the race to build artificial intelligence systems that could transform the economy and society. The companies said the move will enable OpenAI to expand the fleet of data centers it needs to power ChatGPT, which in August hit 700 million weekly global users. It will require the buildout of 10 gigawatts of power, equivalent to the amount consumed by about eight million homes. No specific timetable for the buildout was announced. "This is a giant project," Nvidia CEO Jensen Huang said in a joint appearance on CNBC alongside OpenAI CEO Sam Altman and Greg Brockman, the company's president. Altman said the investment represents a bet that the current capacities of its AI products -- and the financial returns to them -- can be significantly improved. "There are three things that OpenAI has to do well. We have to do great AI research. We have to make these products people want to use. And we have to figure out how to do this unprecedented infrastructure challenge," he said. The news also pushed stocks to fresh highs, despite growing evidence of a broader economic slowdown. Shares in Nvidia climbed more than 3% -- equivalent to about $200 billion -- on the announcement, adding to its lead as the world's most valuable publicly traded company, now worth nearly $4.5 trillion. The S&P 500 climbed more than 0.3% in Monday trading as it touched a fresh all-time high. The Dow Jones Industrial Average gained about 0.1%, while the tech-heavy Nasdaq jumped 0.6%. AI bets have continued to fuel investors' appetite for stocks even as signs of economic stress mount. The Federal Reserve announced its first rate cut of 2025 last week amid growing indications of a weakening jobs picture. "The labor market is really cooling off," Fed Chair Jerome Powell said.
[37]
Nvidia to invest up to $100 bn in OpenAI data centers
San Francisco (United States) (AFP) - Nvidia said Monday it will invest up to $100 billion in OpenAI, building infrastructure for next-generation artificial intelligence. The strategic partnership aimed at deploying massive data center capacity unites generative AI star OpenAI with the leading maker of chips powering the technology. "Compute infrastructure will be the basis for the economy of the future," OpenAI chief executive Sam Altman said in a joint release. "We will utilize what we're building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale." The partnership will enable San Francisco-based OpenAI to build and deploy AI data centers with Nvidia systems, representing millions of sophisticated graphics processing units (GPUs), according to the companies. The first Nvidia systems are expected to be operating in the second half of next year. OpenAI and Nvidia added that they will work together to optimize how the companies' hardware and software complement each other. No financial details were provided beyond the possible magnitude of Nvidia's investment in OpenAI. Tech industry rivals Amazon, Google, Meta, Microsoft and Elon Musk's xAI have been pouring billions of dollars into artificial intelligence since the blockbuster launch of the first version of ChatGPT in late 2022. Nvidia has become a coveted source of high-performance GPUs tailored for generative AI. Chinese startup DeepSeek shook up the AI sector early this year with a model that delivers high performance using less costly chips. Silicon Valley-based Nvidia last week announced it would invest $5 billion in struggling chip rival Intel. The investment represents a significant commitment to Intel's turnaround efforts. Nvidia joined Japanese investment giant SoftBank and the US government in backing the once-dominant chipmaker, which has fallen behind in recent years after missing key technology shifts. Nvidia's GPUs, originally designed for gaming systems, have become the essential building blocks of artificial intelligence applications, with tech giants scrambling to secure them for their data centers and AI projects. OpenAI released a keenly awaited new generation of its hallmark ChatGPT last month, touting significant advancements in artificial intelligence capabilities as a global race over the technology accelerates. ChatGPT-5 was made available free to the more than 700 million people who use the AI tool weekly, according to OpenAI.
[38]
"The next leap forward" - Nvidia is investing $100bn in OpenAI, and will start by deploying as much power for 10 nuclear reactors
Nvidia will provide at least 10 gigawatts (millions of GPUs) to OpenAI. Nvidia has lifted the wraps off a multi-billion-dollar investment into ChatGPT maker OpenAI as part of its plans to take the company's AI infrastructure to the next level. At least 10 gigawatts of Nvidia systems are set to power OpenAI's next-generation models, which equates to millions of GPUs, with Nvidia ploughing up to $100 billion into OpenAI progressively as each gigawatt is deployed. "NVIDIA and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," said Jensen Huang, founder and CEO of NVIDIA. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence." The two companies confirmed that the first gigawatt, using the Vera Rubin platform, could come online as early as the second half of 2026. In its announcement, OpenAI named Nvidia as its "preferred strategic compute and networking partner," noting the two companies will "co-optimize their roadmaps" to align OpenAI models and infrastructure with Nvidia hardware and software. In the near-three years since its public preview launch, ChatGPT and OpenAI's other AI tools have amassed over 700 million weekly active users, and continued growth mandates further expansion. This initiative has been described as the biggest AI infrastructure project in history, and it could lead to next-generation superintelligence. "Everything starts with compute," said Sam Altman, co-founder and CEO of OpenAI. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale." The news comes shortly after OpenAI revealed Project Stargate, a $500 billion push to build new AI infrastructure together with Nvidia and Oracle. "We're excited to deploy 10 gigawatts of compute with NVIDIA to push back the frontier of intelligence and scale the benefits of this technology to everyone," OpenAI President Greg Brockman concluded.
[39]
Nvidia to invest $100B in ChatGPT creator OpenAI
Why it matters: Nvidia is the world's most valuable company, with a dominant position in the chips powering the AI revolution, and OpenAI is the innovative force behind market leader ChatGPT. Driving the news: The deal calls for the $4 trillion chip giant to deliver the funds in stages, enabling OpenAI to "build and deploy at least 10 gigawatts of AI data centers with Nvidia systems." * Nvidia called it a "strategic partnership" and said it would allow OpenAI to "train and run its next generation of models on the path to deploying superintelligence." * The company said the first phase of systems will come online in the second half of 2026 using Nvidia's next-generation Vera Rubin chips. What they're saying: "Nvidia and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," Nvidia CEO Jensen Huang said in a statement. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence." * "Everything starts with compute," OpenAI CEO Sam Altman said in a statement. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." Zoom in: Nvidia said the companies plan to work together to "co-optimize their roadmaps." * They described their deal as complementary to OpenAI's relationship with other collaborators, including Microsoft, Oracle, SoftBank and Stargate.
[40]
'Giant project', says Nvidia CEO as company bets $100bn on OpenAI
Nvidia is planning on investing $100bn in OpenAI, which, the Financial Times says, is in return for a "significant stake" in the company. The "giant project", as Jensen Huang, the founder and CEO of Nvidia, puts it, will see at least 10GW of Nvidia systems being deployed for OpenAI's AI infrastructure. The first phase of this massive project is expected towards the second half of 2026. OpenAI has been an early and voracious consumer of Nvidia's AI technology. "We've been working closely with Nvidia since the early days of OpenAI," said Greg Brockman, the co-founder and president of OpenAI. "We've utilised their platform to create AI systems that hundreds of millions of people use every day." OpenAI's weekly active user-base has grown to more than 700m, the company said. This partnership takes the tight-knit relationship a step further, placing Nvidia as OpenAI's preferred strategic compute and networking partner. Huang told the press that the 10GW of planned infrastructure would involve 4m to 5m Nvidia AI chips, and sources told the Financial Times that most of the planned infrastructure will be deployed in the US. CNBC reports that the first investment of $10bn will be deployed when the first GW is completed. "Everything starts with compute," said Sam Altman, co-founder and CEO of OpenAI. "Compute infrastructure will be the basis for the economy of the future, and we will utilise what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." OpenAI has expensive plans for AI domination. Just this month, the start-up reportedly tapped Broadcom for a $10bn order of custom AI chips, and signed a $300bn contract with Oracle for its compute power. The company also expanded its Stargate joint venture into the UK in partnership with Nvidia and Nscale. Meanwhile, OpenAI is in ongoing negotiations with lead investor Microsoft over its company restructuring plans. "Nvidia and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," said Huang. "This investment and infrastructure partnership mark the next leap forward - deploying 10GW to power the next era of intelligence." Meanwhile, Nvidia has been on its own spending spree over AI tech. Last week, the company, along with several US Big Tech leaders, announced billions to develop the UK's AI capacity, with Nvidia alone pledging £2bn for the country's AI start-up ecosystem. The company also bet $5bn on its struggling rival Intel, becoming a large investor in the company along with the US government and Japanese investment giant SoftBank.
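A quick arithmetic cross-check of the figures reported above, using only the numbers in the piece itself (nothing official; the per-GPU figure is an implied all-in budget covering cooling, networking and facility overhead, not a chip specification):

total_power_w = 10e9                 # the planned 10 GW build-out
gpu_low, gpu_high = 4e6, 5e6         # Huang's 4m-5m chip estimate
total_investment_usd = 100e9         # Nvidia's maximum commitment
tranches = 10                        # one tranche per gigawatt, per CNBC

print(f"Implied all-in power per GPU: {total_power_w / gpu_high:,.0f}-{total_power_w / gpu_low:,.0f} W")
print(f"Implied investment per gigawatt: ${total_investment_usd / tranches / 1e9:.0f}bn")

Run as written, this prints roughly 2,000-2,500 W per GPU and $10bn per gigawatt, consistent with the staged $10bn-per-GW cadence CNBC describes.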
[41]
Nvidia will invest up to $100B in OpenAI to finance data center construction - SiliconANGLE
Shares of Nvidia Corp. rose nearly 4% today after it announced plans to invest up to $100 billion in OpenAI. The funding is intended to help the artificial intelligence provider grow its data center capacity. According to OpenAI, the plan is to add at least 10 gigawatts' worth of computing infrastructure. One gigawatt corresponds to the energy use of several hundred thousand homes. Nvidia plans to disburse the funds "progressively as each gigawatt is deployed." OpenAI expects to complete the initial phase of the construction project in the second half of 2026. It didn't specify how many gigawatts' worth of infrastructure will be built during that initial phase, but disclosed the hardware will be powered by Nvidia's upcoming Vera Rubin chip. The Vera Rubin combines a graphics processing unit with a central processing unit in a single package. The CPU, Vera, features 88 cores based on an Arm Holdings plc design. It's linked to the attached GPU by an interconnect that can move 1.8 terabytes of data between the two chips every second. The GPU in the Vera Rubin chip is based on a new graphics card architecture called Rubin. Nvidia plans to launch two different Rubin GPUs next year. Each chip is optimized for a different subset of the computations involved in running inference workloads. Nvidia previewed one of the upcoming GPUs, the Rubin CPX, earlier this month. It's optimized for the initial set of computations that an AI model carries out after it receives a user prompt. Those initial computations require less memory than the subsequent processing steps. According to Nvidia, the Rubin CPX comprises a single die with 128 gigabytes of GDDR7 memory. It includes circuits optimized to run language models' attention mechanism and video processing workloads. Nvidia hasn't shared the specifications of the other Rubin GPU that it plans to launch. It's unclear how OpenAI plans to implement Nvidia's processors in its data centers. One possibility is that it will use the Vera Rubin NVL144 CPX, an upcoming appliance the chipmaker detailed earlier this month. The system includes 288 GPUs and 36 CPUs that can provide 8 exaflops of performance. Nvidia and OpenAI stated that they will "work together to co-optimize their roadmaps for OpenAI's model and infrastructure." That hints that the companies may collaborate on certain engineering initiatives. Other market players have made similar moves. After OpenAI rival Anthropic PBC raised $4 billion from Amazon Web Services Inc. last November, it announced plans to contribute to the development of the cloud giant's Neuron AI model compiler. "NVIDIA and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," said Nvidia Chief Executive Officer Jensen Huang. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence." Nvidia and OpenAI have outlined the terms of the partnership in a letter of intent. They plan to finalize the agreement in the coming weeks.
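As a rough sanity check on the homes comparison above (a sketch under an assumed average household draw of about 1.2 kW, which is an assumption on our part rather than a figure from the article):

gigawatt_w = 1_000_000_000       # 1 GW expressed in watts
assumed_home_draw_w = 1_200      # assumed average household draw (~1.2 kW)

print(f"~{gigawatt_w / assumed_home_draw_w:,.0f} homes per gigawatt")

This works out to roughly 830,000 homes per gigawatt, so the "several hundred thousand homes" comparison holds up; a lower assumed household draw pushes the figure toward a million.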
[42]
Nvidia to invest $100 billion in OpenAI
Chipmaker Nvidia will invest up to $100 billion in OpenAI and provide it with data center chips, the companies said on Monday, a tie-up between two of the highest-profile leaders in the global artificial intelligence race. The deal, which will see Nvidia start delivering chips as soon as late 2026, will involve two separate but intertwined transactions, according to a person close to OpenAI. The startup will pay Nvidia in cash for chips, and Nvidia will invest in OpenAI for non-controlling shares, the person said. The first $10 billion of Nvidia's investment in OpenAI, which was most recently valued at $500 billion, will begin when the two companies reach a definitive agreement for OpenAI to purchase Nvidia chips.
[43]
OpenAI, NVIDIA Sign $100 Billion Deal to Deploy 10 GW of AI Systems
OpenAI and NVIDIA on Monday signed a letter of intent for a strategic partnership to deploy at least 10 gigawatts of NVIDIA systems for OpenAI's next-generation AI infrastructure. The agreement includes NVIDIA investing up to $100 billion in OpenAI, tied to each gigawatt of systems deployed. The first gigawatt of capacity is scheduled for deployment in the second half of 2026 on NVIDIA's Vera Rubin platform. "NVIDIA and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," said Jensen Huang, founder and CEO of NVIDIA. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence." "Everything starts with compute," said Sam Altman, co-founder and CEO of OpenAI. "Compute infrastructure will be the basis for the economy of the future, and we will utilise what we're building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale." Earlier, Altman said the company will soon roll out new compute-intensive offerings, noting that some of the features will initially be limited to Pro subscribers and certain products may come with additional fees. "Our intention remains to drive the cost of intelligence down as aggressively as we can and make our services widely available," Altman said on X. He added that OpenAI aims to test the limits of current models by applying significant compute power to new ideas, even at today's costs. "We've utilised their platform to create AI systems that hundreds of millions of people use every day. We're excited to deploy 10 gigawatts of compute with NVIDIA to push back the frontier of intelligence and scale the benefits of this technology to everyone," said Greg Brockman, co-founder and president of OpenAI. As part of the agreement, OpenAI will use NVIDIA as a preferred compute and networking partner for its AI factory growth. Both companies will coordinate their hardware and software roadmaps to support future model development and infrastructure. The partnership builds on OpenAI and NVIDIA's existing collaborations with Microsoft, Oracle, SoftBank, and Stargate partners, which are focused on building advanced AI infrastructure. OpenAI currently reports over 700 million weekly active users, with adoption spanning enterprises, small businesses, and developers worldwide. The companies said this new agreement will support OpenAI's mission to develop artificial general intelligence that benefits humanity.
[44]
Nvidia to back OpenAI with staged $100 billion investment
Nvidia just put real money behind its favorite metaphor. After two years of calling modern data centers "AI factories," the chipmaker is going all-in with chips -- and cash -- to help build them. Today, Nvidia and OpenAI announced that the chipmaker plans to invest up to $100 billion in OpenAI through a staged partnership that doesn't just guarantee the startup first dibs on the world's most coveted GPUs -- it effectively makes Nvidia a co-builder of the infrastructure that will power the next wave of artificial intelligence. Nvidia will pour in money gigawatt by gigawatt as OpenAI builds data centers stocked with millions of its chips. The first of those, running on Nvidia's coming Vera Rubin platform, is slated to switch on in the second half of 2026. By the time the build-out hits 10 gigawatts -- about the output of several nuclear reactors -- OpenAI will be operating the kind of compute grid that is usually reserved for national utilities. The deal is part supplier agreement, part financing package -- and has a moat big enough to make rivals sweat. This isn't a one-time wire; it's a letter-of-intent to invest progressively as each gigawatt is deployed -- money that shows up when concrete, power, racks, and GPUs do. In return, OpenAI names Nvidia its preferred strategic compute and networking partner, aligning hardware and software roadmaps so that model releases and silicon arrive in lockstep.
[45]
Nvidia to invest $100 billion in OpenAI
Sept 22 (Reuters) - Chipmaker Nvidia (NVDA.O) plans to invest up to $100 billion in artificial intelligence startup OpenAI under a new agreement, the companies said on Monday, as competition intensifies among technology giants to secure access to energy and chips needed for AI growth. The companies unveiled a letter of intent for a landmark strategic partnership to deploy at least 10 gigawatts of Nvidia chips for OpenAI's AI infrastructure. They aim to finalize partnership details in the coming weeks, with the first deployment phase targeted to come online in the second half of 2026. "Everything starts with compute," Sam Altman, CEO of OpenAI, said in a release. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." Shares of Oracle (ORCL.N), a partner with OpenAI, SoftBank (9434.T), and Microsoft (MSFT.O) on the $500 billion Stargate AI data center project, gained nearly 5%. Nvidia's investment comes days after it committed $5 billion to struggling chipmaker Intel (INTC.O). Reporting by Arsheeya Bajwa in Bengaluru; Editing by Tasim Zahid
[46]
Nvidia to invest up to $100bn in OpenAI
Nvidia has revealed plans to invest up to $100bn (£74bn) in OpenAI in return for a large stake in ChatGPT's owner. The deal also gives OpenAI the cash and access it needs to buy advanced chips that are key to maintaining its own dominance. It is understood that, under the plans, Nvidia will buy non-voting shares in OpenAI later this year. OpenAI can then use this cash to buy Nvidia's chips - vital for the additional processing power it wants. OpenAI boss Sam Altman said: "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." It builds on a series of similar deals in recent weeks. Nvidia was among top US tech firms to reveal major investment in the UK during last week's state visit by President Donald Trump. "This partnership complements the deep work OpenAI and Nvidia are already doing with a broad network of collaborators, including Microsoft, Oracle, SoftBank and Stargate partners, focused on building the world's most advanced AI infrastructure," the pair said. The deal could yet attract attention from competition regulators despite that sentiment. At the same time OpenAI, like Google, Amazon and others, has been working on plans to build its own AI chips, aiming for a cheaper alternative to Nvidia. Matt Britzman, senior equity analyst at Hargreaves Lansdown, said of the partnership: "For Nvidia, the prize is huge - every gigawatt of AI data centre capacity is worth about $50bn in revenue, meaning this project could be worth as much as $500bn. "This move cements Nvidia's position as the undisputed king of AI at a time when custom chips from hyperscalers and startups had started to nibble at its dominance. "By locking in OpenAI as a strategic partner and co-optimizing hardware and software roadmaps, Nvidia is ensuring its GPUs remain the backbone of next-gen AI infrastructure. "The market is clearly big enough for multiple players, but this deal underscores that, when it comes to scale and ecosystem depth, Nvidia is still setting the pace - and raising the stakes for everyone else."
[47]
Nvidia and OpenAI Form $100 Billion Partnership for 10 GW AI Datacenters
Nvidia and OpenAI are teaming up on what might be the biggest AI hardware deal so far: a $100 billion partnership to build datacenters powered by 10 gigawatts of Nvidia hardware. To put that in perspective, that's about the same power output as ten nuclear reactors. The first of these massive sites is scheduled to go live in the second half of 2026, and the long-term plan is to keep building until they hit that 10 GW target. For Nvidia, the deal isn't just about selling GPUs -- it's also putting money back into OpenAI. The company has committed $100 billion, likely through share purchases and direct investment, making sure both sides are deeply tied together. In return, OpenAI is naming Nvidia as its go-to partner for compute and networking, giving the chipmaker a secure spot at the center of OpenAI's growing infrastructure. One of the more interesting points is that the two companies say they'll "co-optimize" their roadmaps. Nvidia has always taken feedback from its big AI customers, but this could give OpenAI an even stronger influence over how future GPUs and accelerators are designed. Considering OpenAI's need to train and deploy massive language models, it's likely future Nvidia hardware will be even more tuned for that kind of workload. The datacenters themselves will run on Nvidia's Vera Rubin and Rubin Ultra platforms. A full rack-scale Vera Rubin system packs 76 TB of HBM4 memory and can hit 3.6 exaflops for FP4 inference, plus 1.2 exaflops for FP8 training. Rubin Ultra goes further, packing 365 TB of HBM4e memory and roughly 14 times the compute of a GB300 NVL72 system, pushing up to 15 exaflops of FP4 inference and 5 exaflops of FP8 training. These are staggering numbers -- just a few years ago, "exaflops" was a term reserved for the biggest supercomputers in the world. Now it's becoming normal for AI datacenters. Of course, all this power comes with a cost. Ten gigawatts of power is a huge draw, and it raises questions about sustainability. Running hardware at this scale means massive energy and cooling demands, even with efficiency improvements in modern GPU design. But as OpenAI's Sam Altman put it, compute infrastructure is becoming the backbone of the future economy, suggesting that the benefits may outweigh the costs in the eyes of tech leaders. Source: openai
[48]
Nvidia plans to invest up to $100 billion in OpenAI as part of data center buildout
Nvidia will invest $100 billion in OpenAI as the artificial intelligence lab sets out to build hundreds of billions of dollars in data centers based around Nvidia's AI chips, the companies said on Monday. OpenAI plans to build and deploy Nvidia systems that require 10 gigawatts of power, the companies said on Monday. A gigawatt is a measure of power that is increasingly being used to describe the biggest clusters of AI chips. Nvidia stock rose 3% during trading on Monday. The partnership highlights how linked OpenAI and Nvidia are as two of the biggest drivers of the recent artificial intelligence boom. Demand for Nvidia's GPUs first started rising when OpenAI released ChatGPT in 2022, and OpenAI still relies on Nvidia's AI chips, called GPUs, to develop its software and deploy it to users. It also signals just how many Nvidia chips OpenAI will need in order to develop next-generation artificial intelligence that can do more than its current models. It also suggests that OpenAI will need increasing amounts of chips to serve its users; the company said on Monday that it has 700 million weekly active users. "This is monumental in size," Nvidia CEO Jensen Huang told CNBC's Jon Fortt in an interview in San Jose on Monday alongside OpenAI CEO Sam Altman and OpenAI president Greg Brockman. "You should expect a lot from us in the coming months," Altman said. "There are three things that OpenAI has to do well: we have to do great AI research, we have to make these products people want to use, and we have to figure out how to do this unprecedented infrastructure challenge."
[49]
Nvidia and OpenAI announce landmark $100 billion partnership, igniting global stock rally
The strategic partnership aims to deploy 10 gigawatts of AI infrastructure using millions of Nvidia GPUs. Major global semiconductor stocks surged on Tuesday following the announcement of a monumental strategic partnership between Nvidia and OpenAI. The deal, centered on an intention for Nvidia to invest up to $100 billion in OpenAI, sent powerful ripples through the world's financial markets and showcased the immense upside potential of the AI sector. The positive sentiment lifted major indices, including the tech-heavy Nasdaq and the broader S&P 500, while the Dow Jones Industrial Average also moved higher. The landmark agreement: The Nvidia-OpenAI partnership is designed to deploy at least 10 gigawatts of AI infrastructure, a project of unprecedented scale. According to Nvidia CEO Jensen Huang, these systems will require a massive deployment of between 4 and 5 million graphics processing units (GPUs). This historic buildout will power the next generation of OpenAI models on the path toward superintelligence. The first phase of the project, representing one gigawatt of power, is scheduled to come online in the second half of 2026, built on Nvidia's upcoming Vera Rubin platform. The staggering $100 billion Nvidia investment in OpenAI is intended to be progressive, released as each gigawatt of the new infrastructure is deployed. Nvidia news sparks global rally: The announcement was the most significant Nvidia news of the year, triggering a rally on Wall Street that quickly spread to the global chip sector. For investors, the implications were clear and immediate. While there is no direct way for the public to purchase OpenAI stock, this deal solidifies Nvidia stock as the premier way to invest in the growth of its partner. The excitement around a potential future OpenAI stock offering has only intensified, but for now, all eyes are on NVDA. The impact was felt across the supply chain: * In Taiwan, shares of Taiwan Semiconductor Manufacturing Co. (TSMC), which manufactures chips for Nvidia, closed 3.5% higher. * In South Korea, SK Hynix, whose memory chips are critical for Nvidia's systems, saw its shares end the session up more than 2.5%. * In Europe, the rally continued, with equipment manufacturers like ASML and ASMI expected to benefit from increased demand from TSMC. While the Dow reflected broad-based optimism, the technology sector was the clear leader, driven by the seismic partnership announcement. A new era of computing: Executives from both companies highlighted the transformative potential of the collaboration. "Nvidia and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," said Jensen Huang. "This investment and infrastructure partnership mark the next leap forward." Sam Altman, CEO of OpenAI, emphasized the foundational importance of this infrastructure. "Everything starts with compute," he stated. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." This partnership complements the extensive work OpenAI and Nvidia are already doing with a broad network of collaborators, including Microsoft, Oracle, and SoftBank.
The goal is to create the world's most advanced AI infrastructure, supporting the more than 700 million weekly active users of OpenAI services and expanding its enterprise adoption. The strategic agreement provides OpenAI with the dedicated capacity to continue scaling its models and pursuing its mission to build artificial general intelligence that benefits all of humanity.
[50]
Nvidia to invest $100B in OpenAI to help expand ChatGPT maker's computing power
Partnership will add at least 10 gigawatts of Nvidia AI data centers. Chipmaker Nvidia will invest $100 billion in OpenAI as part of a partnership announced Monday that will add at least 10 gigawatts of Nvidia AI data centers to ramp up the computing power for the owner of the artificial intelligence chatbot ChatGPT. Per the letter of intent signed by the companies, the first gigawatt of Nvidia systems will be deployed in the second half of 2026. Nvidia and OpenAI said they would be finalizing the details of the arrangement in the coming weeks. "This partnership complements the deep work OpenAI and Nvidia are already doing with a broad network of collaborators, including Microsoft, Oracle, SoftBank and Stargate partners, focused on building the world's most advanced AI infrastructure," the companies said in a release. The Nvidia-OpenAI partnership comes about 10 days after OpenAI said it had reached a new tentative agreement that will give Microsoft a $100 billion equity stake in its for-profit corporation. OpenAI is technically controlled by its nonprofit. OpenAI was founded as a nonprofit in 2015 and its nonprofit board has continued to control the for-profit subsidiary that now develops and sells its AI products. OpenAI's corporate structure and nonprofit mission are the subject of a lawsuit brought by Elon Musk, who helped found the nonprofit research lab and provided initial funding. Musk's suit seeks to stop OpenAI from taking control of the company away from its nonprofit and alleges it has betrayed its promise to develop AI for the benefit of humanity. Earlier this month, the attorneys general of California and Delaware warned OpenAI that they have "serious concerns" about the safety of ChatGPT, especially for children and teens. The two state officials, who have unique powers to regulate nonprofits such as OpenAI, noted "deeply troubling reports of dangerous interactions between" chatbots and their users, including the suicide of one young Californian after he had prolonged interactions with an OpenAI chatbot. The parents of the 16-year-old California boy, who died in April, sued OpenAI and its CEO, Sam Altman, last month. OpenAI says it has 700 million weekly active users. Also, just last week Nvidia announced that it was investing $5 billion in fellow chipmaker Intel, which has struggled to keep up with the frenzied demand for artificial intelligence.
[51]
Nvidia to invest $100 billion in OpenAI to help expand the ChatGPT maker's computing power
Chipmaker Nvidia will invest $100 billion in OpenAI as part of a partnership announced Monday that will add at least 10 gigawatts of Nvidia AI data centers to ramp up the computing power for the owner of the artificial intelligence chatbot ChatGPT. Per the letter of intent signed by the companies, the first gigawatt of Nvidia systems will be deployed in the second half of 2026. Nvidia and OpenAI said they would be finalizing the details of the arrangement in the coming weeks. "This partnership complements the deep work OpenAI and Nvidia are already doing with a broad network of collaborators, including Microsoft, Oracle, SoftBank and Stargate partners, focused on building the world's most advanced AI infrastructure," the companies said in a release. The Nvidia-OpenAI partnership comes about 10 days after OpenAI said it had reached a new tentative agreement that will give Microsoft a $100 billion equity stake in its for-profit corporation. OpenAI is technically controlled by its nonprofit. OpenAI was founded as a nonprofit in 2015 and its nonprofit board has continued to control the for-profit subsidiary that now develops and sells its AI products. OpenAI's corporate structure and nonprofit mission are the subject of a lawsuit brought by Elon Musk, who helped found the nonprofit research lab and provided initial funding. Musk's suit seeks to stop OpenAI from taking control of the company away from its nonprofit and alleges it has betrayed its promise to develop AI for the benefit of humanity. Earlier this month, the attorneys general of California and Delaware warned OpenAI that they have "serious concerns" about the safety of ChatGPT, especially for children and teens. The two state officials, who have unique powers to regulate nonprofits such as OpenAI, noted "deeply troubling reports of dangerous interactions between" chatbots and their users, including the suicide of one young Californian after he had prolonged interactions with an OpenAI chatbot. The parents of the 16-year-old California boy, who died in April, sued OpenAI and its CEO, Sam Altman, last month. OpenAI says it has 700 million weekly active users. Also, just last week Nvidia announced that it was investing $5 billion in fellow chipmaker Intel, which has struggled to keep up with the frenzied demand for artificial intelligence.
[52]
AI Investment Is Starting to Look Like a Slush Fund
No two companies are as important to the recent AI boom as OpenAI and Nvidia. OpenAI's journey from research lab to chatbot juggernaut reoriented the entire tech industry around the thesis that large language models will change everything. Nvidia's chips have been almost synonymous with scaling, on which OpenAI and others are collectively spending hundreds of billions of dollars. For the last few years, the companies' fortunes have been aligned, and their relationship has been, for the most part, an obvious one: OpenAI, which has become one of the biggest startups in the world, is a major customer for Nvidia, which has become the largest public company in the world. In 2025, their twin trajectories have left much of the rest of the economy behind, and their stated ambitions -- necessary to keep the momentum going -- have become extreme. In a recent blog post, Sam Altman said his company wanted to "create a factory that can produce a gigawatt of new AI infrastructure every week" to "maybe" be able to "figure out how to cure cancer," or "provide customized tutoring to every student on earth." OpenAI hasn't had any trouble raising money so far, but when your investment projections start crossing into the trillions, financing starts to get tricky, particularly if you aren't a company like Google or Meta, with an existing business throwing off tens of billions of dollars a year. Which might help explain this: OpenAI and NVIDIA today announced a letter of intent for a landmark strategic partnership to deploy at least 10 gigawatts of NVIDIA systems for OpenAI's next-generation AI infrastructure to train and run its next generation of models on the path to deploying superintelligence. To support this deployment including data center and power capacity, NVIDIA intends to invest up to $100 billion in OpenAI as the new NVIDIA systems are deployed. Emphasis mine! The issue of circular financing in the AI world has been bubbling up for a while. As The Information reported in May, Nvidia has been engineering disorienting deals with customers for some time, most obviously with a company called CoreWeave, which rents compute -- basically, access to Nvidia hardware -- to AI firms: The chip giant invested $100 million in [CoreWeave] in early 2023. It funneled hundreds of thousands of high-end graphics processing units to CoreWeave. And it agreed to rent back its chips from CoreWeave through August 2027. In related news, just this week, CoreWeave announced it had "expanded its partnership with OpenAI in a new deal worth up to $6.5 billion, bringing the total value of their agreements to $22.4 billion." These announcements aren't exactly attempts to obfuscate what's going on here -- "NVIDIA intends to invest... as the new NVIDIA systems are deployed" -- but it's worth stating even more plainly what's happening: Nvidia is investing money in one of its largest customers, which will use some of that money to buy or lease capacity from Nvidia. On Tuesday, we got a little more insight into how that would actually happen. OpenAI, Oracle (a cloud provider that depends on Nvidia), and Softbank (a major investor in Nvidia and OpenAI) fleshed out plans to build out more data centers, which would one day be full of Nvidia hardware paid for or leased by... OpenAI. If you're a layperson, this probably sounds a bit weird. Maybe our friends in finance can explain a hidden logic here?
Rich Privorotsky of Goldman Sachs attempts to summarize the arrangement: "Nvidia invests up to $100bn for non-voting shares, and OpenAI uses the cash to buy Nvidia chips, with a plan to deploy what could be at least 10GW of Nvidia systems." It is what it looks like, in other words: A small group of large companies handing money back and forth. With a number of caveats -- every cycle is different, and the earnings multiples aren't as crazy yet -- he also draws a parallel to the late 90s, which have been coming up a lot lately. "Vendor financing was a feature of that era, when the telecom equipment makers (Cisco, Lucent, Nortel, etc.) extended loans, equity investments, or credit guarantees to their customers who then used the cash/credit to buy back the equipment," he writes. Indeed, vendor financing features prominently in tech bubble post-mortems. In 2002, Newsweek explained the strange role these arrangements played at the tail end of the 90s tech investment boom: Swept up in the free-money spirit of the time, they were financing not only the customers' purchase of their wares -- the switches, routers and other nuts and bolts of the Internet -- but the growth of the customers' businesses, too. The standard practice of vendor financing thus became another wretched bubble excess: as if Ford were loaning customers money to buy cars -- and boats, jewels and houses, too. "Vendor financing was the sickest sign of how things spun out of control," says Tero Kuittinen, technology adviser to Finnish investment bank Opstock in Helsinki. The parallels are obvious -- or, at least, would look extremely obvious if the AI industry's fortunes turned -- but they're not determinative. During the telecom buildout, vendor financing surged at a particular time. "Vendor-financing deals account for a relatively small fraction of the total $1 trillion in debt held by telecom carriers worldwide," Newsweek noted at the time, but investment continued well after an early-2000 market collapse "wiped out the silliest ideas for pet, toy and trinket Web sites," on the thesis that "dot-coms might be dead, but the Internet itself would thrive," a perfectly reasonable argument if timing doesn't matter, which of course it does, quite a bit. Last month, Kuittinen argued that this time, things really are different, but not necessarily in a good way. The constellation of companies raising, spending, and sharing money on the broad technological and ideological project that is artificial intelligence in the 2020s is betting that, given enough resources, they can build the most profitable and powerful companies of all time, and that early concerns over profitability and circular financing will, in hindsight, seem blinkered and silly. (Or, according to Peter Thiel, like complicity in the summoning of the antichrist.) In the process of trying, the industry is willing to drift into a strange and potentially risky arrangement -- although some of the implied risk might be mitigated, as Alphaville mockingly points out, by vagueness and lack of binding obligations -- which, in any case, given the potential upside, investors still seem comfortable with for now: A hyper-capitalized financial polycule in which everyone is everyone's investor, partner, vendor, and customer.
[53]
Why OpenAI Keeps Making Multibillion-Dollar AI Deals
The dealmaking started on Monday, when OpenAI and Nvidia, respectively the most valuable private and public companies in the world, announced plans for Nvidia to invest up to $100 billion in OpenAI over several years. With this funding, the companies will collaborate to greatly expand OpenAI's data center infrastructure, which OpenAI CEO Sam Altman says will allow the company behind ChatGPT to solve the world's largest societal problems -- including, according to Altman, curing cancer. Infrastructure initiatives like these haven't come out of nowhere. In May, Altman announced that he would be stepping back from some of his duties at OpenAI to focus on the company's compute pillar, which is focused on building the data center infrastructure (including GPUs, power sources, and buildings) that enables AI models to be trained and used. With a recent run of giant-sized deals, the results of Altman's efforts over the past few months are starting to take shape. In an interview on Monday with CNBC's Jon Fortt, Altman, Nvidia CEO Jensen Huang, and OpenAI president Greg Brockman explained that the $100 billion investment will expand on OpenAI's previously announced Stargate joint venture with partners including Oracle, Microsoft, CoreWeave, and Softbank. In total, the Nvidia investment is expected to give OpenAI an additional 10 gigawatts of AI data center capacity. Altman believes OpenAI's only option is to majorly expand its infrastructure footprint. According to a CNBC report, Altman and Huang personally negotiated the deal without bankers initially being involved.
[54]
Nvidia's $100 Billion OpenAI Play Raises Big Antitrust Issues
(Reuters) - The $100 billion partnership between dominant AI chipmaker Nvidia and leading artificial intelligence company OpenAI could give both companies an unfair advantage over their competitors, experts say. The move underscores the increasingly overlapping financial interests of the various tech giants developing advanced AI systems, and the potential for a dwindling number of key players to stave off smaller rivals. It "raises significant antitrust concerns," said Andre Barlow, an antitrust lawyer with Doyle, Barlow & Mazard, who also noted that the Trump administration has taken a pro-business approach to regulations, removing hurdles that would slow AI growth. And while unleashing U.S. dominance in artificial intelligence by clearing away regulations and creating incentives for growth is a top priority for President Donald Trump, a Department of Justice official said last week that spurring innovation by protecting AI competition through antitrust enforcement is also part of Trump's AI plan. "The question is whether the agencies see this investment as pro-growth or something that could slow AI growth," Barlow said. Nvidia holds more than half of the market for the GPU chips that run the data centers powering artificial intelligence models and applications, such as OpenAI's ChatGPT. That dominant market position raises concerns that Nvidia would favor OpenAI over other customers with better pricing or faster delivery times, said Rebecca Haw Allensworth, an antitrust professor at Vanderbilt Law School. "They're financially interested in each other's success. That creates an incentive for Nvidia to not sell chips to, or not sell chips on the same terms to, other competitors of OpenAI," Allensworth said. A Nvidia spokesperson said that its investment in OpenAI would not change its focus. "We will continue to make every customer a top priority, with or without any equity stake," the spokesperson said. OpenAI did not immediately respond to a request for comment. Nvidia's biggest customer base is already relatively concentrated, with the two largest buyers accounting for 23% and 16% of its revenue in the second quarter of this year, according to its financial filings, which do not name the buyers. The scope of Monday's deal - in which Nvidia would invest up to $100 billion in OpenAI, and the latter would buy millions of chips from Nvidia -- goes to show "just how expensive frontier AI has become," said Sarah Kreps, director of the Tech Policy Institute at Cornell University. "The cost of chips, data centers and power has pushed the industry toward a handful of firms able to finance projects on that scale," Kreps said. During Joe Biden's presidency, the DOJ and U.S. Federal Trade Commission were on guard against anticompetitive actions by Big Tech companies in the AI space, warning that such companies could use their existing scale to dominate the nascent field. Under Trump, both agencies have continued other cases against Big Tech companies, and DOJ antitrust division head Gail Slater said on Thursday that enforcement "must focus on preventing exclusionary conduct over the resources that are needed to build competitive AI systems and products." "The competitive dynamics of each layer of the AI stack and how they interrelate, with a particular eye towards exclusionary behavior that forecloses access to key inputs and distribution channels, are legitimate areas for antitrust inquiry," she said. (Reporting by Jody Godoy; Editing by Chris Sanders and Lisa Shumaker)
[55]
Is the Big Business of AI Dominated by Too Few Big Businesses?
Others warn that cloud computing deals inked with unprofitable start-ups like OpenAI could fall far short of expectations if investors lose their risk appetite or future AI models disappoint. U.S. companies expect to spend hundreds of billions of dollars on AI over the next few years, but some investors are worried those plans -- and the stock market rally they've fueled -- depend too much on just a handful of companies. Chip giant Nvidia (NVDA) revealed in a regulatory filing last month that two direct customers accounted for nearly 40% of total revenue in the second quarter, with "Customer A" representing 23% of sales and "Customer B" 16%. At no other point since 2022, when ChatGPT sparked the AI craze, have two individual buyers represented a greater share of Nvidia's sales. But experts note that the 40% figure likely overstates Nvidia's reliance on those two customers. Nvidia doesn't disclose who its customers are, but Customers A and B are likely distributors or system integrators that package Nvidia products in larger systems for sale to end users like hyperscalers and software companies. Nvidia has less visibility into how much of its revenue comes from the hyperscalers. In the most recent quarter, the company estimated that two unnamed end users each accounted for 10% or more of total revenue. Unless they each account for well over 10%, the estimate suggests Nvidia's pool of final users could be more diverse than its pool of direct customers. "Feast or famine" is how Bill Kleyman, Senior Data Center Analyst at HostingAdvice, characterizes Nvidia's sales. The company's fortunes, Kleyman told Investopedia, "are increasingly tied to how much the big cloud platforms spend on AI infrastructure." Competitor Broadcom (AVGO) is in the same boat. One distributor accounted for nearly 30% of sales in the past two quarters, and Broadcom estimated that about 40% of revenue came from its five largest end users. Broadcom's sales have become more concentrated during the AI arms race of the past few years. Nonetheless, according to Kleyman, the unique dynamic of the AI infrastructure buildout makes this level of concentration less risky than it otherwise would be. "Overall AI demand is so strong across many sectors that a shortfall from one large customer would likely be offset by others ramping up their investments," he said. Chipmakers aren't the only companies in the AI supply chain that could be forced to grapple with the consequences of customer concentration. Enterprise software and cloud computing company Oracle (ORCL) reported earlier this month that its backlog swelled by nearly $320 billion in the most recent quarter. Shares soared as Wall Street hailed the "historic" quarter. But it quickly became apparent that nearly all of the backlog could be attributed to a 5-year, $300 billion cloud computing deal with OpenAI. The terms of the agreement are unknown, but experts say flexibility is always written into contracts of that size and duration. How much of Oracle's backlog is converted into revenue will depend on OpenAI's usage, which hinges on factors beyond how much the public uses ChatGPT. "These deals are only as sticky as the models they serve," says Rory Bokser, Head of Product at Moken.io, a decentralized computing network data provider. "If OpenAI changes architectures, moves off Oracle's flavor of cloud, or pivots deployment strategy (edge vs. centralized), those 'backlogs' get revised real fast."
Another risk of inking a $300 billion deal with one company is that the customer may not have $300 billion to spend. OpenAI's annualized recurring revenue reportedly reached $12 billion in July, and the company is targeting $125 billion in revenue by 2029. Whether OpenAI hits those aggressive growth targets will likely depend on the ability of other companies to use its models to expand their profit margins or develop revenue-generating applications. Without rapid revenue growth, OpenAI may need to lean on investors to supply it with the billions of dollars it expects to burn through by the end of the decade. A recent funding round put the company's valuation at about $500 billion, making it one of the world's most valuable private enterprises. "It remains to be seen whether or not OpenAI will continue to receive the lavish funding it has received so far," said Greg Osuri, founder and CEO of Akash, a decentralized computing marketplace. Future funding could become scarce if AI implementation fails to live up to Wall Street's lofty expectations or economic stress forces investors and creditors to pull back.
[56]
OpenAI, Nvidia's Deal Could Change the AI Landscape: 5 Things to Know
OpenAI has also struck a deal with Oracle to build AI data centres. When OpenAI and Nvidia announced their strategic partnership on Monday, it immediately turned heads in Silicon Valley. The commotion was not due to the size of the deal or what was being offered to either entity, but rather what it could mean for the artificial intelligence (AI) landscape in the near future. The implications of the partnership are many; however, one thing it has clearly indicated is that OpenAI is now preparing for a future where its reliance on Microsoft will be a thing of the past. Here are the five things you should know. Five Things to Know About the OpenAI-Nvidia Partnership 1. What is the OpenAI-Nvidia Deal? On Monday, Nvidia announced that a letter of intent for a strategic partnership with OpenAI was signed. As part of this collaboration, the chipmaker will deploy "at least 10GW of Nvidia systems for OpenAI's next-generation AI infrastructure to train and run its next generation of models on the path to deploying superintelligence." Nvidia also committed to investing up to $100 billion (roughly Rs. 8.8 lakh crore) to help OpenAI set up new AI data centres. The first phase of this plan is expected to be launched in the second half of 2026, and Nvidia's Vera Rubin platform will power the data centres. 2. What does it mean for OpenAI's partnership with Microsoft? Earlier this month, OpenAI and Microsoft signed a cryptic non-binding agreement whose details were not disclosed to the public. However, a subsequent announcement by the ChatGPT maker made it clear that the agreement finally allowed it to convert the for-profit arm into a Public Benefit Corporation (PBC). Multiple reports have claimed that OpenAI and Microsoft were at loggerheads due to OpenAI's decision to go public. The Windows maker reportedly felt that the move would take away all the exclusivity and financial benefits tied to their original $13 billion (roughly Rs. 1.1 lakh crore) investment agreement, and opposed the move, effectively bringing the IPO plans to a halt. It is said that the Sam Altman-led AI startup even considered publicly accusing Microsoft of anticompetitive behaviour. While neither party took the aggressive route and eventually came to an agreement, many in Silicon Valley feel that the relationship between the two is beyond repair. 3. What can end consumers expect? Over the last few months, many users have questioned OpenAI's product decisions. The company first took away the GPT-4o, GPT-4.1, and o3 models after launching GPT-5, and later also suggested removing the standard voice mode. While it eventually brought them all back, the episode highlighted that OpenAI was struggling to meet the compute requirements of providing all of its AI services. The same was observed in Altman's latest post on X (formerly known as Twitter), where he highlighted that future AI products and features might be reserved for Pro subscribers or could carry an additional charge. OpenAI's compute struggles could finally be over with the Nvidia deal, which will allow it to expand its AI infrastructure and more easily power its AI services. A similar deal with Oracle should also help. As a result, end consumers can expect higher rate limits, more features in the Plus and free tiers, and faster deployment of new features. 4. What about OpenAI's for-profit direction? The PBC creation now allows OpenAI to remove the 100X return cap placed on investments.
It also allows the entity to make commercial investments to generate revenue, all while remaining under the control of the non-profit entity. It is likely that the Nvidia and Oracle deals will see investment flow into the for-profit entity rather than the non-profit. 5. How can the decision impact the world? The provision to generate more revenue and focus on making money for investors has been criticised by some, with the most vocal voice being Elon Musk, who accused Altman of moving OpenAI away from its original charter. The creation of the PBC might make OpenAI less of a "do good for humanity" company, but it will enable the company to move faster along the path to artificial general intelligence (AGI). When that happens, OpenAI's decision to choose public welfare or its self-interest could very well decide the future of the world.
[57]
Nvidia's $100 billion OpenAI play raises big antitrust issues - The Economic Times
Nvidia's biggest customer base is already relatively concentrated, with the two largest buyers accounting for 23% and 16% of its revenue in the second quarter of this year, according to its financial filings, which do not name the buyers. The scope of Monday's deal - in which Nvidia would invest up to $100 billion in OpenAI, and the latter would buy millions of chips from Nvidia -- goes to show "just how expensive frontier AI has become," said Sarah Kreps, director of the Tech Policy Institute at Cornell University.
[58]
Nvidia Is Investing $100 Billion in OpenAI as AI Datacenter Competition Grows
Chipmaker Nvidia will invest up to $100 billion in OpenAI and provide it with data center chips, the companies said on Monday, a tie-up between two of the highest-profile leaders in the global artificial intelligence race. The deal, which will see Nvidia start delivering chips as soon as late 2026, will involve two separate but intertwined transactions, according to a person close to OpenAI. The startup will pay Nvidia in cash for chips, and Nvidia will invest in OpenAI for non-controlling shares, the person said. The first $10 billion of Nvidia's investment in OpenAI, which was most recently valued at $500 billion, will begin when the two companies reach a definitive agreement for OpenAI to purchase Nvidia chips. Nvidia did not immediately respond to requests for clarification about the deal. The pact is among a spate of agreements between major technology players that includes years of investment in OpenAI from Microsoft and a deal last week between Nvidia and Intel to collaborate on AI chips. The two companies signed a letter of intent for a landmark strategic partnership to deploy at least 10 gigawatts of Nvidia chips for OpenAI's AI infrastructure. They aim to finalize partnership details in the coming weeks, with the first deployment phase targeted to come online in the second half of 2026. "Everything starts with compute," OpenAI CEO Sam Altman said in a statement. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." Nvidia shares were up 4.4 percent while shares of Oracle, which partners with OpenAI, SoftBank and Microsoft on the $500 billion Stargate AI data center project, gained nearly 5 percent. Nvidia's investment comes days after it committed $5 billion to struggling chipmaker Intel. OpenAI and its backer, Microsoft, also announced earlier this month that they have signed a non-binding deal for new relationship terms that would allow for OpenAI's restructuring into a for-profit company. Nvidia also backed OpenAI in a $6.6 billion funding round in October 2024. However, the world's most valuable firm making another sizeable investment in OpenAI could lead to antitrust scrutiny. The Trump administration has taken a much lighter touch on competition issues compared with former President Joe Biden's antitrust enforcers. In June 2024, the Justice Department and the Federal Trade Commission reached a deal that cleared the way for potential antitrust investigations into the dominant roles that Microsoft, OpenAI and Nvidia play in the artificial intelligence industry. Reporting by Arsheeya Bajwa in Bengaluru and Deepa Seetharaman and Stephen Nellis in San Francisco; Editing by Tasim Zahid and Anil D'Silva.
[59]
Jensen Huang Calls OpenAI The Next Multi-Trillion-Dollar Hyperscaler As Nvidia Pledges $100 Billion: 'This Is Some Of The Smartest Investment...' - Microsoft (NASDAQ:MSFT), CoreWeave (NASDAQ:CRWV)
Nvidia Corporation (NASDAQ: NVDA) CEO Jensen Huang said OpenAI is on track to become the world's next multi-trillion-dollar hyperscaler, defending his company's decision to invest up to $100 billion in its infrastructure buildout. Huang Defends Massive OpenAI Partnership: Speaking on the Open Source podcast with Bill Gurley and Brad Gerstner that was posted earlier this week, Huang described OpenAI as a once-in-a-generation opportunity. "OpenAI is likely going to be the world's next multi-trillion-dollar hyperscale company," Huang said. "If that's the case, the opportunity to invest before they get there, this is some of the smartest investments we can possibly imagine." He explained that Nvidia's role goes far beyond selling chips. The company will work with OpenAI "at the chip level, at the software level, at the systems level, at the AI factory level" to help it build and operate hyperscale AI infrastructure. Partnership Details: From Azure To Stargate. Huang said Nvidia is already collaborating with OpenAI across multiple projects, including Microsoft Corporation's (NASDAQ: MSFT) Azure expansion, Oracle Corp's (NYSE: ORCL) OCI buildout and CoreWeave's (NASDAQ: CRWV) infrastructure scaling. The new commitment adds to those deals, supporting OpenAI as it builds self-operated data centers at unprecedented scale. For the unversed, Nvidia and OpenAI have announced plans to deploy at least 10 gigawatts of Nvidia-powered systems beginning in 2026. That infrastructure could require up to 5 million GPUs and potentially generate $300 billion to $500 billion in long-term revenue, according to Wall Street estimates. Skeptics Raise 'Circular Investment' Concerns: Some analysts have questioned whether Nvidia's deal amounts to demand engineering, since OpenAI will buy massive amounts of Nvidia hardware with the help of Nvidia's funding. Bernstein analyst Stacy Rasgon acknowledged the "circular" concerns but said demand remains strong enough to ease fears for now. Bank of America analysts projected Nvidia could earn three to five times its $100 billion investment, while Evercore's Mark Lipacis boosted his price target to $225, arguing Wall Street still underestimates the company's growth trajectory. Nvidia's market capitalization now stands at $4.33 trillion, with shares gaining 46.73% over the past year and rising 28.83% so far in 2025, according to Benzinga Pro.
[60]
OpenAI Becomes One Of The First AI Giants To Announce Integration of NVIDIA's Vera Rubin AI Platform, Investing a Whopping $100 Billion
OpenAI isn't slowing down in the race for AI compute at all, as the firm has entered into a major partnership with NVIDIA, which will invest up to $100 billion in OpenAI's 'multi-GW' projects. The demand for AI power isn't drying up, since NVIDIA is entering into partnerships worth billions of dollars, focused on scaling up AI compute capabilities and delivering the best to its customers. Just a week ago, Team Green entered into a significant deal with Intel, and now, in a 'blockbuster' announcement, NVIDIA has said it will supply compute power worth $100 billion to OpenAI, with the AI firm set to deploy "at least 10 gigawatts of NVIDIA systems for OpenAI's next-generation AI infrastructure". The more important announcement here is that the collaboration is specifically focused on Vera Rubin. This means that OpenAI intends to acquire $100 billion worth of Rubin AI clusters, whether in the NVL144 configuration or even the newer Rubin CPX platform. This strategic partnership will make OpenAI one of the exclusive customers for Vera Rubin, allowing Sam Altman & Co to get access to one of the most powerful platforms out there. As NVIDIA put it, the two companies "have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," and this investment and infrastructure partnership marks the next leap forward, deploying 10 gigawatts to power the next era of intelligence. Based on power consumption alone, this deal could result in up to 40,000 Rubin AI racks being sold to OpenAI, a massive figure for a product that has yet to enter the market. Speaking of market entry, NVIDIA's Rubin lineup is set to enter volume production by H2 2026, which means that OpenAI and other firms will get access to the next generation of racks on the same timeline. Rubin features huge improvements, which we have already covered in previous posts, but for a quick summary, you are looking at gigantic gains versus Blackwell.
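As a rough cross-check, the ~40,000-rack estimate quoted above falls out of simple division; the minimal Python sketch below assumes a per-rack draw of about 250 kW for a Rubin NVL144-class rack, an illustrative figure rather than one stated in the article.

```python
# Back-of-envelope check on the ~40,000-rack figure cited above.
# ASSUMPTION: ~250 kW per Vera Rubin NVL144-class rack (illustrative only;
# the article does not state a per-rack power draw).

TOTAL_POWER_W = 10e9           # 10 gigawatts of planned capacity
ASSUMED_RACK_POWER_W = 250e3   # hypothetical all-in draw per Rubin rack

racks = TOTAL_POWER_W / ASSUMED_RACK_POWER_W
print(f"Implied rack count: {racks:,.0f}")  # -> 40,000
```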
[61]
Nvidia and OpenAI make $100 billion deal to build data centers
Nvidia will invest as much as $100 billion in OpenAI to support new data centers and other artificial intelligence infrastructure, a blockbuster deal that underscores booming demand for AI tools like ChatGPT and the computing power needed to make them run. The companies announced the agreement Monday, saying they'd signed a letter of intent for a strategic deal. The investment is meant to help OpenAI build data centers with a capacity of at least 10 gigawatts of power -- equipped with Nvidia's advanced chips to train and deploy AI models. The money will be provided in stages, with the first $10 billion coming when the deal is signed, according to people familiar with the matter. Nvidia is making the investment in cash and will receive OpenAI equity as part of the deal, said the people, who asked not to be identified because the talks were private. Further increments will follow when each gigawatt of computing power is deployed.
[62]
Microsoft-Backed OpenAI Would Get $100B In Nvidia Data Center Deal
'This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence,' Nvidia CEO Jensen Huang said. Microsoft-backed artificial intelligence model company OpenAI is finalizing a deal with Nvidia to receive up to $100 billion in investment as the two plan to deploy at least 10 gigawatts of AI data centers. San Francisco-based OpenAI will receive the funds progressively as each gigawatt deploys, according to statements the companies published Monday. The companies expect to finalize the details in the coming weeks. The first gigawatt should deploy in the second half of 2026 on Santa Clara, Calif.-based Nvidia's Vera Rubin platform. "Nvidia and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," Jensen Huang, founder and CEO of Nvidia, said in a statement Monday. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence." Nvidia, OpenAI Partnership CRN has reached out to Nvidia, OpenAI and Microsoft for comment. OpenAI co-founder and CEO Sam Altman added in a statement Monday that "compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." The news comes days after Microsoft said it will bring the most powerful AI data center in the world, with 337.6 megawatts of capacity, online in Wisconsin, and days after Nvidia said it plans to invest $5 billion in semiconductor rival Intel and jointly develop "multiple generations" of products with the chipmaker. The 10 gigawatts of power required by the Nvidia systems comes out to between 4 million and 5 million GPUs, the amount Nvidia will ship this year in total and twice the amount shipped last year, according to CNBC. As part of the deal, Nvidia will serve as OpenAI's preferred strategic compute and networking partner for AI factory growth plans. The two companies will co-optimize roadmaps for OpenAI's model and infrastructure software and Nvidia's hardware and software, according to Monday's statements. The work Nvidia and OpenAI do will complement existing partnerships the companies have with Microsoft, Oracle and SoftBank, according to the companies' statements Monday. OpenAI has teamed up with Oracle and SoftBank as part of the $500 billion Stargate venture. Microsoft leverages OpenAI's models for its Azure cloud service and productivity applications. The centers will have Nvidia systems and represent millions of graphics processing units (GPUs). OpenAI now counts more than 700 million weekly active users of its generative AI products. Nvidia's stock was up about 5 percent after market opening Monday, trading at about $183 a share as of Monday afternoon Eastern time.
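A quick way to sanity-check the GPU figure above is to divide the planned 10 gigawatts by the stated GPU count, which yields an implied all-in power budget per GPU; treating that number as chip power plus cooling, networking, and facility overhead is an illustrative assumption, not something stated in the article.

```python
# Implied all-in power budget per GPU at 10 GW of capacity.
# The 4-5 million GPU range comes from the article; the rest is arithmetic.

total_power_w = 10e9
for gpu_count in (4e6, 5e6):
    per_gpu_w = total_power_w / gpu_count
    print(f"{gpu_count / 1e6:.0f}M GPUs -> ~{per_gpu_w:,.0f} W per GPU, all-in")
# Roughly 2,000-2,500 W per GPU, a plausible figure once cooling, networking,
# and facility overhead are counted alongside the accelerator itself.
```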
[63]
Nvidia Leads Tech Rally With Plans to Invest Up to $100B in ChatGPT Maker OpenAI
Nvidia and OpenAI said they expect to finalize the details of the partnership in the coming weeks. Nvidia and OpenAI's business relationship is entering a new phase. The chipmaker at the heart of the AI boom and world's most valuable company said Monday it plans to invest up to $100 billion in ChatGPT maker OpenAI to build out AI data centers. Nvidia (NVDA) shares jumped over 4% following the news, making the stock one of the best performers in the Dow, Nasdaq, and S&P 500 Monday afternoon. Several Nvidia partners, including Super Micro Computer (SMCI) and Micron Technology (MU), saw their shares rise along with companies that make equipment to manufacture chips, boosting the tech sector. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence," Nvidia CEO Jensen Huang said in a release. Nvidia said the first of those 10 gigawatts is expected to be deployed in the second half of next year, on its next-generation Vera Rubin platform. OpenAI CEO Sam Altman said it will use what it's building with Nvidia to "create new AI breakthroughs and empower people and businesses with them at scale." The companies added they expect to finalize the details of the partnership in the coming weeks. With Monday's gains, Nvidia shares have added more than a third of their value since the start of the year.
[64]
Could Nvidia's $100 Billion Data Center Gamble Make It the Market's First $10 Trillion Stock? | The Motley Fool
Nvidia just committed up to $100 billion to OpenAI as the ChatGPT developer pursues next-generation artificial intelligence applications. Few companies in history have reshaped industries as profoundly as Nvidia (NVDA 0.27%). A company that began as a niche designer of graphics chips for video games has evolved into the undisputed leader of the artificial intelligence (AI) revolution. Today, Nvidia isn't just a chip company -- it's the bellwether investors use to gauge the direction that the computing sector is headed. Nvidia's share price moves on practically every headline tied to AI infrastructure, as its GPUs are powering a growing universe of generative AI applications across the globe. It's no wonder that over the past three years, Nvidia's meteoric rise has catapulted the company into the position of the most valuable business in the world as measured by market cap. Now, Nvidia is doubling down. Fresh off a $5 billion strategic investment in Intel, Nvidia just opened up its pocketbook again -- this time committing up to $100 billion to ChatGPT developer OpenAI. The question investors may be asking now is whether these aggressive moves could be the catalysts that propel Nvidia toward even greater milestones, potentially positioning it for another historic run upward -- this time, toward a $10 trillion valuation. On Sept. 22, Nvidia announced a letter of intent to supply 10 gigawatts of computing infrastructure to power OpenAI's next generation of models as the company pursues "superintelligence." Put simply, OpenAI requires unprecedented quantities of GPU clusters to train and scale its large language models -- and Nvidia has both the hardware and software that is needed. While Nvidia still has first-mover advantages in the GPU market, competition in that area has steadily intensified. Advanced Micro Devices offers lower-cost alternatives, while hyperscalers such as Microsoft, Alphabet, Amazon, and Meta Platforms are investing heavily in designing their own custom AI accelerator chips. Yet the OpenAI deal illustrates how Nvidia's competitive moat continues to expand despite these pressures. The deal underscores Nvidia's dominance in AI infrastructure -- securing long-term demand, revenue visibility, and ecosystem lock-in. By aligning itself closely with OpenAI, Nvidia further embeds itself at the center of generative AI adoption. In many ways, the agreement functions like a form of vertical integration -- ensuring that OpenAI's most advanced models continue to be built upon and optimized for Nvidia's platform well into the future. Because it's a private company, it's difficult to peg the true value of OpenAI. Media reports suggest it could generate around $20 billion in annual recurring revenue by December -- with projections as high as $125 billion by 2029. Sustaining that type of growth is ambitious, especially given rising competition from Perplexity, Anthropic, xAI, and Gemini. Moreover, achieving these targets will require astronomical expansions in its processing power -- hence the rationale behind OpenAI's deepening alliance with Nvidia. Wall Street currently expects Nvidia to generate $320 billion in annual revenues in 2027. Yet with the Intel and OpenAI deals in hand, that forecast already looks outdated. If Nvidia were to capture even 30% of OpenAI's projected annual recurring revenue, it could add nearly $40 billion of incremental revenue annually.
Perhaps even more significant than the figures above is the underlying signal of these deals: By committing its next-generation systems to Nvidia's backbone, OpenAI -- one of the most influential brands in the AI arena -- is effectively telling the market that Nvidia remains the gold standard for advanced computing. This endorsement is likely to echo across enterprises, cloud providers, and even government agencies, reinforcing Nvidia's platform as the default choice for AI infrastructure. This halo effect could push Nvidia's revenues far beyond the current analyst consensus, potentially into the neighborhood of $500 billion by 2030. Applying its three-year average price-to-sales ratio of 28 to that figure implies that by that year, its market cap could be well north of $10 trillion. Ultimately, precise time frames and dollar figures matter less than the broader takeaway: Nvidia stock continues to look like a durable, profitable, long-term holding for investors seeking exposure to the AI megatrend.
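The back-of-envelope math in the two preceding paragraphs is easy to reproduce; here is a minimal sketch in Python using the article's own projections (OpenAI ARR, capture rate, and price-to-sales multiple), which are speculative inputs rather than guidance.

```python
# Reproducing the article's rough math; all inputs are the article's own
# projections and assumptions, not financial guidance.

openai_arr_2029 = 125e9   # projected OpenAI annual recurring revenue by 2029
nvidia_capture = 0.30     # share of that ARR the article assumes Nvidia captures
incremental = openai_arr_2029 * nvidia_capture
print(f"Incremental Nvidia revenue: ${incremental / 1e9:.1f}B")  # ~$37.5B, "nearly $40 billion"

nvidia_rev_2030 = 500e9   # the article's hypothetical 2030 revenue figure
price_to_sales = 28       # three-year average P/S multiple cited in the article
market_cap = nvidia_rev_2030 * price_to_sales
print(f"Implied market cap: ${market_cap / 1e12:.0f}T")          # ~$14T, "well north of $10 trillion"
```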
[65]
OpenAI deal train arrives at Nvidia: All we know so far - The Economic Times
Nvidia and OpenAI have signed a $100-billion deal, marking a major move in the AI sector. The partnership will give OpenAI vital computing power to develop superintelligence, while Nvidia stands to benefit from OpenAI's shift to a for-profit model, strengthening both firms' positions in the fast-growing AI industry. A $100-billion deal between chipmaking giant Nvidia and Sam Altman's OpenAI is the latest in a string of deals between major technology companies in recent days. Both companies have been crucial to the burgeoning artificial intelligence (AI) sector -- Nvidia chips are critical to data centres used to train and run AI models, while OpenAI is one of the forerunners in the field. The deal between the two companies will help OpenAI secure much-needed compute, as it rushes to achieve superintelligence for its models. Nvidia can reap the dividends from the planned restructuring of OpenAI into a for-profit entity. Here's a look at the details of the deal: OpenAI will deploy at least 10 gigawatts (GW) of AI data centres equipped with Nvidia hardware. This would include millions of graphics processing units (GPUs) from the chipmaker and will be used for OpenAI's next-generation AI infrastructure. Nvidia will invest the promised $100 billion progressively, starting with $10 billion, as each GW is deployed. The first data centre is expected in the second half of 2026, based on Nvidia's Vera Rubin platform. OpenAI was recently valued at $500 billion. The finer details of the deal will be finalised in the coming weeks, OpenAI said in a statement. Nvidia will start investing in OpenAI for non-voting shares after terms are laid down. OpenAI can then use the cash to buy Nvidia's chips. Deals for days This has been a month of deals between technology majors. Oracle announced a five-year, $300 billion deal with OpenAI, which sent the company's stock soaring and briefly made founder Larry Ellison the richest person in the world. Last week, Nvidia said it would invest $5 billion in Intel to jointly develop next-generation custom data centre CPUs and PC system-on-chips (SOCs), integrating the former's AI and GPU technologies with the latter's x86 architecture. OpenAI is already working with Oracle and SoftBank on an ambitious $500-billion AI infrastructure project, Stargate. Announced in January, the project was originally supposed to build AI data centres in the US. However, stakeholders are now planning to expand the project to the UK, the UAE and India. Meanwhile, OpenAI recently signed a deal with long-time backer Microsoft to transition into a for-profit entity. Experts' take on the Nvidia-OpenAI deal Nvidia and OpenAI have been working together since the latter's early days. OpenAI has been trying to develop its own AI chips, like rivals Amazon and Google, but is dependent on Nvidia for its compute needs till that happens. "Deals like this should also ease concerns about lost sales in China," eMarketer analyst Jacob Bourne told Reuters. "It also throws cold water on the idea that rival chipmakers or in-house silicon from the big tech platforms are anywhere close to disrupting Nvidia's lead." "The deal could change the economic incentives of Nvidia and OpenAI as it could potentially lock in Nvidia's chip monopoly with OpenAI's software lead. It could potentially make it more difficult for Nvidia competitors like AMD in chips or OpenAI's competitors in models to scale," said Andre Barlow, an antitrust lawyer with Doyle, Barlow & Mazard.
[66]
AI revolution hits next stage of growth in Nvidia/OpenAI deal: Wedbush (NVDA:NASDAQ)
Nvidia (NASDAQ:NVDA) and OpenAI's massive deal announced on Monday signals no slowdown in the artificial intelligence buildout, as Wedbush says the AI revolution is entering a new stage of growth. The numbers behind the deal are staggering. Nvidia's $100B investment to build 10GW of data center capacity with OpenAI signals ongoing acceleration in AI development and a new stage of sector growth. Providing this volume of GPUs, matching Nvidia's projected 2025 output and exceeding 2024 levels, cements Nvidia's leadership and strengthens AI infrastructure. Wedbush believes strong AI investments will help companies like Palantir further grow into their valuations and justify high targets, despite typical valuation concerns.
[67]
Nvidia to Invest Up to $100 Billion in OpenAI, Setting Private Funding Record | PYMNTS.com
The deal ties the investment to the rollout of "at least 10 gigawatts of Nvidia systems for OpenAI's next-generation AI infrastructure to train and run its next generation of models on the path to deploying superintelligence," according to a Monday (Sept. 22) press release. OpenAI will buy millions of Nvidia AI processors. The investment will support the progressive deployment, including data center and power capacity, with the first gigawatt of Nvidia systems to be deployed in the second half of next year, the release said. The deal is the largest private-company investment on record, provided Nvidia invests the full $100 billion, the Financial Times reported Monday. The partnership cements Nvidia's dominance in AI compute by ensuring its chips remain at the core of OpenAI's stack for training and inference. OpenAI will build the infrastructure primarily in the United States, the report said. The project will use Nvidia's upcoming Vera Rubin platform, the successor to its Blackwell chips. Each phase of deployment will trigger a new investment from Nvidia. Nvidia will invest $10 billion when the first gigawatt is deployed, according to the report. Later tranches will be priced at OpenAI's prevailing valuation of $500 billion. The investment gives OpenAI capital and the secured supply of the hardware it needs to continue scaling. The company has signed several large agreements in recent months, including a $300 billion, 5-year contract with Oracle to supply compute infrastructure for training and inference workloads. That deal requires more than 4 gigawatts of electricity and positions Oracle as a central provider for OpenAI's workloads. For Nvidia, the arrangement secures billions of dollars in chip sales and ensures that its technology remains embedded in OpenAI's stack, anchoring demand as the market shifts from training massive models to serving them efficiently at scale. For OpenAI, the partnership provides a long-term hedge against supply shortages and rising hardware costs. The Oracle contract locks in cloud services at a multiyear scale, while the Nvidia deal ensures hardware availability. By layering in Nvidia's staged equity commitment, OpenAI reduces execution risk while retaining access to processors. Some execution risk remains, however. Before the first deployment, technological, regulatory and power challenges could emerge. OpenAI is also hedging by exploring custom chips with Broadcom, which could reduce its reliance on Nvidia in the long term. But the deal, as structured, keeps Nvidia central to OpenAI's near-term roadmap and signals to the market that compute remains the resource shaping the pace of generative AI adoption.
[68]
Nvidia-OpenAI Deal Sparks 'Circular' Investment Concerns, Says Analyst: 'Incremental Worries Around...' - NVIDIA (NASDAQ:NVDA), Oracle (NYSE:ORCL)
Nvidia Corporation's (NASDAQ: NVDA) recent investment in OpenAI has sparked investor concerns that the startup might buy Nvidia chips, creating a "circular" investment loop and raising questions about the move's rationale and potential impact on Nvidia's stock. Nvidia's OpenAI Deal Sparks Circular Sales Concerns Stacy Rasgon, an analyst at Bernstein, acknowledged that the OpenAI deal "will clearly fuel 'circular' concerns." These concerns revolve around Nvidia investing in startups that subsequently purchase Nvidia chips, or reselling cloud services built on GPUs sold specifically for that purpose, reported MarketWatch. The OpenAI deal has raised concerns, though Nvidia clarified that the investment funds won't be used to buy its own products. The Sam Altman-led company made a major cloud deal with Oracle (NYSE: ORCL), potentially boosting Oracle's Nvidia chip purchases, while Nvidia has also invested in OpenAI. Bank of America analysts, led by Vivek Arya, project that Nvidia's investment in OpenAI could yield 3-5× returns, potentially generating $300-$500 billion in revenue, but they caution that the size of the investment in a single customer may raise perception concerns, "until [Nvidia] clarifies the appropriate accounting treatment." Rasgon believes current demand looks strong, and the news is likely positive for Nvidia -- for now. "...incremental worries around sustainability don't seem to present any imminent threat for now," he said. Analysts Confident About Nvidia-OpenAI Nvidia's plan to invest up to $100 billion in OpenAI could generate billions in long-term revenue, reinforcing its dominance in the booming artificial intelligence market. This partnership cements Nvidia as OpenAI's preferred compute and networking partner. Evercore's Mark Lipacis raised his price target to $225 from $214, believing Wall Street is underestimating the company. He sees its OpenAI deal as a confidence booster and thinks investing is worthwhile despite concerns of a potential bubble in cloud providers, given the stock trades below its nine-year median forward P/E of 36, as per MarketWatch. The company's strategy of funding AI data centers stacked with millions of Nvidia GPUs ensures its chips remain the backbone of the AI boom. This move, which some see as a masterstroke in demand engineering, has been applauded by Wall Street. Nvidia has previously written smaller checks to AI cloud firms like CoreWeave and Lambda, helping them scale capacity while ensuring those servers were powered by its silicon. According to Benzinga Edge Stock Rankings, Nvidia has a growth score of 97.90% and a momentum rating of 87.29%.
[69]
Bank of America Nvidia-OpenAI call puts half trillion on the table
Just when you think Nvidia NVDA can't go any bigger, it does. In Nvidia's latest AI flex, the tech behemoth announced a whopping $100 billion investment in OpenAI. The deal anchors the rollout of 10+ gigawatts of Nvidia systems to efficiently train and run the next wave of models, with deliveries set to begin in the second half of 2026. Think utility-scale compute, but for artificial intelligence. Nvidia's OpenAI investment at a glance: * Staged LOI: Letter of intent to deploy ≥10 GW of NVIDIA systems; with Nvidia to shell out $100 billion as capacity is built out. * Cash-for-chips loop: Nvidia's cash buys non-voting OpenAI equity, with OpenAI using those proceeds to scoop up Nvidia's chips/systems. * Milestone tranches: First $10 billion at signing, with additional increments linked to each gigawatt deployed (multi-year rollout). OpenAI's reach justifies the scale. ChatGPT boasts a mind-boggling user base exceeding 700 million weekly active users, with the company exploring a secondary market that could peg its value around $500 billion. This is typically a waypoint that keeps IPO chatter alive. For Nvidia, this is home turf. It owns the AI training market with an incredible 90%+ share, and the stock's $4 trillion milestone in July is a clear tell on shareholder conviction. More importantly, for its business, the roadmap expands, from Blackwell to next-gen chips geared to video and software generation, to the switches connecting it all. All of this has caught the eye of Bank of America analysts, who revamped their Nvidia OpenAI outlook as Wall Street resets expectations across the entire AI trade. Bank of America says Nvidia-OpenAI could be half-trillion-dollar engine Bank of America's Vivek Arya is leaning in hard on Nvidia NVDA, following its OpenAI tie-up, hailing the sales opportunity that could reach $300 billion to $500 billion over time. "The partnership includes a letter of intent for NVDA to be involved in at least 10 gigawatts of systems, starting in the second half of 2026 with Vera Rubin," the team wrote, effectively saying that the haul could be a 3x to 5x return on Nvidia's planned $100 billion outlay. Vera Rubin is Nvidia's next-generation AI accelerator, the successor to Blackwell. Where Blackwell sets new computing power records, Rubin takes things up a notch or two. In fact, it dishes out 50 petaflops of FP4 inference, more than double the 20 petaflops that Blackwell can achieve. On a bigger scale, a full Rubin rack can potentially provide 3.3× the performance of a comparable Blackwell Ultra rack. The other needle-mover is that OpenAI will work closely with Nvidia as a preferred strategic compute and networking partner, expanding the company's lead over competitors like Broadcom and AMD. It matters a ton for investors, as the companies say the buildout targets at least 10 GW of Nvidia systems, with initial capacity landing in late 2026. However, we could see its impact in order books well before then. Also, Bank of America reiterated its buy rating on Nvidia with a $215 price target following the announcement, keeping it at the top of its preferred list as AI infrastructure spend grows. Moreover, if the LOI turns into iron-clad purchase commitments, BofA's half-trillion math becomes a multi-year revenue engine. Nvidia's $100 billion OpenAI bet comes with bold words from Huang, Altman Nvidia's power-packed $100 billion deal with OpenAI isn't just about hardware; it's about staking a claim on the future of AI.
CEO Jensen Huang called it "the next leap forward," with plans to deploy at least 10 gigawatts of Nvidia systems, powering "the next era of intelligence." Also, OpenAI chief Sam Altman was just as blunt: "Everything starts with compute... we will utilize what we're building with NVIDIA to empower people and businesses at scale." On Wall Street, analysts are buzzing; the scale is jaw-dropping. Stacy Rasgon at Bernstein called the move "positive" for OpenAI's compute ambitions but warned of "circular" concerns over Nvidia's investment layering back into chip revenues. Gene Munster of Deepwater also called it the "latest salvo" in the AI arms race, raising the bar for rivals. Also, Gil Luria at D.A. Davidson flagged risks of Nvidia looking like an "investor of last resort" if OpenAI struggles to monetize. The structure itself, though, seems tricky. Nvidia takes non-voting shares, and OpenAI can effectively use the proceeds to buy Nvidia's chips. While the setup looks neat on paper, it could unravel if tranche triggers slip or deal terms change. And if OpenAI's build plan wobbles, Nvidia still has limited control despite the outlay, so things could get even more complicated. Then there's execution and scrutiny. Rolling out 10 GW only starts in the second half of 2026, with access to electricity being a real bottleneck that can slow timelines. Additionally, regulators are eyeing the tie-up given Nvidia's chip dominance. Key takeaways: * 10 GW buildout locked in, with the first 1 GW slated for the second half of 2026 (Vera Rubin). * OpenAI names Nvidia as the preferred computing and networking partner. * Analysts peg AI data centers at a whopping $50 billion per gigawatt, hinting at significant revenue potential.
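The quoted figures hang together arithmetically; here is a minimal sketch in Python using only numbers from the article (BofA's 3x-5x return estimate and the roughly $50 billion-per-gigawatt data center cost analysts cite).

```python
# Sanity check on the figures quoted above; inputs are the article's own numbers.

investment = 100e9                     # Nvidia's planned outlay
low, high = 3 * investment, 5 * investment
print(f"BofA revenue range: ${low / 1e9:.0f}B - ${high / 1e9:.0f}B")      # $300B - $500B

cost_per_gw = 50e9                     # analysts' rough cost per gigawatt of AI data center
gigawatts = 10                         # planned buildout
print(f"Implied buildout value: ${cost_per_gw * gigawatts / 1e9:.0f}B")   # ~$500B
```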
[70]
Nvidia Says It's Investing Up to $100B in ChatGPT Maker OpenAI
Nvidia and OpenAI's business relationship is entering a new phase. The chipmaker at the heart of the AI boom and world's most valuable company said Monday it plans to invest up to $100 billion in ChatGPT maker OpenAI to build out AI data centers. Shares of Nvidia (NVDA) jumped 4% in recent trading following the news. They've added more than a third of their value since the start of the year, as demand for the company's AI chips surged. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence," Nvidia CEO Jensen Huang said in a release. Nvidia said the first of those 10 gigawatts is expected to be deployed in the second half of next year, on its next-generation Vera Rubin platform. OpenAI CEO Sam Altman said it will use what it's building with Nvidia to "create new AI breakthroughs and empower people and businesses with them at scale." The companies said they expect to finalize the details of the partnership in the coming weeks.
[71]
Did Nvidia Just Repeat Cisco's Mistake and Build a House of Cards With OpenAI Investment? | The Motley Fool
Nvidia's (NVDA 0.27%) announcement that it will invest up to $100 billion in OpenAI is being hailed by the company as a massive bet on the future of artificial intelligence (AI). Still, investors should take a closer look at what is really going on here. The money OpenAI receives will ultimately be plowed right back into Nvidia hardware, mostly through Oracle's cloud buildout, where the two companies recently signed a massive $300 billion deal. OpenAI plans to deploy Nvidia systems that need 10 gigawatts of power, which is equal to roughly 4 million to 5 million graphics processing units (GPUs). If that sounds like a lot, it is, as it's about the same total number of GPUs that Nvidia will ship this year. The first $10 billion of Nvidia's investment will be deployed as soon as the first gigawatt of capacity is up, and the rest will be rolled out in stages as new data centers come online. On paper, the OpenAI investment helps secure billions of dollars in future demand. But it's worth remembering that Nvidia is now helping finance one of its biggest customers to keep buying its chips. This is called circular financing. Nvidia is essentially funding its own demand. This is exactly what Cisco Systems (CSCO -0.93%) did during the internet bubble, when it provided credit to telecoms so they could buy more Cisco routers. Those sales looked great -- until the capital dried up and the entire market collapsed. This is also a defensive move by Nvidia. More and more of Nvidia's largest customers are designing their own custom AI chips. Alphabet has its TPUs, Amazon has Trainium and Inferentia, and Microsoft is working on its own chip. OpenAI itself has been developing custom chips to bring its costs down, and before this announcement, it placed a $10 billion order with Broadcom for custom chips to be delivered next year. This is the same threat that Nvidia saw play out in crypto, where ASICs (application specific integrated circuits) displaced GPUs for Bitcoin mining. Nvidia doesn't want to see that happen again. By investing in OpenAI, it's trying to keep one of its biggest customers locked into the Nvidia ecosystem. This also comes at a time when the market is shifting more toward inference, where Nvidia's moat is much smaller. Training large language models (LLMs) is where Nvidia's CUDA software platform shines. However, inference isn't as complex and doesn't require the same deep software integration. That's why hyperscalers (owners of massive data centers) are so motivated to build custom chips. Inference is also a continuous cost, so the economics of cost per inference start to dominate the discussion. That's why Nvidia also took a $5 billion stake in Intel and announced a collaboration on AI processors, as it's also trying to stave off Advanced Micro Devices in the inference market and keep its grip on this next phase of AI computing. There's no question that Nvidia is in a dominant position right now, and the OpenAI deal only strengthens its near-term outlook. But its OpenAI investment clearly looks like a defensive move that adds risk. When Cisco used circular financing during the internet boom, it looked brilliant, until the customers it was funding went bust. Both Nvidia and OpenAI are better positioned, but the principle is the same: Nvidia is using its balance sheet to keep demand high. That works as long as the AI boom keeps running, but it makes the company more exposed if spending slows or if hyperscalers switch to cheaper solutions. 
Nvidia remains the key player in AI infrastructure, but this deal is a reminder that its growth isn't risk-free. A lot of Nvidia's success is now riding on an unprofitable company that is bleeding massive amounts of cash and that Nvidia itself is helping to finance. OpenAI hasn't actually proven yet that it has a great business model, and if it fails, this becomes a house of cards that collapses onto Nvidia.
[72]
Nvidia to invest up to $100 bil. in OpenAI, linking two artificial intelligence titans - The Korea Times
SAN FRANCISCO -- Nvidia will invest up to $100 billion in OpenAI and supply it with data center chips, the companies said on Monday, marking a tie-up between two of the highest-profile players in the global artificial intelligence race. The move underscores the increasingly overlapping interests of the various tech giants developing advanced AI systems. The deal gives chipmaker Nvidia a financial stake in the world's most prominent AI company, which is already an important customer. At the same time, the investment gives OpenAI the cash and access it needs to buy advanced chips that are key to maintaining its dominance in an increasingly competitive landscape. Rivals of both companies may be concerned the partnership will undermine competition. The deal will involve two separate but intertwined transactions, according to a person close to OpenAI. Nvidia will start investing in OpenAI for non-voting shares once the deal is finalized, then OpenAI can use the cash to buy Nvidia's chips, the person said. "Everything starts with compute," OpenAI CEO Sam Altman said in a statement. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." The two companies signed a letter of intent to deploy at least 10 gigawatts of Nvidia systems for OpenAI and said they aim to finalize partnership details in the coming weeks. The power for those chips is equivalent to the needs of more than 8 million U.S. households. Nvidia shares rose as much as 4.4 percent after the announcement to a record intraday high, while data center builder Oracle gained about 6 percent. Oracle is working with OpenAI, SoftBank and Microsoft on a $500 billion project called Stargate, a plan to build massive AI data centers around the world. Under the new deal, once the two sides reach a definitive agreement for OpenAI to purchase Nvidia systems, Nvidia will invest an initial $10 billion, the person familiar with the matter said. OpenAI was most recently valued at $500 billion. Nvidia will start delivering hardware as soon as late 2026, with the first gigawatt of computing power to be deployed in the second half of that year on its upcoming platform, named Vera Rubin. Analysts said the deal was positive for Nvidia but also voiced concerns about whether some of Nvidia's investment dollars might be coming back to it in the form of chip purchases. "On the one hand this helps OpenAI deliver on what are some very aspirational goals for compute infrastructure, and helps Nvidia ensure that that stuff gets built. On the other hand the 'circular' concerns have been raised in the past, and this will fuel them further," said Bernstein analyst Stacy Rasgon. OpenAI, like Google, Amazon and others, has been working on plans to build its own AI chips, aiming for a cheaper alternative to Nvidia. A person familiar with the matter said the deal does not change any of OpenAI's ongoing compute plans, including that effort or its partnership with Microsoft. OpenAI was working on a custom chip with designer Broadcom and Taiwan Semiconductor Manufacturing Co, Reuters reported earlier this year. Broadcom shares were down 0.8 percent after the news. Broader industry moves The pact is the latest in a series of agreements between major technology players. Microsoft has invested billions in OpenAI since 2019, and Nvidia last week unveiled a collaboration with Intel on AI chips. 
Nvidia also committed $5 billion to Intel earlier this month and backed OpenAI in a $6.6 billion funding round in October 2024. The scale of Nvidia's latest commitment could attract antitrust scrutiny. The Justice Department and Federal Trade Commission reached a deal in mid-2024 that cleared the way for potential probes into the roles of Microsoft, OpenAI and Nvidia in the AI industry. However, the administration of U.S. President Donald Trump has so far taken a lighter approach to competition issues than the previous Biden administration. OpenAI and its main backer Microsoft also announced earlier this month that they had signed a non-binding agreement to restructure OpenAI into a for-profit entity, signaling further changes in the governance of the fast-growing AI company. "The deal could change the economic incentives of Nvidia and OpenAI as it could potentially lock in Nvidia's chip monopoly with OpenAI's software lead. It could potentially make it more difficult for Nvidia competitors like AMD in chips or OpenAI's competitors in models to scale," said Andre Barlow, an antitrust lawyer with Doyle, Barlow & Mazard. He added that the Trump administration has taken a pro-business approach to regulations, removing hurdles that would slow AI growth.
[73]
Strategic deal reinforces Nvidia's AI dominance - Nvidia to Invest $100B in OpenAI
Jacob Bourne of eMarketer observes that demand for Nvidia GPUs is essentially built into the development of frontier AI models. This partnership also reduces concerns about lost sales in China and signals that rival chipmakers or Big Tech platforms are unlikely to disrupt Nvidia's lead. For OpenAI, the deal represents greater independence as it diversifies from its Microsoft partnership. Gil Luria of D.A. Davidson expresses concern that Nvidia might be acting as an "investor of last resort" for OpenAI. David Wagner of Aptus Capital Advisors highlights that this move aligns with CEO Jensen Huang's long-term AI investment strategy, even earlier than expected. Stacy Rasgon of Bernstein notes that while the deal helps OpenAI achieve ambitious compute goals, it could intensify concerns about circular dependencies between the two companies.
[74]
What Will Nvidia's OpenAI Investment Reap? - NVIDIA (NASDAQ:NVDA)
Nvidia NVDA plans to invest up to $100 billion in OpenAI, a move that could generate billions in long-term revenue. The partnership cements Nvidia as OpenAI's preferred compute and networking partner, reinforcing its dominance in the booming artificial intelligence market. Bank of America Securities analyst Vivek Arya maintained a Buy on Nvidia with a price forecast of $235. Arya highlighted the company's unmatched role in powering the AI buildout -- the largest and fastest-growing technology investment in history. The analyst argued that Nvidia's decision to commit up to $100 billion in OpenAI marks both a bold financial move and a strategic reinforcement of its moat. On Monday, Nvidia announced a letter of intent with OpenAI to deploy at least 10 gigawatts of systems beginning in the second half of 2026 with the Vera Rubin platform. Arya estimated this initiative could ultimately deliver $300 billion to $500 billion in revenue for Nvidia, translating into a three to five times return on investment. The analyst emphasized that the partnership also cements Nvidia as OpenAI's preferred strategic compute and networking partner, which may intensify competitive risks for peers such as Broadcom AVGO and Advanced Micro Devices AMD. He acknowledged investor concerns over the optics of Nvidia financing a customer. Arya expects Nvidia will classify the deal like its prior CoreWeave CRWV equity investment, with OpenAI treated as a commercial client rather than an outlier. The analyst framed the potential revenue contribution in context: OpenAI could represent 15% to 25% of Nvidia's sales, similar to its largest existing hyperscale customers, while Nvidia could generate triple that revenue from 2026 to 2030 overall. From Arya's perspective, the $100 billion commitment reflects a strategic deployment of free cash flow. With margins at 40% to 50% and an annual topline approaching $200 billion, Nvidia is positioned to generate hundreds of billions in free cash over the coming years, according to the analyst. Arya stated that investing in external public assets has become less practical, given regulatory constraints and limited strategic alignment. Instead, the analyst noted that Nvidia is channeling its cash into ecosystem investments that expand its addressable market, accelerate product timelines, and even deliver geopolitical benefits -- similar to Intel's INTC recent ecosystem funding. At roughly 30 times calendar 2026 earnings and less than 1 times Price/Earnings to Growth (PEG) ratio, Arya sees Nvidia as attractively valued compared to large-cap growth peers that typically trade at closer to 2 times PEG. NVDA Price Action: NVIDIA shares were down 3.13% at $177.85 at the time of publication on Tuesday. The stock is approaching its 52-week high of $184.55, according to Benzinga Pro data.
[75]
NVIDIA Invests $100 Billion in OpenAI to Power Next AI Breakthroughs
The new partnership mainly focuses on two interrelated deals. After a definitive contract is signed, NVIDIA will initially invest $10 billion in OpenAI through non-voting shares. OpenAI will, in turn, use the money to buy NVIDIA's high-performance chips, which provide the foundation for next-generation AI development. The two companies have signed a letter of intent committing to supply a minimum of 10 gigawatts of NVIDIA systems for OpenAI, and final details will be worked out within the next few weeks. This ensures that OpenAI has a pipeline of state-of-the-art computing power to support its rapidly growing AI models.
[76]
Nvidia investing up to $100B in ChatGPT owner OpenAI to create 'new...
Chipmaker Nvidia will invest up to $100 billion in ChatGPT owner OpenAI and provide it with data center chips, the companies said Monday, a tie-up between two of the highest-profile leaders in the global artificial intelligence race. The deal, which will see Nvidia start delivering chips as soon as late 2026, will involve two separate but intertwined transactions, according to a person close to OpenAI. The startup will pay Nvidia in cash for chips, and Nvidia will invest in OpenAI for non-controlling shares, the person said. The first $10 billion of Nvidia's investment in OpenAI, which was most recently valued at $500 billion, will begin when the two companies reach a definitive agreement for OpenAI to purchase Nvidia chips. Nvidia did not immediately respond to requests for clarification about the deal. The pact is among a spate of agreements between major technology players that includes years of investment in OpenAI from Microsoft and a deal last week between Nvidia and Intel to collaborate on AI chips. The two companies signed a letter of intent for a landmark strategic partnership to deploy at least 10 gigawatts of Nvidia chips for OpenAI's AI infrastructure. They aim to finalize partnership details in the coming weeks, with the first deployment phase targeted to come online in the second half of 2026. "Everything starts with compute," OpenAI CEO Sam Altman said in a statement. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." Nvidia shares were up 3% while shares of Oracle, which partners with OpenAI, SoftBank, and Microsoft on the $500 billion Stargate AI data center project, gained nearly 5%. Nvidia's investment comes days after it committed $5 billion to struggling chipmaker Intel. OpenAI and its backer Microsoft also announced earlier this month that they have signed a non-binding deal for new relationship terms that would allow for OpenAI's restructuring into a for-profit company. Nvidia also backed OpenAI in a $6.6 billion funding round in October 2024. The world's most valuable firm making another sizable investment in OpenAI could lead to antitrust scrutiny. The Trump administration has taken a much lighter touch on competition issues compared to President Biden's antitrust enforcers. In June 2024, the Justice Department and the Federal Trade Commission reached a deal that cleared the way for potential antitrust investigations into the dominant roles that Microsoft, OpenAI and Nvidia play in the artificial intelligence industry.
[77]
Nvidia to invest US$100 billion in OpenAI
Chipmaker Nvidia plans to invest up to US$100 billion in artificial intelligence startup OpenAI under a new agreement, the companies said on Monday, as competition intensifies among technology giants to secure access to energy and chips needed for AI growth. The companies unveiled a letter of intent for a landmark strategic partnership to deploy at least 10 gigawatts of Nvidia chips for OpenAI's AI infrastructure. They aim to finalize partnership details in the coming weeks, with the first deployment phase targeted to come online in the second half of 2026. "Everything starts with compute," Sam Altman, CEO of OpenAI, said in a release. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." Shares of Oracle, a partner with OpenAI, SoftBank, and Microsoft on the $500 billion Stargate AI data center project, gained nearly five per cent. Nvidia's investment comes days after it committed $5 billion to struggling chipmaker Intel.
[78]
Nvidia Makes a Familiar Move With $100 Billion Investment in OpenAI. Will It Pay Off? | The Motley Fool
Nvidia (NVDA 0.35%) is at it again. Just days after surprising the market with a $5 billion investment in Intel, it's forming a similar tie-up with OpenAI. However, the scale of this deal will be much larger: Nvidia plans to invest $100 billion in OpenAI over time as part of a project to build 10 gigawatts of artificial intelligence (AI) data centers, with millions of Nvidia GPUs powering OpenAI's next-generation AI infrastructure. Nvidia stock jumped 4% on the news, hitting an all-time high. The two companies signed a letter of intent for this "landmark strategic partnership" to further OpenAI's ambitions of achieving artificial superintelligence. With 10 gigawatts, one could power between 4 million and 5 million GPUs, which Nvidia CEO Jensen Huang said is about the volume that the company will ship this year. The first phase of this ambitious project is expected to come online in the second half of 2026, using chips built on Nvidia's upcoming Vera Rubin platform. OpenAI and Nvidia have already been working together for years, and OpenAI has run on Nvidia technology since its early days. As for the $100 billion Nvidia plans to invest, the timeline is unclear. The chipmaker simply says it plans to invest that money "progressively as each gigawatt is deployed." The companies said they would finalize the details over the coming weeks. Nvidia has engaged in similar maneuvers before. It has built up a portfolio of investments in companies that are often its customers, and sometimes even its suppliers. Among the public companies that Nvidia owns stakes in are CoreWeave and Nebius, the two leading AI neocloud providers; Arm Holdings, whose CPU designs it licenses; Applied Digital, with which it collaborates on high-performance computing data centers; and Intel, a struggling chipmaker that Nvidia just invested $5 billion in as part of a larger partnership. The OpenAI deal, if it fulfills its ambition, would by far be Nvidia's biggest investment to date. It's also the most valuable company Nvidia has invested in. OpenAI was valued at $300 billion in its last funding round in March, but in August, it was flirting with a $500 billion valuation in its plan for a secondary stock sale. While $100 billion is a huge amount of money for any company, it's a manageable sum for Nvidia, which will generate approximately that much in net income this year. Investing that much will also make it a major shareholder in OpenAI, but the move makes sense for Nvidia. OpenAI is the largest and most influential AI start-up, and much of the money that Nvidia is investing will come right back to it via the purchases of GPUs to power those data centers. In fact, the investment serves as an incentive for OpenAI to buy Nvidia's products. It also helps block challengers like AMD from cutting into Nvidia's huge market share lead in the data center GPU market. Nvidia seems to be making all the right moves lately in AI, both technologically and strategically, and forming a network of partnerships should only entrench its market leadership. It also gives it a portfolio of investments that could deliver strong returns. As long as the AI boom persists, Nvidia looks like a great stock to own. As recent reports by the likes of Oracle and others show, the dollars are still flowing into AI infrastructure. As they do, expect Nvidia's stock price to continue to move higher over the long term.
[79]
How Nvidia Became the Banker of Artificial Intelligence | Investing.com UK
Nvidia's (NASDAQ:NVDA) $100 billion investment in OpenAI is more than a financial lifeline to the world's most high-profile AI startup. It tells the story of how a gaming chip company transformed into the central banker of the artificial intelligence economy. By financing its own customers, Nvidia has secured long-term demand for its products, lifted global equity markets, and reshaped credit dynamics. Yet the same strategy ties its fortunes to unprofitable startups, raising questions about sustainability. Nvidia's story begins in video games. For years, its graphics cards were best known for powering immersive visuals. Then researchers realized GPUs could accelerate deep learning models. That discovery unlocked an entirely new market, one that would ultimately reshape global technology. As artificial intelligence moved from research labs to boardrooms, Nvidia became the indispensable supplier of computing power. Demand soared, but so did the cost of scaling. Training advanced AI models required thousands of GPUs and data centers consuming vast amounts of energy. Startups like OpenAI had the ambition but not the balance sheets. Traditional financing came with interest rates as high as 15 percent, a level that reflected deep skepticism about the economics of AI. Without a new source of capital, growth risked stalling. Instead of leaving its customers to struggle, Nvidia took matters into its own hands. The company committed $100 billion to support OpenAI's data center expansion. According to New Street Research, every $10 billion of Nvidia financing allows OpenAI to spend $35 billion on Nvidia hardware. The trade-off is clear: Nvidia sacrifices some margin but locks in orders at a scale no competitor can match. Investors approved. On the day of the announcement, Nvidia's market capitalization jumped $160 billion, reflecting confidence in CEO Jensen Huang's strategy of using financial muscle to secure technological dominance. OpenAI is only one part of Nvidia's growing financial empire. Deals like these show Nvidia is no longer just a supplier. It now acts as a financier, risk manager, and strategic allocator of capital across the AI ecosystem. The model is what analysts call "circularity." Nvidia finances customers, those customers buy Nvidia chips, and the cycle reinforces itself. This keeps the AI boom alive but raises questions about long-term stability. OpenAI, for instance, has 700 million weekly users but expects $44 billion in losses through 2029 before turning profitable. By lowering its borrowing costs, Nvidia keeps OpenAI afloat, but it also assumes risks that banks and bond markets have been reluctant to take. This circularity could fuel years of growth if AI monetization accelerates. If it does not, Nvidia may find itself propping up demand that cannot stand on its own. Equities: Nvidia's stock is the flagship for AI optimism. Its strategy extends the boom, but reliance on financing customers could inflate valuations beyond fundamentals. Bonds: Lower credit risk for Nvidia-backed projects narrows spreads. Yet Oracle's (NYSE:ORCL) negative outlook from Moody's shows how AI-related financing can stress balance sheets outside Nvidia's orbit. Energy and Commodities: The expansion of AI data centers implies multi-decade growth in electricity demand. Utilities, renewable energy firms, and even natural gas producers stand to benefit. Currencies: U.S. dominance in AI hardware and financing strengthens the dollar's global role.
At the same time, speculative AI borrowing could create volatility for emerging markets dependent on dollar liquidity. Nvidia's rise to AI banker status creates both extraordinary opportunities and hidden risks. Nvidia began as a gaming chipmaker. Today, it is the banker of artificial intelligence. Its unique model of financing customers secures growth but exposes it to the fragility of the very companies it supports. For investors, the key question is whether this new banking role will deliver sustainable returns or whether it risks creating an AI bubble built on borrowed time.
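The financing leverage described in this piece is straightforward to quantify; the sketch below uses the New Street Research ratio cited above, and the extrapolation to the full $100 billion is an illustration rather than a figure from the article.

```python
# Financing leverage implied by the New Street Research ratio cited above.
# The full-$100B extrapolation is illustrative, not a figure from the article.

financing_per_tranche = 10e9        # Nvidia financing per tranche
hardware_spend_per_tranche = 35e9   # resulting OpenAI hardware spend, per New Street Research
leverage = hardware_spend_per_tranche / financing_per_tranche
print(f"Leverage: {leverage:.1f}x")                                          # 3.5x

total_financing = 100e9
print(f"Implied hardware spend: ${total_financing * leverage / 1e9:.0f}B")   # ~$350B
```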
[80]
Nvidia To Invest Up to $100 Billion in OpenAI To Meet Rapid Demand for Artificial Intelligence
Nvidia is investing up to an additional $100 billion in the company behind ChatGPT, OpenAI, as part of an expansion of their partnership to build and deploy the computing power to support the rapid growth of artificial intelligence. The arrangement calls for deploying at least 10 gigawatts of new AI data computing power, roughly the equivalent of the electricity used by 10 million homes. The investment will be provided in stages as each gigawatt goes online. Nvidia's CEO, Jensen Huang, says the project is needed because computing demand for artificial intelligence is "going through the roof." "This is the biggest AI infrastructure project in history," Mr. Huang told CNBC on Monday. "This is the largest computing project in history." The first phase is targeted to come online in the second half of 2026. Mr. Huang says AI is entering its "industrial revolution" and says that in the near future, everything done online will be touched by AI. "It's a big deal," Mr. Huang says. OpenAI CEO Sam Altman says the company has been under "horrible" constraints due to insufficient computing capacity. "Now that we really see what is on the near-term horizon of how good the models are getting, the new use cases that are being enabled, what people want to do, this is like the fuel we need to drive improvement, to drive better models, to drive everything," says Mr. Altman, who joined Mr. Huang and the president of OpenAI, Greg Brockman, in the interview. OpenAI and Nvidia have been working together for nearly a decade, using Nvidia hardware for training early GPT-2 and GPT-3 models. OpenAI says the partnership complements the work the two companies are already doing with other partners including Microsoft, Oracle, SoftBank, and Stargate. Mr. Brockman says AI is growing faster than any product in history, with 700 million weekly active users already using ChatGPT and demand continuing to grow. "We're able to create new breakthroughs, new models, to be able to actually solve problems like cures for diseases and to actually empower every individual user and business because we'll be able to reach the next level of scale," Mr. Brockman says. The companies say they are still finalizing the details of the expanded partnership and that it will take weeks to fully flesh out. Nvidia's stock was up about 3 percent after the announcement.
[81]
Nvidia to invest $100 billion in OpenAI to power next-generation AI, NVDA stock hits record high
Nvidia is investing up to $100 billion in OpenAI to build huge AI data centers and power next-generation AI models. The partnership will help OpenAI grow its AI systems and reach more users worldwide. Using Nvidia's advanced chips, OpenAI aims to create new AI technology and expand its role in the global AI economy. Nvidia will invest up to $100 billion in OpenAI to help build huge AI infrastructure. Nvidia is the world's most valuable company and makes chips that power AI. OpenAI is the company behind ChatGPT, a leading AI tool. The deal will provide funds in stages, so OpenAI can build and run at least 10 gigawatts of AI data centers using Nvidia systems. The first phase of this AI infrastructure will start in the second half of 2026 using Nvidia's next-generation Vera Rubin chips, according to the OpenAI Press Release. Nvidia stock (NVDA) rose 3.9% to $183.52 on Monday after the announcement of its $100 billion investment in OpenAI. Later, NVDA shares rose as much as 4.5% to $184.55, surpassing their previous intraday record of $184.48 on August 12. Nvidia called this a strategic partnership, which will allow OpenAI to train and run next-generation AI models and work toward superintelligence. Nvidia CEO Jensen Huang said, "Nvidia and OpenAI have pushed each other for a decade, from the first DGX supercomputer to ChatGPT. This investment marks the next leap forward -- deploying 10 gigawatts to power the next era of intelligence." OpenAI CEO Sam Altman said, "Everything starts with compute. Compute infrastructure will be the basis for the economy of the future. We will use Nvidia systems to create new AI breakthroughs and help people and businesses at scale", as stated in the OpenAI Press Release. OpenAI President Greg Brockman said they have used Nvidia's platform to create AI systems used by hundreds of millions of people, and deploying 10 gigawatts of compute will push the frontier of intelligence. OpenAI will make Nvidia its preferred strategic compute and networking partner for future AI factory growth. Nvidia and OpenAI will co-optimize their roadmaps, meaning OpenAI's AI models and software will work efficiently with Nvidia's hardware and software. This partnership complements OpenAI's work with other collaborators, including Microsoft, Oracle, SoftBank, and Stargate. OpenAI now has over 700 million weekly active users and is widely used by global enterprises, small businesses, and developers. The companies plan to finalize the details of the partnership in the coming weeks. The investment and partnership aim to power the next generation of AI, strengthen OpenAI's mission to develop AI that benefits humanity, and expand Nvidia's influence in the AI economy, as mentioned in the OpenAI Press Release. Q1. How much is Nvidia investing in OpenAI? Nvidia is investing up to $100 billion to help OpenAI build huge AI infrastructure. Q2. What will OpenAI do with Nvidia's AI systems? OpenAI will use Nvidia's systems to run 10 gigawatts of AI data centers and train next-generation AI models.
[82]
Nvidia Gives. Nvidia Gets. Nvidia Grows -- Jensen's $100B Brainwave - NVIDIA (NASDAQ:NVDA)
Nvidia Corp NVDA just flipped the script on corporate strategy: it gives customers money to buy its chips and, in return, secures its own growth. The $100 billion planned investment in OpenAI isn't charity; it's a masterstroke in demand engineering. By funding AI data centers stacked with millions of Nvidia GPUs, the company ensures its chips remain the backbone of the AI boom, while Wall Street applauds the genius of CEO Jensen Huang's latest play. The plan? Build and deploy AI data centers stacked with "millions" of Nvidia GPUs, funded, in part, by Nvidia itself.

Round-Trip Capital, But Make It $100 Billion
If this feels circular, that's because it is. Nvidia has already written smaller checks to AI cloud firms like CoreWeave and Lambda earlier this year, helping them scale capacity while ensuring those servers were powered by its silicon. This time, the stakes are enormous: OpenAI alone represents a massive guaranteed buyer for Nvidia's GPUs. It's the ultimate circular strategy -- funding customers to buy from itself -- scaled up to blockbuster proportions. By underwriting demand, Nvidia makes sure its chips remain the industry default, even as Alphabet's Google and startups like Groq nibble at its dominance with in-house designs.

Managing The Billions, Flexing The Cash
The sheer size of the commitment raised eyebrows: Nvidia had $57 billion in cash at the end of July. However, with analysts projecting $97 billion in free cash flow for the current fiscal year -- a 60% increase -- the company has plenty of financial firepower. Unlike the $500 billion Stargate venture OpenAI floated earlier this year, this plan appears less like vaporware and more like strategic insurance for both parties: OpenAI secures a guaranteed supply and financing, while Nvidia establishes a moat around its chip business.

Investor Takeaway
Call it round-trip capital or strategic foresight -- either way, Nvidia is turning cash into guaranteed demand and market dominance. OpenAI gets financing for its AI ambitions, and Nvidia ensures its hardware is front and center in the fastest-growing segment of computing. For investors, the math is simple: this isn't just a PR headline. It's a $100 billion pipeline for revenue, growth, and a long-term competitive moat.
[83]
Nvidia Just Announced a Record $100 Billion Deal With OpenAI -- Here's What It Means for Investors | The Motley Fool
Nvidia makes another aggressive move to control the AI market. Nvidia (NVDA -0.73%) is no stranger to investing in its customers. The company has put billions to work to expand the artificial intelligence (AI) ecosystem, aiming for more growth and investment from its core growth market. The company's latest deal with OpenAI -- the maker of ChatGPT -- is a prime example of this strategy. The first thing to understand about this deal is that it is simply a letter of intent. That means the partnership is non-binding, with no legal obligation for either of the companies to follow through on the deal framework discussed below. Even if the deal is non-binding, however, the spirit of the partnership is clear: Nvidia and OpenAI will be working closely together to enable each other's businesses. Next, let's discuss the figures you may have seen in the headlines. Nvidia, for example, has pledged to invest $100 billion into OpenAI. The details, however, paint a slightly different picture than the headlines. What the deal essentially outlines is OpenAI's intention to purchase Nvidia hardware for a massive, multiyear infrastructure buildout. According to a press release, OpenAI intends to "build and deploy at least 10 gigawatts of AI data centers with NVIDIA systems representing millions of GPUs for OpenAI's next-generation AI infrastructure." In return, Nvidia will invest in OpenAI equity in tranches, with each funding tranche being initiated as the infrastructure gradually expands. OpenAI gets two things from this partnership. First, it gets funding in the form of direct cash for equity. Second, it gets preferential treatment from Nvidia when it comes to technology sourcing. Nvidia's chips are in high demand, at one point facing 12-month shipping delays. OpenAI has now secured a long-term strategic advantage, gaining the ability to scale its infrastructure with the best chips on the planet, chips that the competition may not be able to source. Nvidia, meanwhile, gains an even stronger backlog. It locks in a huge customer for years to come. It also helps fund an accelerated buildout of AI infrastructure -- another long-term tailwind for its business. This is the type of deal that only Nvidia and OpenAI could pull off. Both are industry heavyweights with sizable competitive advantages. By joining forces, both companies stand to gain even more ground on the competition. Should you buy stock in Nvidia due to this deal alone? Probably not. The deal, as mentioned, is simply a signal of intent. Nothing is legally binding. Plus, the tie-up could draw the scrutiny of regulators. According to Reuters: The scale of Nvidia's latest commitment could attract antitrust scrutiny. The Justice Department and Federal Trade Commission reached a deal in mid-2024 that cleared the way for potential probes into the roles of Microsoft, OpenAI and Nvidia in the AI industry. However, the Trump administration has so far taken a lighter approach to competition issues than the Biden administration. Even if there are changes to the deal due to regulators or external influences, investors should be very bullish simply about Nvidia's ability to forge such a deal. It has a huge lead on the competition when it comes to real-world chip performance, access to capital, and industry influence. By making moves like this, the company is ensuring that its dominant market shares have the possibility of continuing far into the future. 
So while shares aren't a buy simply due to the deal with OpenAI, investors should take this news as a strong positive for Nvidia's future.
[84]
Analysts react to Nvidia's $100 billion investment in OpenAI - The Economic Times
Nvidia will invest up to $100 billion in OpenAI, deploying 10 gigawatts of compute under a new strategic partnership. Analysts say the tie-up secures demand for GPUs, validates Nvidia's growth, and accelerates next-gen AI model development. Chipmaker Nvidia is set to invest up to $100 billion in ChatGPT-parent OpenAI, signing a letter of intent for a strategic partnership to deploy at least 10 gigawatts of compute, the companies said on Monday. Nvidia has used its financial clout to keep its hardware central to the buildout of artificial intelligence systems. Keeping OpenAI, which is also exploring its own chip designs, as a key customer could help the company reinforce its dominance as the industry considers rival suppliers. Here are some analyst reactions to the partnership: Matt Britzman, senior equity analyst, Hargreaves Lansdown "For Nvidia, the prize is huge - every gigawatt of AI data centre capacity is worth about $50 billion in revenue, meaning this project could be worth as much as $500 billion. "By locking in OpenAI as a strategic partner and co-optimizing hardware and software roadmaps, Nvidia is ensuring its GPUs remain the backbone of next-gen AI infrastructure. "The market is clearly big enough for multiple players, but this deal underscores that, when it comes to scale and ecosystem depth, Nvidia is still setting the pace - and raising the stakes for everyone else." Jacob Bourne, technology analyst, Emarketer "Demand for Nvidia GPUs is effectively baked into the development of frontier AI models, and deals like this should also ease concerns about lost sales in China. "It also throws cold water on the idea that rival chipmakers or in-house silicon from the Big Tech platforms are anywhere close to disrupting Nvidia's lead. "For OpenAI, it signals greater independence as it continues diversifying away from its Microsoft partnership and races to develop its next-generation models." Anshel Sag, principal analyst, Moor Insights & Strategy "I think this strengthens the partnership between the two companies that has existed since the beginning of OpenAI's existence. This also validates Nvidia's long-term growth numbers with so much volume and compute capacity, also enabling OpenAI to scale to even bigger customers." Ben Bajarin, CEO, Creative Strategies "Really the point Nvidia was making was that it's just enabling OpenAI to meet surging demand and, at this point, we know there's surging demand for Nvidia GPUs, because that's primarily what OpenAI runs on."
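Britzman's per-gigawatt revenue figure scales linearly, so the headline number is easy to verify. A minimal check using only the quantities quoted above (roughly $50 billion of revenue per gigawatt, across a 10-gigawatt project); the variable names are illustrative:

# Quick check of the analyst arithmetic quoted above.
revenue_per_gw_billion = 50   # ~$50B of revenue per gigawatt of AI data centre capacity
project_gw = 10               # size of the planned OpenAI build-out
print(f"Implied project value: ${revenue_per_gw_billion * project_gw}B")  # -> $500B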
[85]
Nvidia's $100 billion OpenAI play raises big antitrust issues
(Reuters) -The $100 billion partnership between dominant AI chipmaker Nvidia and leading artificial intelligence company OpenAI could give both companies an unfair advantage over their competitors, experts say. The move underscores the increasingly overlapping financial interests of the various tech giants developing advanced AI systems, and the potential for a dwindling number of key players to stave off smaller rivals. It "raises significant antitrust concerns," said Andre Barlow, an antitrust lawyer with Doyle, Barlow & Mazard, who also noted that the Trump administration has taken a pro-business approach to regulations, removing hurdles that would slow AI growth. And while unleashing U.S. dominance in artificial intelligence by clearing away regulations and creating incentives for growth is a top priority for President Donald Trump, a Department of Justice official said last week that spurring innovation by protecting AI competition through antitrust enforcement is also part of Trump's AI plan. "The question is whether the agencies see this investment as pro-growth or something that could slow AI growth," Barlow said. Nvidia holds more than half of the market for the GPU chips that run the data centers powering artificial intelligence models and applications, such as OpenAI's ChatGPT. That dominant market position raises concerns that Nvidia would favor OpenAI over other customers with better pricing or faster delivery times, said Rebecca Haw Allensworth, an antitrust professor at Vanderbilt Law School. "They're financially interested in each other's success. That creates an incentive for Nvidia to not sell chips to, or not sell chips on the same terms to, other competitors of OpenAI," Allensworth said. A Nvidia spokesperson said that its investment in OpenAI would not change its focus. "We will continue to make every customer a top priority, with or without any equity stake," the spokesperson said. OpenAI did not immediately respond to a request for comment. Nvidia's biggest customer base is already relatively concentrated, with the two largest buyers accounting for 23% and 16% of its revenue in the second quarter of this year, according to its financial filings, which do not name the buyers. The scope of Monday's deal - in which Nvidia would invest up to $100 billion in OpenAI, and the latter would buy millions of chips from Nvidia -- goes to show "just how expensive frontier AI has become," said Sarah Kreps, director of the Tech Policy Institute at Cornell University. "The cost of chips, data centers and power has pushed the industry toward a handful of firms able to finance projects on that scale," Kreps said. During Joe Biden's presidency, the DOJ and U.S. Federal Trade Commission were on guard against anticompetitive actions by Big Tech companies in the AI space, warning that such companies could use their existing scale to dominate the nascent field. Under Trump, both agencies have continued other cases against Big Tech companies, and DOJ antitrust division head Gail Slater said on Thursday that enforcement "must focus on preventing exclusionary conduct over the resources that are needed to build competitive AI systems and products." "The competitive dynamics of each layer of the AI stack and how they interrelate, with a particular eye towards exclusionary behavior that forecloses access to key inputs and distribution channels, are legitimate areas for antitrust inquiry," she said. (Reporting by Jody Godoy; Editing by Chris Sanders and Lisa Shumaker)
[86]
Nvidia's $100 Billion Bet On OpenAI The 'Latest Salvo' In The Big Tech Arms Race, Says Gene Munster: 'Bumps Up the Bar' For The Rest - Alphabet (NASDAQ:GOOG), Amazon.com (NASDAQ:AMZN)
Deepwater Asset Management Managing Partner Gene Munster said on Monday that Nvidia Corp.'s NVDA multi-billion-dollar commitment to OpenAI marks a pivotal moment in the AI arms race, one that escalates things to a whole new level.

Raises The Stakes For Tech Giants
On Monday, Nvidia committed to investing $100 billion into OpenAI, aimed at deploying 10 gigawatts of NVIDIA-powered systems to support OpenAI's next-generation AI infrastructure. Munster, however, believes that the story is much bigger than this, since the massive investment outlay effectively raises the stakes for every other tech giant. The analyst shared his take in a post on X. According to Munster, Nvidia investing in OpenAI effectively creates "more demand for data centers, more demand for cloud compute, more demands [for] AI inference training that basically quickens the pace that all other players are going to have to respond [to]," he said. He notes that Meta Platforms Inc. META has already signaled a 45% increase in capital expenditures over the next three years, while adding that Amazon.com Inc. AMZN, Alphabet Inc. GOOG and Microsoft Corp. MSFT have set modest targets, which he believes will change soon. "This deal bumps up the bar for the rest of Big Tech," he said, adding that the September earnings calls from those three companies could provide the "best read" on where the AI investment cycle heads next.

Oracle Shares Surge Amid Deal Clarity
Shares of Oracle Corp. ORCL surged 6.31% on Monday, outpacing NVIDIA's 3.97% gain. According to Munster, the rally reflects renewed clarity around Oracle's massive $300 billion order contract with OpenAI, which had been questioned and dismissed as overstated last week. "The substance probability of [the] big contract announced a couple weeks ago from Oracle increased," Munster said, with NVIDIA set to cover at least one-third of this compute power that OpenAI seeks. Shares of NVIDIA were up 3.97% on Monday, closing at $183.61, following the announcement of the deal, but are down 0.23% overnight.
[87]
Nvidia's $100 Billion OpenAI Pact Buys Time in the Custom Chip Race | The Motley Fool
The AI infrastructure deal locks in near-term dominance while both companies hedge their bets. Wall Street's AI infrastructure story just got its biggest validation yet. Nvidia (NVDA -2.79%) and OpenAI's strategic partnership, which will deploy at least 10 gigawatts of computing power starting with the Vera Rubin platform in late 2026, turns lofty artificial intelligence (AI) buildout talk into staged deployments with tangible financial implications. Here's what investors need to know right now. The timing reveals OpenAI's pragmatic calculus. Yes, the company is developing custom chips with Broadcom for 2026 deployment. But replicating Nvidia's full stack -- CUDA software, NVLink networking, and years of optimization -- would delay scaling when every month matters in the AI race. This isn't about abandoning custom silicon. It's about securing proven capacity while keeping long-term options open. OpenAI gets immediate scale; Nvidia gets multi-year revenue visibility. This deal slots neatly into January's $500 billion Stargate blueprint -- a four-year U.S. infrastructure buildout anchored by an initial $100 billion tranche. What many dismissed as overreach now carries more weight with Nvidia attaching capital and hardware. The structure is phased: Nvidia invests as capacity comes online, echoing Microsoft's staged OpenAI funding. It's a reciprocal arrangement rather than a closed loop, though the scale ensures regulators will give it a close look. Advanced Micro Devices now faces a higher bar for OpenAI-scale wins, but MI300-class deployments at other hyperscalers remain very much in play. The upcoming MI350 and a broader Xilinx software stack underscore that AMD isn't standing still. At the same time, Nvidia has bolstered its ecosystem with a $5 billion Intel stake and a custom x86 CPU partnership -- a move that not only deepens its platform but also provides foundry optionality beyond Taiwan Semiconductor. For hyperscalers like Alphabet, Amazon, and Microsoft, the takeaway is clear: their most visible AI peer is tied to Nvidia through 2028. Yet each tech giant will continue investing in in-house silicon to control costs and secure strategic independence. At about 26 times projected fiscal 2028 earnings, Nvidia trades at a premium -- but one justified by its growth profile. Street forecasts already had revenue climbing toward $257 billion in fiscal 2027, numbers issued before this OpenAI pact that could raise the ceiling further. Even so, not everyone views the deal as transformative. Morningstar, for example, kept its fair value estimate at $190 per share, underscoring both execution risk and the stock's already lofty expectations. With shares up 58% over the past year to $182 (as of Sept. 22, 2025), that implies modest upside -- hardly the setup for explosive gains. The limiting factor is no longer chips -- it's electricity. U.S. data centers drew about 4.4% of the nation's power in 2023, and Department of Energy projections put that share as high as 12% by 2028. Each new gigawatt of capacity is roughly equivalent to adding a nuclear reactor. That reality is pushing tech companies to explore small modular reactors and other nontraditional sources. Intel's foundry partnership broadens supply chain geography, but grid capacity remains the choke point. The OpenAI pact locks Nvidia into the next wave of training cycles and reinforces its ecosystem lead. It doesn't end the custom silicon race, but it gives both sides time to advance their strategies. 
For investors, the message is simple: Nvidia is committing up to $100 billion in staged capital and hardware to OpenAI's buildout -- an investment that ultimately flows back into its own ecosystem. The stock isn't cheap, but the premium reflects a level of certainty few other tech growth stories can match.
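The power figures in this piece invite some quick arithmetic. The sketch below assumes total U.S. electricity consumption of roughly 4,000 TWh per year, an outside estimate used only for scale and not a figure from the article; the 4.4% and 12% shares and the one-reactor-per-gigawatt equivalence are the numbers cited above.

# Rough sketch of what the cited data-center power shares imply in absolute terms.
# Assumes ~4,000 TWh/yr of total U.S. electricity consumption (an illustrative outside
# estimate, not a figure from the article).

HOURS_PER_YEAR = 8760
us_total_twh = 4000
us_avg_power_gw = us_total_twh * 1000 / HOURS_PER_YEAR   # ~457 GW of average draw

for share in (0.044, 0.12):   # 2023 share vs. the DOE's high-end 2028 projection
    print(f"{share:.1%} of U.S. power is roughly {share * us_avg_power_gw:.0f} GW of average demand")

print("The 10 GW OpenAI build-out is roughly 10 reactor-equivalents at ~1 GW per reactor")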
[88]
More questions than answers in Nvidia's $100 billion OpenAI deal
SAN FRANCISCO (Reuters) -Nvidia's move to invest up to $100 billion into OpenAI at the same time it plans to supply millions of its market-leading artificial intelligence chips to the ChatGPT creator has little precedent in the tech industry. Under the deal, Nvidia will be taking a financial stake in one of its largest customers, but without receiving any voting power in return, according to a person close to OpenAI. The ChatGPT maker will receive some - but not nearly all - of the capital it needs for its ambitious plans to build the sprawling supercomputers required to develop new generations of AI. Nvidia's initial $10 billion investment would go toward a gigawatt of capacity using its next-generation Vera Rubin chips, with a build-out starting in the second half of 2026. The deal raises many questions. Here are five of the biggest ones: WHERE DOES THE REST OF THE MONEY COME FROM? In an earnings call in August, Nvidia CEO Jensen Huang said that AI data centers cost about $50 billion per gigawatt of capacity to build out, with about $35 billion of that money going toward Nvidia's chips and gear. Nvidia has committed to investing in OpenAI to help it build 10 gigawatts of data center capacity, or about $10 billion per gigawatt. That leaves about $40 billion in additional capital required for each gigawatt of capacity OpenAI plans to build. OpenAI has not signaled whether it agrees with Huang's cost estimates or, if it does, where it would procure the additional funds. OpenAI did not return a request for comment about its funding plans. Nvidia declined to comment beyond what it has said publicly. WHAT DOES IT MEAN FOR OPENAI'S EFFORTS TO BECOME A FOR-PROFIT? OpenAI is a non-profit corporation, a structure that dates to its days as an AI research group. It has been looking to change to a more conventional structure that would allow it to more easily raise money and hold a public offering. OpenAI has held extensive discussions with Microsoft, a major shareholder that funded OpenAI's early computing needs, to change its structure. Earlier this month, the two firms said they had reached a tentative deal on OpenAI converting to a for-profit public benefit corporation that would be overseen by OpenAI's existing non-profit, though that move still needs approval from state officials in Delaware and California. On Monday, a person familiar with the matter told Reuters that Nvidia would be making a cash investment into OpenAI similar to other OpenAI investors. Moreover, Nvidia's initial $10 billion investment will not begin until OpenAI and Nvidia reach a definitive agreement in the coming months. It was not immediately clear whether Nvidia planned to invest in OpenAI's non-profit entity or whether its plans depend on OpenAI's conversion to a public benefit corporation overseen by a non-profit. WHAT DOES IT MEAN FOR OPENAI'S VALUATION? OpenAI is currently valued at $500 billion, and a person familiar with the matter told Reuters that Nvidia's initial $10 billion investment for one gigawatt of capacity would be at that valuation. But neither Nvidia nor OpenAI gave a timeframe for the entire 10 gigawatts of capacity coming online or for the $100 billion of investment to take place. Also unanswered is whether subsequent Nvidia investments in OpenAI would take place at OpenAI's current valuation, or at the valuation of the company at the time Nvidia makes each investment. WHAT DOES IT MEAN FOR COMPETITION? 
The deal between Nvidia and OpenAI could see Nvidia earmarking a significant number of its chips, which remain in high demand several years into the AI boom and access to which can determine success or failure in the field, to a single customer in which it is also a shareholder. An important question is whether OpenAI's rivals such as Anthropic, or even Microsoft, which competes with OpenAI to sell AI technology to businesses, will retain access to Nvidia's chips. The deal also raises questions about whether AMD, which is aiming to compete with Nvidia in selling chips to OpenAI and others, will have a viable chance of selling chips to AI companies. WHAT DOES IT MEAN FOR ORACLE? Oracle said earlier this month that it has signed hundreds of billions of dollars in contracts to provide cloud computing services to OpenAI and a handful of other large customers, which sent its stock soaring and made co-founder Larry Ellison one of the world's richest people. But one of the key questions lingering after that forecast - and a question that debt-rating firm Moody's raised - is whether OpenAI has the cash to pay for the contracts. On Monday, shortly before Nvidia's announcement, Oracle re-affirmed its forecast as it named two new CEOs. It is possible that Nvidia's investment plans could put Oracle's revenue forecast on a firmer footing because a key customer, OpenAI, has fresh capital commitments. (Reporting by Stephen Nellis in San Francisco; Editing by Muralikumar Anantharaman)
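The cost arithmetic behind the first question above is worth laying out explicitly. The per-gigawatt figures below come from Huang's comments as reported in this piece; the script is simply an illustrative restatement of that math, not a projection from either company.

# Restating the funding-gap arithmetic reported above (all figures per gigawatt, in $ billions).
cost_per_gw = 50             # Huang's estimate of the total build-out cost per GW
nvidia_hardware_per_gw = 35  # portion of that cost going to Nvidia chips and gear
nvidia_invest_per_gw = 10    # Nvidia's committed investment per GW ($100B across 10 GW)

gap_per_gw = cost_per_gw - nvidia_invest_per_gw   # capital OpenAI must raise elsewhere
total_gap = gap_per_gw * 10                       # across the full 10 GW plan

print(f"Of the ${cost_per_gw}B per GW, ~${nvidia_hardware_per_gw}B flows to Nvidia hardware")
print(f"Funding gap per GW after Nvidia's ${nvidia_invest_per_gw}B: ${gap_per_gw}B")   # $40B
print(f"Funding gap across 10 GW: ${total_gap}B")                                      # $400B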
[89]
Nvidia to invest $100 billion in OpenAI for data centre development - The Economic Times
Chipmaker Nvidia plans to invest up to $100 billion in artificial intelligence startup OpenAI under a new agreement, the companies said on Monday. The partnership comes as competition intensifies among technology giants and startups to secure access to energy and chips needed for AI growth. The companies unveiled a letter of intent for a landmark strategic partnership to deploy at least 10 gigawatts of Nvidia chips for OpenAI's AI infrastructure. The companies will finalize partnership details in the coming weeks, with the first deployment phase targeted to come online in the second half of 2026. The investment comes days after Nvidia committed $5 billion to struggling chipmaker Intel.
[90]
Nvidia Stock Climbs On $100 Billion Deal To Supercharge OpenAI - NVIDIA (NASDAQ:NVDA)
Nvidia Corp. NVDA and OpenAI on Monday announced a new deal to deploy at least 10 gigawatts of Nvidia-powered systems to support OpenAI's next-generation AI infrastructure. NVDA stock climbed on the news. Nvidia has pledged to invest up to $100 billion in the AI infrastructure rollout, with the first phase expected to launch in the second half of 2026. OpenAI CEO Sam Altman said the infrastructure buildout is critical to meeting the AI company's future goals. "There are three things that OpenAI has to do well: we have to do great AI research, we have to make these products people want to use, and we have to figure out how to do this unprecedented infrastructure challenge," Altman told CNBC. "You should expect a lot from us in the coming months," he added. Nvidia CEO Jensen Huang elaborated on just how large the deal is by explaining that 10 gigawatts of infrastructure is equivalent to between 4 million and 5 million GPUs -- the same amount Nvidia expects to ship in 2025 and about double its output from last year. "This is a giant project," Huang said. He also told CNBC that the newly announced deal is "additive" to all other projects and deals, including the Stargate project and previous commitments with OpenAI. Hargreaves Lansdown analyst Matt Britzman weighed in. "For Nvidia, the prize is huge -- every gigawatt of AI data centre capacity is worth about $50 billion in revenue, meaning this project could be worth as much as $500 billion," Britzman said, per Reuters. NVDA Price Action: Nvidia stock climbed to nearly 52-week highs after the announcement and ended Monday's session up 3.93% at $183.61, according to Benzinga Pro.
[91]
Nvidia's Partnership With OpenAI Could Become the Biggest Profit Engine in AI History | The Motley Fool
The artificial intelligence (AI) revolution already has been a boon to Nvidia (NVDA -2.79%), supercharging the company's revenue growth and stock performance. Thanks to Nvidia's dominance in the AI chip market, it's seen sales climb in the double and triple digits -- and last year delivered a record level of revenue at more than $130 billion. As for the stock, it's gained more than 1,300% over the past five years. And as this growth story has unfolded, Nvidia expanded into a variety of related AI products and services, from enterprise software to networking, cementing its position as the "go to" place for AI. Still, the company faces rivals in the market, from other chip designers and even from customers that have started making some of their own chips. That's prompted some investors to wonder whether Nvidia will be able to ensure its position over time and keep its pricing power -- its chips are the most powerful on the market and also the most costly. But Nvidia, in this one recent move, has shown its strength as the central player in the AI market. The company aims to invest as much as $100 billion in OpenAI amid a buildout of the lab's data centers. This partnership could become the biggest profit engine in AI history -- and secure Nvidia's AI empire. Let's check it out. First, a little background on both Nvidia and OpenAI. Nvidia, as mentioned, is the world's No. 1 AI chip designer, offering the top performing graphics processing units (GPUs) to power crucial AI tasks. Among its biggest customers are leading tech and AI players, from Microsoft to Meta Platforms, that aim to use the best possible tools to power their AI projects. OpenAI also is a Nvidia customer, and like Nvidia, is central to the AI story. That's because OpenAI is the research lab that developed and owns popular chatbot ChatGPT -- today OpenAI reaches more than 700 million weekly active users. This AI giant isn't publicly traded but has significant partnerships in the tech world -- for example, earlier this year it launched the Stargate Project to build U.S. AI infrastructure with Oracle and SoftBank -- and Microsoft has invested about $13 billion in OpenAI. Moving on to the latest news, Nvidia this week said it will invest as much as $100 billion in OpenAI's buildout, an effort that will involve the deployment of an enormous amount of Nvidia computing power. The Nvidia platform will use 10 gigawatts of power -- that's the equivalent of 4 million to 5 million GPUs, Nvidia chief Jensen Huang told CNBC, calling the deal a "giant project." The investment will be made in installments, with the first $10 billion to come once the first gigawatt is complete, CNBC also reported, citing a person familiar with the project. This Nvidia partnership with OpenAI is an enormous win for both of these AI players. For OpenAI, it ensures the funding needed to build out infrastructure, and therefore continue to advance its research, increase its user base, and generate revenue growth. For Nvidia, this cements its position as a key OpenAI partner, with the research lab receiving funding from Nvidia, then going on to buy GPUs from the chip giant. As this project advances, involving two of the biggest AI names in the world, this relationship should supercharge revenue at both OpenAI and Nvidia. An additional benefit for Nvidia is this move strengthens its market position, showing investors who have worried about the company's future growth that it isn't likely to be unseated by rivals or forced to lower the prices of its GPUs to keep up.
Finally, it's important to consider this latest partnership in the context of the full picture. Nvidia recently made another important move, investing in Intel to secure that company's top central processing units (CPUs) for Nvidia platforms and to offer Nvidia GPU chiplets for Intel PC systems. This strengthens Nvidia's current offerings and broadens its customer base. These decisions are putting this dominant AI company on track to maintain its leadership and significantly increase revenue in the upcoming phases of the AI story. And the latest deal with OpenAI, involving enormous future orders for Nvidia GPUs to power this data center empire, is on its way to becoming the biggest profit driver in the history of AI.
[92]
Nvidia invests $100bn in OpenAI for a massive infrastructure rollout
On Monday, Nvidia announced an investment of up to $100bn in OpenAI to fund the construction of large-scale data centers powered by its own chips. The project involves the installation of Nvidia systems with a total capacity of 10 gigawatts, or between 4 and 5 million GPUs, equivalent to the group's entire annual shipments. The investment will be made gradually, with Nvidia becoming OpenAI's preferred supplier for its processors and network equipment. This alliance illustrates the interdependence between the companies, which is behind the current explosion in artificial intelligence. ChatGPT, launched at the end of 2022, had already triggered a spectacular surge in demand for Nvidia GPUs, which are now essential for both training and deploying models. Sam Altman summarized OpenAI's priorities: maintaining high-level research, developing products that are adopted by the general public, and meeting the infrastructure challenge posed by this unprecedented deployment. The deal reinforces Nvidia's aggressive strategy, already active throughout the AI ecosystem with stakes in Intel, Nscale, and Enfabrica. Nvidia's stock rose nearly 4% on Monday, valuing the group at around $4.5 trillion. This partnership complements OpenAI's existing collaborations with Microsoft, Oracle, SoftBank, and the Stargate project. According to analysts, it creates a "virtuous circle," with the money invested by Nvidia ultimately returning to its own systems, and confirms that infrastructure is the new strategic frontier for artificial intelligence.
[93]
What Is Going On With Nvidia Stock On Monday? - NVIDIA (NASDAQ:NVDA)
Nvidia NVDA stock gained on Monday after it announced that it has moved to expand its decade-long collaboration with OpenAI by signing a letter of intent for a strategic partnership. The companies plan to deploy at least 10 gigawatts of Nvidia systems to build OpenAI's next-generation AI infrastructure, a leap toward training and running models on the path to superintelligence. NVIDIA is committed to investing up to $100 billion in OpenAI as these systems roll out, with the first phase likely to go live in the second half of 2026 using the new Nvidia Vera Rubin platform. Nvidia traded close to its 52-week high of $184.48. Under the agreement, OpenAI named Nvidia its preferred strategic compute and networking partner for AI factory growth. Both companies will co-optimize their roadmaps, aligning OpenAI's model and infrastructure software with Nvidia's hardware and software. This deal complements Nvidia and OpenAI's existing collaborations with Microsoft MSFT, Oracle ORCL, SoftBank SFTBY, and Stargate partners to build the world's most advanced AI infrastructure. Nvidia's equity investment in OpenAI does not confer control rights. OpenAI's non-profit parent retains majority governance control. The timing highlights OpenAI's rapid growth, with more than 700 million weekly active users and widespread adoption by enterprises and developers. Nvidia and OpenAI expect to finalize partnership details in the coming weeks. OpenAI chief Sam Altman told CNBC Live that building this infrastructure is critical to everything it wants to do. Nvidia CEO Jensen Huang told CNBC that this project is additive to everything announced and contracted. Nvidia's stock has gained over 36% year-to-date, topping the Nasdaq 100 index's 18% returns, as Big Tech giants keep splurging on their AI ambitions, bearing testimony to the continued AI frenzy. Earlier in September, OpenAI shared plans to begin mass production of its first proprietary AI chip next year by partnering with Nvidia rival Broadcom AVGO. Broadcom CEO Hock Tan confirmed the deal on its earnings call, disclosing that it secured $10 billion in orders from OpenAI, its fourth major customer win. The partnership, which started in 2024, will allow OpenAI to use the chips internally to support ChatGPT's 700 million weekly users and upcoming models like GPT-5. The move mirrors strategies by Alphabet GOOGL, Amazon.com AMZN, and Meta Platforms META, which have all developed custom processors to optimize AI workloads. Price Action: At last check on Monday, NVDA stock was trading higher by 3.65% to $183.11.
[94]
Nvidia to invest $100 billion in OpenAI, linking up two AI titans
STORY: Nvidia plans to invest $100 billion into ChatGPT-maker OpenAI, and supply it with data center chips. The companies announced the move Monday, marking a tie-up between two leaders in artificial intelligence. A source close to OpenAI says the deal will involve two separate but intertwined transactions: Nvidia will invest in OpenAI for non-controlling shares, and the start-up can then use the cash to buy chips from Nvidia. The chipmaker will invest an initial $10 billion, and then start delivering hardware as soon as late 2026. On Monday, the two companies signed a letter of intent to deploy at least 10 gigawatts of Nvidia systems for OpenAI. They aim to finalize partnership details in the coming weeks. Nvidia shares closed around 4% higher after the announcement. OpenAI has been working to build its own AI chips as a cheaper alternative to Nvidia. But a person familiar with the matter said its deal with Nvidia doesn't change those plans... ...or its partnership with Microsoft, which has invested billions in OpenAI since 2019. Nvidia's investment comes days after it committed $5 billion to struggling chipmaker Intel. Analysts say the scale of the new OpenAI deal means it could attract the attention of regulators, though the Trump administration has so far taken a relatively hands-off approach to competition concerns.
[95]
Why Nvidia Stock Popped Monday | The Motley Fool
Investors loved what they heard from Nvidia and OpenAI today. Nvidia (NVDA 3.70%) has seen explosive growth for its advanced semiconductor chips. The artificial intelligence (AI) leader made a major announcement today that will supplement that growth. Nvidia and AI research organization and ChatGPT developer OpenAI have created a new strategic partnership that could be a win-win for both companies. That helped Nvidia's stock surge as much as 5.3%. As of 2:10 p.m. ET, shares were 3.4% higher than Friday's closing price. Nvidia said it will invest as much as $100 billion in OpenAI. But in reality, that money will turn right around to become revenue for Nvidia. The companies said the $100 billion will be used by OpenAI to establish and deploy data centers for OpenAI's next-generation AI infrastructure. The AI data centers will run at least 10 gigawatts (GW) of compute power utilizing millions of Nvidia graphics processing unit (GPU) systems. The systems will train and run OpenAI's next generation of models. Nvidia CEO Jensen Huang called the deal "monumental in size." Huang said the 10 GW is equivalent to between 4 million and 5 million GPUs. That is the total volume of what the company will ship in all of this year. Huang also added that the investment is "additive to everything that's been announced and contracted." That's great news for Nvidia investors. The majority of the data center costs will be used for Nvidia chips and systems. This deal alone could help Nvidia continue its astonishing rate of growth. Investors are recognizing that and adding Nvidia shares today.
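Huang's equivalence between 10 gigawatts and 4 million to 5 million GPUs also implies an average power budget per GPU, which is easy to back out. The sketch below uses only the figures quoted in the article; the per-GPU result is an inference that would include cooling and other facility overhead, not a specification from either company.

# Backing out the implied power budget per GPU from the figures quoted above.
capacity_watts = 10e9                    # 10 gigawatts of planned capacity
for gpu_count in (4e6, 5e6):             # Huang's 4-5 million GPU equivalence
    per_gpu_kw = capacity_watts / gpu_count / 1000
    print(f"{gpu_count / 1e6:.0f}M GPUs -> ~{per_gpu_kw:.1f} kW per GPU, including facility overhead")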
[96]
OpenAI and NVIDIA Announce Strategic Partnership to Deploy 10 Gigawatts of NVIDIA Systems
OpenAI and NVIDIA announced a letter of intent for a landmark strategic partnership to deploy at least 10 gigawatts of NVIDIA systems for OpenAI's next-generation AI infrastructure to train and run its next generation of models on the path to deploying superintelligence. To support this deployment, including data center and power capacity, NVIDIA intends to invest up to $100 billion in OpenAI as the new NVIDIA systems are deployed. The first phase is targeted to come online in the second half of 2026 using the NVIDIA Vera Rubin platform. OpenAI will work with NVIDIA as a preferred strategic compute and networking partner for its AI factory growth plans. OpenAI and NVIDIA will work together to co-optimize their roadmaps for OpenAI's model and infrastructure software and NVIDIA's hardware and software. This partnership complements the deep work OpenAI and NVIDIA are already doing with a broad network of collaborators, including Microsoft, Oracle, SoftBank and Stargate partners, focused on building the world's most advanced AI infrastructure. OpenAI has grown to over 700 million weekly active users and strong adoption across global enterprises, small businesses and developers. This partnership will help OpenAI advance its mission to build artificial general intelligence that benefits all of humanity. NVIDIA and OpenAI look forward to finalizing the details of this new phase of strategic partnership in the coming weeks.
[97]
Nvidia Stock On Track for Record Close on Expansive Partnership With OpenAI
Shares of Nvidia rose after the chipmaker said it plans to invest up to $100 billion into OpenAI as part of an expansive new partnership. The stock edged up 4% to $183.76 Monday, on track for a new all-time closing high. Shares are up 37% this year. The companies said Monday that OpenAI will build and deploy at least 10 gigawatts of Nvidia systems for its artificial intelligence data centers to train and run its next generation of models. Nvidia will make its investment in OpenAI progressively as each gigawatt is deployed to support data center and power capacity, the companies said. The first phase of the partnership is targeted to come online in the second half of 2026 using Nvidia's Vera Rubin platform. The companies said they plan to finalize the details of their new partnership in the coming weeks. The partnership comes after Nvidia said last week it would invest $5 billion in chipmaker Intel. Under the terms of the deal, Intel will design custom central processing units that will be easily integrated with Nvidia's chips and other equipment in both data centers and personal computers.
[98]
Analysts react to Nvidia's $100 billion investment in OpenAI
(Reuters) -Chipmaker Nvidia is set to invest up to $100 billion in ChatGPT-parent OpenAI, signing a letter of intent for a strategic partnership to deploy at least 10 gigawatts of compute, the companies said on Monday. Nvidia has used its financial clout to keep its hardware central to the buildout of artificial intelligence systems. Keeping OpenAI, which is also exploring its own chip designs, as a key customer could help the company reinforce its dominance as the industry considers rival suppliers. Here are some analyst reactions to the partnership: MATT BRITZMAN, SENIOR EQUITY ANALYST, HARGREAVES LANSDOWN "For Nvidia, the prize is huge -- every gigawatt of AI data centre capacity is worth about $50 billion in revenue, meaning this project could be worth as much as $500 billion. "By locking in OpenAI as a strategic partner and co-optimizing hardware and software roadmaps, Nvidia is ensuring its GPUs remain the backbone of next-gen AI infrastructure. "The market is clearly big enough for multiple players, but this deal underscores that, when it comes to scale and ecosystem depth, Nvidia is still setting the pace -- and raising the stakes for everyone else." JACOB BOURNE, TECHNOLOGY ANALYST, EMARKETER "Demand for Nvidia GPUs is effectively baked into the development of frontier AI models, and deals like this should also ease concerns about lost sales in China. "It also throws cold water on the idea that rival chipmakers or in-house silicon from the Big Tech platforms are anywhere close to disrupting Nvidia's lead. "For OpenAI, it signals greater independence as it continues diversifying away from its Microsoft partnership and races to develop its next-generation models." ANSHEL SAG, PRINCIPAL ANALYST, MOOR INSIGHTS & STRATEGY "I think this strengthens the partnership between the two companies that has existed since the beginning of OpenAI's existence. This also validates Nvidia's long-term growth numbers with so much volume and compute capacity, also enabling OpenAI to scale to even bigger customers." BEN BAJARIN, CEO OF TECHNOLOGY CONSULTING FIRM CREATIVE STRATEGIES "Really the point Nvidia was making was that it's just enabling OpenAI to meet surging demand and, at this point, we know there's surging demand for Nvidia GPUs, because that's primarily what OpenAI runs on." (Reporting by Juby Babu in Mexico City, Kritika Lamba and Arsheeya Bajwa in Bengaluru; Editing by Alan Barona)
[99]
Nvidia to invest $100 billion in OpenAI as AI datacenter competition intensifies
(Reuters) -Chipmaker Nvidia will invest up to $100 billion in OpenAI and provide it with data center chips, the companies said on Monday, a tie-up between two of the highest-profile leaders in the global artificial intelligence race. The deal, which will see Nvidia start delivering chips as soon as late 2026, will involve two separate but intertwined transactions, according to a person close to OpenAI. The startup will pay Nvidia in cash for chips, and Nvidia will invest in OpenAI for non-controlling shares, the person said. The first $10 billion of Nvidia's investment in OpenAI, which was most recently valued at $500 billion, will begin when the two companies reach a definitive agreement for OpenAI to purchase Nvidia chips. Nvidia did not immediately respond to requests for clarification about the deal. The pact is among a spate of agreements between major technology players that includes years of investment in OpenAI from Microsoft and a deal last week between Nvidia and Intel to collaborate on AI chips. The two companies signed a letter of intent for a landmark strategic partnership to deploy at least 10 gigawatts of Nvidia chips for OpenAI's AI infrastructure. They aim to finalize partnership details in the coming weeks, with the first deployment phase targeted to come online in the second half of 2026. "Everything starts with compute," OpenAI CEO Sam Altman said in a statement. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." Nvidia shares were up 4.4% while shares of Oracle, which partners with OpenAI, SoftBank and Microsoft on the $500 billion Stargate AI data center project, gained nearly 5%. Nvidia's investment comes days after it committed $5 billion to struggling chipmaker Intel. OpenAI and its backer, Microsoft, also announced earlier this month that they have signed a non-binding deal for new relationship terms that would allow for OpenAI's restructuring into a for-profit company. Nvidia also backed OpenAI in a $6.6 billion funding round in October 2024. However, the world's most valuable firm making another sizeable investment in OpenAI could lead to antitrust scrutiny. The Trump administration has taken a much lighter touch on competition issues compared with former President Joe Biden's antitrust enforcers. In June 2024, the Justice Department and the Federal Trade Commission reached a deal that cleared the way for potential antitrust investigations into the dominant roles that Microsoft, OpenAI and Nvidia play in the artificial intelligence industry. (Reporting by Arsheeya Bajwa in Bengaluru and Deepa Seetharaman and Stephen Nellis in San Francisco; Editing by Tasim Zahid and Anil D'Silva)
[100]
Nvidia to invest up to $100 bn in OpenAI: Here's what they are planning
Under the partnership, Nvidia will be OpenAI's preferred partner for compute and networking. Nvidia and OpenAI have announced a major partnership that could reshape the future of artificial intelligence. The two companies plan to work together to deploy at least 10 gigawatts of Nvidia systems to power OpenAI's next-generation AI infrastructure that will be used to train and run future AI models, including those aimed at advancing toward superintelligence. To support this project, Nvidia intends to invest up to $100 billion in OpenAI as the new systems are rolled out. The first phase of this plan is expected to begin in the second half of 2026, utilising Nvidia's Vera Rubin platform. "Nvidia and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT," said Jensen Huang, founder and CEO of Nvidia. "This investment and infrastructure partnership mark the next leap forward -- deploying 10 gigawatts to power the next era of intelligence." "Everything starts with compute," said Sam Altman, co-founder and CEO of OpenAI. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale." Under the partnership, Nvidia will be OpenAI's preferred partner for compute and networking, and both companies will align their hardware and software roadmaps to optimise AI model performance and infrastructure. This collaboration will also complement ongoing work with partners like Microsoft, Oracle, SoftBank, and Stargate, all focused on building advanced AI infrastructure. OpenAI currently has over 700 million weekly active users, with strong adoption among enterprises, small businesses, and developers worldwide. With Nvidia's investment and infrastructure support, OpenAI aims to continue its mission of developing artificial general intelligence that benefits all of humanity.
Nvidia and OpenAI announce a groundbreaking partnership, with Nvidia planning to invest up to $100 billion in OpenAI's AI infrastructure. The deal aims to deploy 10 gigawatts of Nvidia systems, marking a significant leap in AI computing power.
In a move that's set to reshape the landscape of artificial intelligence, Nvidia and OpenAI have announced a monumental partnership aimed at dramatically expanding AI infrastructure. The deal, formalized through a letter of intent, outlines Nvidia's commitment to invest up to $100 billion in OpenAI as they collaborate to deploy at least 10 gigawatts of Nvidia systems for AI development.

The scale of this project is staggering, with the planned 10-gigawatt deployment equating to the power output of approximately 10 nuclear reactors. To put this into perspective, Nvidia CEO Jensen Huang revealed that the power consumption would match between 4 million and 5 million graphics processing units, doubling the company's total GPU shipments from the previous year.

The partnership will unfold in phases, with the first gigawatt of Nvidia systems scheduled to come online in the second half of 2026. This initial phase will utilize Nvidia's advanced Vera Rubin platform, named after the renowned dark matter astronomer.

For OpenAI, this partnership ensures a long-term pipeline of cutting-edge hardware, crucial for maintaining its competitive edge in AI research and development. The company, which recently surpassed 700 million weekly active users, views this collaboration as essential for pursuing artificial general intelligence (AGI) and supporting its rapidly growing user base.

Nvidia, on the other hand, solidifies its position at the heart of the AI boom. By taking a substantial stake in OpenAI, Nvidia goes beyond being a mere supplier to becoming a key player in shaping the future of AI technology.

This partnership sets a new benchmark in the AI industry, dwarfing previous investments and infrastructure commitments. It comes at a time of fierce global competition to build massive AI datacenter capacity, potentially accelerating the pace of AI advancements and reshaping the competitive landscape.

As OpenAI CEO Sam Altman stated, "Compute infrastructure will be the basis for the economy of the future." This collaboration between Nvidia and OpenAI may well be the catalyst that ushers in a new era of AI capabilities and applications, with far-reaching implications for businesses, research, and society at large.