Meet CoreWeave, the AI industry's ticking time bomb
Hello, my friends. Have you been feeling too sane lately? Have I got something for you! It is a company called CoreWeave. You may not have heard of it because it's not doing the consumer-facing part of AI. It's a data center company, the kind people talk about when they say they want to invest in the "picks and shovels" of the AI gold rush. At first glance, it looks impressive: it's selling compute, the hottest resource in the industry; it's landed a bunch of big-name customers such as Microsoft, OpenAI and Meta; and its revenue is huge -- $1.4 billion in the third quarter this year, double what it was in the third quarter of 2024. The company's share price has almost doubled since its IPO earlier this year, which was the biggest in tech since 2021. So much money!

But as I began to look more closely at the company, I began feeling like I'd accidentally stumbled on an eldritch horror. CoreWeave is saddled with massive debt and, except in the absolute best-case scenario of fast AI adoption, has no obvious path toward profitability. There are some eyebrow-raising accounting choices. And then, naturally, there are the huge insider sales of CoreWeave stock. After I unfocused my eyes a little, I realized CoreWeave did make a horrible kind of sense: It's a tool to hedge other companies' risks and juice their profits. It's taking on the risk and the costs of building data centers that bigger tech companies can then rent while they build their own data centers, which may very well wind up competing with CoreWeave. What's more, it's part of a whole stable of companies that are propping up demand for the behemoth of the AI boom: Nvidia.

I don't think CoreWeave's weaknesses are a secret. It just seems like a lot of investors are ignoring them. Whether that's because of AI FOMO or a sophisticated game of chicken -- getting as much money as possible out of CoreWeave shares before the inevitable collapse -- I can't really say.
All I can tell you is where I landed after spending several days parsing its financials, talking to analysts and other experts, and trying to understand AI infrastructure.

CoreWeave went public in March, at a share price of $40, and at its peak this year was worth $187 a share. That was back in June. Today, the shares opened at $75.51. Some of the price decline is due to CoreWeave announcing, along with its third quarter earnings, that delays on a data center mean it'll make less money this year. The delay underscored some of the difficulties CoreWeave faces in becoming a profitable business.

CoreWeave is "the poster child of the AI infrastructure bubble," wrote Kerrisdale Capital, an investment manager, in a September note announcing it was shorting the stock. "Strip away the noise, and CoreWeave remains an undifferentiated, heavily levered GPU rental scheme stitched together by timing and financial engineering, not lasting innovation." Kerrisdale thinks the fair value of CoreWeave's stock is $10.

CoreWeave, of course, does not agree with this assessment. "CoreWeave is the essential cloud for AI, a software solution, built on cutting-edge physical infrastructure and designed from the ground up to provide the most effective and efficient supercomputers to those whose workloads demand the most computing power," said Lia Davis, CoreWeave's head of global communications, in a written statement. "We are seeing strong support from existing customers, who continue to extend and expand their agreements with us."

All right. Time for the rabbit hole. I originally thought it might be fun to use CoreWeave as a glimpse at the economics of AI infrastructure. It's recently public, and it doesn't have a second business line, so whatever's going on there might give me a direct sense of what to expect out of the data center boom. I was not expecting what I found.

Let's start with some very recent history. CoreWeave is a data center company that pivoted from crypto in 2022.
(In 2021, CoreWeave made its money by... mining Ethereum.) Essentially, CoreWeave is a landlord for compute: companies pay to use its server racks for AI projects.

CoreWeave first came to my attention because it innovated in something that surprised me: using GPUs as collateral for $2.3 billion in loans, at an effective interest rate of 15 percent in the last quarter, according to the company's most recent quarterly filing. It turns out this was a pioneering move -- companies such as Crusoe, Fluidstack, and Lambda have also taken out similar loans. Even CoreWeave itself took out a second loan, for $7.5 billion, using more GPUs as collateral. This time, the company got better terms -- only 10 percent interest, as of the third quarter. It also amended the loan to draw another $400 million -- getting an effective 9 percent rate in the process. The third loan was even better: 9 percent interest in the third quarter. (All three loans have floating interest rates.)

The chips bring me to the main thing: CoreWeave simply isn't possible without Nvidia. The company said it owned more than 250,000 Nvidia chips, the infrastructure necessary to run AI models, in documents CoreWeave filed for its initial public offering. It also said it only had Nvidia chips. On top of that, Nvidia is a major investor in CoreWeave, and owned about $4 billion worth of shares as of August. Nvidia made the March IPO possible, according to CNBC: when there was lackluster demand for CoreWeave's shares, Nvidia swooped in and bought shares. Also, Nvidia has promised to buy any excess capacity that CoreWeave customers don't use.

"These investments are not circular; they are complementary," CoreWeave's Davis wrote. "This is about an entire ecosystem all rowing in the same direction to accelerate the AI economy. There's nothing circular about it. Rather, these partnerships are about accelerating innovation and adoption.
We are, collectively, defining the next-generation operating system for civilization."

Okay, well, first of all: I notice that AI so far can't so much as turn on the lights -- and remember, when it comes to future claims, everything is vaporware until it ships. But CoreWeave isn't alone in its big bets on AI. So let's talk about the customers who are defining the "next-generation operating system for civilization." They're big names, sure, and they're paying a lot of money. But I'm not sure they're reliable long-term clients -- in fact, they're more likely potential rivals.

CoreWeave's biggest buyer is Microsoft, which accounted for 71 percent of CoreWeave's revenue in the second quarter and 67 percent in the third. That makes CoreWeave more dependent on Microsoft than it was in 2024, when Microsoft accounted for 62 percent of revenue. But it kinda seems like Microsoft may have soured on CoreWeave. In March, the Financial Times reported that Microsoft "has walked away from some of its commitments" with CoreWeave "over delivery issues and missed deadlines." (CoreWeave disputed the reporting.) Around the same time, Microsoft chose not to buy $12 billion of CoreWeave capacity, even though it had the option, according to Semafor. Now, this may have more to do with Microsoft and OpenAI rejiggering their partnership than with CoreWeave's service, but it nearly derailed the CoreWeave IPO.

Beyond any immediate conflicts, Microsoft is building its own AI chips and says it wants to rely on them in the future. It has its own data centers, and is opening more. It is not difficult to imagine a future where CoreWeave's biggest customer suddenly becomes its biggest competitor; that possibility is listed in the risk factors section of CoreWeave's own filings.

OpenAI entered its own five-year, $11.9 billion agreement with CoreWeave in March, and invested $350 million in CoreWeave to boot.
That deal was expanded in May, and again in September, for what CoreWeave says is a total value of $22.4 billion. This isn't an airtight deal -- OpenAI can terminate part or all of it if CoreWeave repeatedly doesn't deliver on schedule. There are two other things that make this deal risky. First, OpenAI doesn't make money. Second, OpenAI is investing in its own data center project, Stargate, which it says will supply 75 percent of the compute OpenAI needs by 2030. So that's a five-year contract that might not renew.

Meta recently signed a big contract with CoreWeave too, worth $14 billion through 2031. But Meta is also building its own data centers, and sold $30 billion in bonds in October to finance that buildout. So that's three major CoreWeave customers that may be temporary -- and, worse, may turn into competitors.

Naturally, I asked CoreWeave how it planned to compete, if necessary, with these customers who are also building their own data centers. I am going to give you the written response verbatim: Astute readers will notice that this does not answer my question. The dodge suggests to me that CoreWeave doesn't actually have a good answer.

From the IPO documents, there is an unnamed fourth customer that, after Microsoft, is its second biggest. (None of CoreWeave's other customers represented 10 percent or more of CoreWeave's revenue in 2024, according to the S-1.) This is Nvidia, which "agreed to spend $1.3 billion over four years to rent its own chips from CoreWeave," according to The Information. In September, Nvidia signed a $6.3 billion contract that lets CoreWeave interrupt delivery, "giving us the ability to redirect capacity toward other opportunities without compromising overall commitments," Davis said in her written statement.
"As a result, this contract allows CoreWeave to sell shorter-duration capacity to smaller companies that today are not positioned to sign the long-dated contracts we typically sign with our customers without sacrificing our disciplined approach to success-based capex." This has been broadly interpreted as Nvidia backstopping CoreWeave's demand, though it does not mean that Nvidia will buy all of CoreWeave's unused capacity. Spoiler: we've barely gotten started on the Nvidia links.

There are some upsides to CoreWeave that aren't yet on its balance sheet: remaining performance obligations (RPO), or contracted revenue CoreWeave expects but hasn't yet recognized. CoreWeave projects that almost half of its current $50 billion in RPO will arrive by the end of September 2027, according to its most recent quarterly filing. There's also some revenue backlog. But that's the future, and CoreWeave's current debt is substantial. Unless there's a sudden huge uptick in CoreWeave's revenue over the next year, CoreWeave will likely need to borrow more money, Forbes reports. Which brings me to the thing that hangs over the company...

What worries me most about CoreWeave is that I can't figure out how on Earth it outruns its debt. According to the three major ratings agencies, CoreWeave's credit rating isn't investment grade, which means they think there's a real chance CoreWeave won't repay its debts. Junk bond status makes it more expensive for CoreWeave to borrow; junk bonds are more politely known as high-yield bonds, because lenders ask for extra money, ostensibly to compensate them for the extra risk. Don't worry, the gap between interest and income is getting bigger. In its third quarter earnings, reported earlier this week, we've got $51.9 million in operating income -- and $310 million in interest expense. As of the end of September, CoreWeave was $14 billion in the hole with current and long-term debt, about $3 billion more than the quarter before.
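To put those two numbers side by side, here's the arithmetic as a quick sketch -- a back-of-the-envelope coverage calculation using only the Q3 figures cited above, not a full debt-service model:

```python
# Back-of-the-envelope interest coverage for CoreWeave's third quarter,
# using the figures cited above. All amounts in millions of dollars.
operating_income = 51.9    # Q3 operating income
interest_expense = 310.0   # Q3 interest expense

coverage = operating_income / interest_expense
shortfall = interest_expense - operating_income

# A coverage ratio below 1.0x means operations don't earn enough to pay
# the interest bill, let alone repay any principal.
print(f"interest coverage: {coverage:.2f}x")       # ~0.17x
print(f"quarterly shortfall: ${shortfall:,.1f}M")  # ~$258M
```

At that run rate, the quarterly interest bill is roughly six times operating income, which is the structural problem Luria describes below: the company has to keep borrowing to pay interest on the last loan.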
CoreWeave has several kinds of debt, but the main vehicles are the delayed draw term loans (DDTLs), a type of loan that can be structured with particularly complex requirements the borrower must meet to receive the payout. At the end of July, CoreWeave announced its third DDTL, to help finance services for OpenAI. All three loans sit with CoreWeave subsidiaries, called special purpose vehicles, which make it easier for the loans to be chopped up and securitized. The loans are secured by CoreWeave's Nvidia chips and data centers.

I'd like to linger on the special purpose vehicles for a minute. These SPVs aren't subject to the same kinds of regulations as the parent company. They allow funding for projects before a contract is made. They may have tax benefits. They can allow third-party partners to participate in financing. And, depending on how they are structured, they can protect assets in the event of a bankruptcy. In CoreWeave's case, they also allow for a lower cost of capital, Davis wrote. In the OpenAI loan, an SPV holds the GPUs, high-speed fiber optics, and other assets that support the deal. The debt from the DDTL is secured by those assets and the OpenAI contract. OpenAI doesn't have a debt rating; the cost of a deal done this way for an investment-grade customer would likely be even lower. CoreWeave's SPVs "look pretty standard for de-risking incoming capital to an existing company," says May Hen, a fiscal sociologist at the University of Cambridge's Faculty of Law, in an email. CoreWeave chief executive officer Michael Intrator, a former hedge fund manager, is "probably ensuring each part of the influx of capital is protected separately as an investment vehicle."

CoreWeave's loans also have some notable conditions. The first two DDTLs, as noted in the S-1, have variable interest rates, based partly on its customers. Investment-grade customers, for instance Microsoft, mean CoreWeave can borrow at lower rates.
The third and most recent loan requires CoreWeave to bill or receive 85 percent of the revenue it projected in that month and the two before it. So if a big customer, for instance OpenAI, doesn't make its payment, that's a potential default.

This isn't even all the debt. There's a $650 million revolving credit agreement with JPMorgan Chase, which expanded to $1.5 billion in May, and expanded again to $2.5 billion on November 12th. (As of September, CoreWeave had drawn down $700 million of that money.) There are senior notes, a loan from Magnetar, and vendor financing. Each of these types of lending has a different structure, says Gil Luria, an analyst with DA Davidson. "They have to keep borrowing more and more because they spend more money than they can get, structurally," Luria says. "They have to continue to borrow to pay interest on the last loan."

Surely CoreWeave is running a tight ship on its complex financing operations, right? Actually, CoreWeave has already violated some of the terms of its $7.6 billion loan, triggering a technical default, which it had to ask Blackstone to waive. (Even though CoreWeave was paying its loans on time, violating some of the terms means lenders can demand repayment in full.) Fortunately, Blackstone didn't even ask for a penalty payment, according to the Financial Times. Still, this -- in combination with the disclosure of "material weaknesses in our internal control" in the S-1 -- seems significant. Those weaknesses, according to the third quarter report, have not yet been fully remediated. That makes it possible that CoreWeave could screw up its financial statements, though there is no evidence that has happened. "We believe we are making progress," the quarterly statement notes. But the company can't conclude the problems are fully fixed until it has finished all its planned changes and evaluated them.

Energy is one of the limiting factors that basically all AI data centers face.
That's why you see a sudden resurgence of interest in nuclear power from players like Meta and Microsoft. The Stargate facility OpenAI is building might require almost seven gigawatts, which is more power than the entire state of New Hampshire uses. CoreWeave, which is scrambling to build out -- including a deal with something called Poolside to build a new data center in Texas -- is no exception.

There's a fun phrase that occurs twice in the CoreWeave IPO filing: "We have a proven track record of securing power." This track record amounts to a deal with a company called Core Scientific (no relation) "for more than 500 MW of capacity" as of the end of 2024. Core Scientific is a formerly bankrupt Bitcoin miner. As part of its Bitcoin mining operations, it made long-term deals with power companies. Those long-term contracts are worth "many orders of magnitude more" than actually mining Bitcoin, because utilities have finite capacity, says Luria. So Core Scientific is now selling power to its only customer, CoreWeave. "These Bitcoin miners stumbled into a goldmine, completely accidentally," Luria says.

CoreWeave has tried and failed to buy Core Scientific twice now. Sure, buying it would have given CoreWeave a gigawatt of data center capacity, but most of that capacity is already contracted out. And while it also would have meant that CoreWeave could stop paying Core Scientific for its services, I'm not really sure that matters. Core Scientific has its own problems, primarily that it doesn't make any money, either. "Core Scientific is spending a dollar and five cents to get a dollar from CoreWeave," says Luria. "And CoreWeave is spending a dollar to get 95 cents of revenue." Also, it's not like CoreWeave would be getting revenue from other Core Scientific customers, because Core Scientific doesn't have any other customers. So by cutting out the middleman, CoreWeave would be spending $1.05 to get 95 cents, Luria says.
That deal fell through because Core Scientific shareholders thought their company was worth more than what was offered. "The outcome of the acquisition had no impact on CoreWeave's ability to execute against its roadmap," wrote CoreWeave's Davis. In the third quarter, CoreWeave's contracted capacity increased by more than 600 MW. "This increase in contracted capacity in the third quarter exceeds the entirety of our relationship with Core Scientific," she added, characterizing Core Scientific as "nice-to-have, not need-to-have."

But CoreWeave has nonetheless been on an acquisition spree. This year alone it's acquired Marimo, Weights & Biases, Monolith, and OpenPipe, all of which make assorted computing tools. "We've really been buying and building the AI cloud," Intrator said on Bloomberg TV. As near as I can tell, this is meant to differentiate CoreWeave from being just a data center company. "CoreWeave has built an AI cloud platform that includes both infrastructure and software," Davis wrote. Be that as it may, I don't think Microsoft, OpenAI or Meta are there for software tools -- they make their own. And I'm not convinced that the small companies that might need these tools are enough to keep CoreWeave's lights on.

So how is CoreWeave going to get all the compute and power it needs? Part of the answer is that it's building its own data centers. But building is expensive. "CoreWeave has been at the tip of the spear of raising capital for the AI buildout," Intrator noted. "Our growth continues to rage along. It's amazing just how fast we're growing." He went on to say that CoreWeave would continue to raise capital to support its buildout as necessary.

Building is also slow. As Intrator told Bloomberg, CoreWeave can't control every aspect of the build. And construction delays may seriously threaten CoreWeave, because there's one more expense lurking that isn't yet on the company's balance sheet: $34 billion in scheduled lease payments.
Some of those payments are for "data centers and office buildings that have not yet begun to operate or bring in revenue," writes Forbes. So if CoreWeave can't build out fast enough, and customers cancel their contracts, it's particularly vulnerable. When I asked about CoreWeave's plans for meeting its contractual obligations, Davis said that all the contracts in CoreWeave's revenue backlog are matched to specific, existing data centers. But building fast seems particularly crucial to the company. Remember that if CoreWeave repeatedly fails to meet its obligations to OpenAI, OpenAI can yank some or all of its money. If that happens, CoreWeave goes bankrupt, Luria says.

And CoreWeave has already made investors nervous. In its third quarter earnings, the company disclosed that a data center partner hit a delay. According to Intrator, that means a delay of revenue as well; he declined to tell Bloomberg which customer had been affected. CoreWeave's share price plunged. Even CNBC's Jim Cramer, usually a CoreWeave bull, seemed upset. "Some people might think it's one complex, but when I go over the numbers, we're talking about multiple places," Cramer said, noting that the delays were in Texas, Oklahoma, and North Carolina. "And it just so happens that the places are all connected to an outfit called Core Scientific that you tried to buy." An uncomfortable-looking Intrator didn't confirm whether the partner in question was Core Scientific.

CoreWeave isn't alone in its complex finances. Meta took on debt, using an SPV, for its own data centers. Unlike CoreWeave's SPVs, the Meta SPV stays off its balance sheet. Elon Musk's xAI is reportedly pursuing its own SPV deal. "Financial engineering is back in style, making some analysts wonder if all the commitments will be easy to spot," wrote Bloomberg.
There are other massive debt commitments too -- Oracle sold bonds, Google parent company Alphabet has dipped into debt markets twice this year, and something called TeraWulf sold junk bonds for a New York data center. "Morgan Stanley estimates private credit markets could supply over half the $1.5 trillion needed for the data centre buildout until 2028," Reuters wrote. CoreWeave isn't going to be the only one of these companies that hits delays in its buildout, either. It's just more vulnerable than the rest.

"We and many others have been quite forceful in making the case that there is a role in government in helping us," Intrator told Bloomberg, suggesting permitting processes could be made easier. This echoes remarks by OpenAI CFO Sarah Friar suggesting the government should "backstop" the debt for the AI buildout. Asking for government help does not strike me as a bullish sign. And as a taxpayer, I have no interest in helping private companies build data centers so they can keep the profits -- if, indeed, there are any.

Perhaps you looked at the second quarter statement and did some very basic math. CoreWeave had $1.2 billion in revenue, with only $313 million as the cost of revenue. That suggests a margin of almost 74 percent. The third quarter looks similar: $1.4 billion in revenue, with just $369 million as the cost of revenue -- also a margin of about 74 percent. Compare that to Oracle, another major AI data center provider. In October, co-CEO Clay Magouyrk said he expected its AI data center business to "eventually generate a gross profit margin of 30 percent to 40 percent," according to The Information. "Oracle has found it challenging to generate a gross margin of more than 25 percent from renting out Nvidia chips that came out one or two years ago," The Information added, citing an internal document. So let's consider CoreWeave's 74 percent margin. Is the company really executing that much better than Oracle?
I don't think so, and neither does Luria. This is an accounting choice, and one I'm not sure I like. The biggest cost for a data center is the depreciation of assets, which happens fast. Chips get beat up, new chips get released, etc. CoreWeave is reporting this as "technology and infrastructure," a line that apparently captures some other expenses too, and is considered an operating expense. But usually, depreciation goes into the "cost of revenue" figure, Luria says. I checked 10-Q filings for Microsoft, Alphabet, and Oracle for a "technology and infrastructure" category, and didn't find one. Meta writes directly about putting depreciation of its data centers in its cost of revenue.

So let's look at the second quarter again: add the $559 million in depreciation to the cost of revenue, and suddenly, that's an estimated 28 percent margin, which isn't that far off from Oracle. The third quarter is worse. There, we see $747 million in depreciation, which, added to the cost of revenue, gets us an estimated margin of 20 percent. I am calling these estimates because when I asked CoreWeave about the decision, Davis told me "the substantial majority of our depreciation and amortization is within technology and infrastructure, with a small fraction in our cost of revenue." So the figures I'm giving here might not be quite right. She declined to comment on why CoreWeave was using the "technology and infrastructure" line, or why depreciation and amortization had been categorized this way.

Most of CoreWeave's contract prices are fixed, though the company can get bonuses if it delivers compute early, Davis wrote. Operating costs aren't fixed in the same way. If, say, chips get more expensive, that knocks the margin even lower. CoreWeave can't keep costs down by using its own chips, the way some companies, like Amazon, can. If the chips wear out faster, that's a problem, too. Companies like Oracle or Meta have other, higher-margin revenue streams.
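The margin arithmetic above can be restated as a quick calculation. This is a sketch using the rounded figures cited in this piece, and the adjusted numbers are estimates, since CoreWeave won't say exactly how much depreciation sits outside cost of revenue:

```python
def gross_margin(revenue, cost_of_revenue, depreciation=0.0):
    """Gross margin, optionally folding depreciation into cost of revenue.
    All inputs in millions of dollars; figures are rounded, so results
    are estimates."""
    return (revenue - cost_of_revenue - depreciation) / revenue

# Q2: as reported, then with the $559M of depreciation reclassified
print(f"{gross_margin(1200, 313):.0%}")       # reported: ~74%
print(f"{gross_margin(1200, 313, 559):.0%}")  # adjusted: ~27-28%

# Q3: same exercise with $747M of depreciation
print(f"{gross_margin(1400, 369):.0%}")       # reported: ~74%
print(f"{gross_margin(1400, 369, 747):.0%}")  # adjusted: ~20%
```

Either way you round it, the adjusted figures land in the neighborhood of Oracle's 25 to 40 percent, not CoreWeave's headline 74.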
While CoreWeave sells some software, such as what it acquired from Weights & Biases, that's not its main business. Even if we ignore CoreWeave's debt load, its margins are fragile, and dependent on customers renewing or expanding contracts.

After I noticed this, I got a little more interested in the 10-Qs. In the third quarter, there was a drop of almost $1 billion in net cash compared to this period last year. This was partly attributed to an increase in "accounts receivable," which are basically customers' as-yet-unpaid bills. An increase in receivables means there are fewer actual dollars on hand. Sometimes it means a business is stressed -- because customers that delay payments may be under financial strain themselves. Elsewhere in the 10-Q, CoreWeave notes that its customers usually have to pay within 60 days, but some can wait as long as 360 days to pay. I asked if the increase in receivables meant some customers were having difficulty paying. "CoreWeave does not have any issues with late payments from customers," Davis said.

This brings us to the final problem: we don't know if companies will keep buying the main thing CoreWeave has to sell. It's possible they will. "What would allow them to dig out of the debt hole is that they have developed superior software to the rest of the market that makes them a magnet for leading AI providers, and the fact that they're able to charge hyperscaler-like prices while remaining flexible and able to offer early access to the latest GPUs," says Brendan Burke, an AI industry analyst. Davis, the CoreWeave spokesperson, noted that CoreWeave "recently earned SemiAnalysis' highest honor (again)." That reputation seems to confer, at least, a pricing advantage. CoreWeave charges $68.80 per hour for a cluster of Nvidia B200s, according to Cloud GPUs, a site that compiles GPU pricing. The next-most expensive provider, Cirrascale, charges from $38.32 to $47.92.
The CoreWeave premium reflects that "not every data center is created equal," Burke says. "There's a wide spread in reliability between different data centers on the market." If CoreWeave maintains its engineering advantage, Burke thinks CoreWeave will continue to attract large customers. That's not a given in the long term. And if CoreWeave has to compete on price, that's a problem. There's also the question of demand. Right now, compute is one of the hottest commodities in AI. But the demand is mostly for training models, says Alex Hanna, the director of research at the Distributed AI Research Institute. Inference -- basically, customer use, a more predictable ongoing expense -- doesn't come close. Chirag Dekate, an analyst at Gartner, concurs. This might work out for CoreWeave in its best-case scenario. Let's say companies require more and more compute to train ever more powerful models, and customers basically immediately adopt AI and start using it heavily. In this scenario, no one ever has to stop building data centers, and there's always going to be more demand for compute than Google, Amazon et al. can provide. If CoreWeave can somehow stabilize its debt or charge more for compute, maybe things turn out fine. Or maybe CoreWeave can just keep borrowing and building forever. Obviously if AI flops, CoreWeave is fucked along with everyone else. What happens if it's important but basically normal technology? Let's just assume AI isn't going to surpass human intelligence in 2030 and upend everyone's job like Sam Altman's saying. Let's say it's more like online shopping. Online shopping, of course, is now ubiquitous. But its adoption curve left behind a graveyard of companies from the first dot-com boom. Take Pets.com, which shut down in 2000, and was, at the time, the mascot for dot-com failure. In 2017, Chewy, which was basically the same idea, was bought by PetSmart for $3.35 billion, which at the time was the largest-ever ecommerce acquisition. 
The main difference between Chewy and Pets.com is that in the 17 years between Pets.com's failure and Chewy's massive sale, people got comfortable with buying online. Or consider streaming video. Pseudo.com, an early audio and video streaming service founded by Josh Harris, filed for bankruptcy in 2000. Now, 25 years later, video streaming is eating both television and movie theaters. The difference is that... among other factors, people aren't on dial-up anymore. Harris' idea was right, but the tech wasn't ready yet. So if AI is a normal technology, it's going to take time for people to adopt it and for companies to work out the kinks.

A lot of demand assumptions have presumed that AI will flow out to low-tech industries, but so far its use is concentrated in fairly sophisticated ones, Burke says. The physical world is still relying on old computing techniques, and not every business can afford the investment it would take to shift to AI, even if it were a less risky technology than it is now. "The idea that this is a complement to our existing systems rather than a fundamental thing suggests we don't need to exponentially increase the volume of computing," Burke says. Big companies like Microsoft don't have to bet their entire company; they can just bet a portion, leaving them flexible to adapt to whatever the future holds. But companies like CoreWeave that are all in on AI have taken on a huge debt load that ties them to one specific future, Burke says.

Meanwhile, whether training models will keep sucking up more and more power is anyone's guess. Remember several months ago, when everyone panicked over DeepSeek because it was so effective and so cheap to train? People have calmed down since then. But 90 days ago, Luria says, major companies were telling him that compute supply was catching up with demand. A few weeks ago, those same companies reversed themselves: demand had surged again. What will happen next week?
There's been lots of talk about how many Americans use AI daily -- a fifth of Americans, if Menlo Ventures is to be believed -- but many of these products are available for free, which may not be sustainable forever. The highest rate of use is among students, Menlo found. Whether people will pay for ChatGPT seems like it's still an open question. I am skeptical that AI is really a consumer technology -- it's not clear the new web browsers and "AI agents" are actually all that useful. The fact that OpenAI has released a slop social network doesn't seem like a great sign. What's more, most surveys of consumer behavior suggest that people don't really trust AI. There's good reason for that! There are probably real enterprise uses for AI, since that category is significantly broader than fact-challenged chatbots. The people I know who are most excited about AI say that it makes them significantly faster at coding. But 95 percent of businesses are getting zero return on their AI investments, according to the MIT Media Lab. What's more, enterprise-grade systems "are being quietly rejected" by organizations because they tend to, well, fail. That could change. But it might change slowly, and it might mean specialized small AI programs, rather than big general ones. If the demand for AI grows slowly, the breakneck pace of data center expansion will also slow. "There are a million ways CoreWeave can fail, but people believe in this very rosy scenario where we build data centers forever," Luria says. "I talk to a lot of investors, and they tell me it doesn't matter. As long as there's insatiable demand for compute, CoreWeave will continue to grow." Luria told me that many of the big players in the space are willing to overbuild, because other parts of the business can use those centers. All they really have to do is pause spending for a couple years. CoreWeave doesn't have that luxury. 
If the current data center frenzy results in supply meeting or outstripping demand, it's fucked -- because it's the overflow facility for the big players. It is perhaps time to discuss the enormous stock sales from CoreWeave's management team. Before the company even went public, its founders sold almost half a billion dollars in shares. Then, insiders sold over $1 billion more immediately after the IPO lockup ended. Magnetar, one of CoreWeave's biggest shareholders -- and one of the companies financing CoreWeave's GPU deals -- also sold almost $2 billion. "It's noteworthy that people who have a good view on that business are cashing out," says Leevi Saari, a fellow at the AI Now Institute. If you thought CoreWeave was going to keep growing in a once-in-a-lifetime fashion, wouldn't you want to hold onto those shares and see if you could out-rich, like, Larry Ellison or Elon Musk? Of course, if you think CoreWeave's stock price is a flash in the pan, you might want to get out while the getting is good. "Stock sales of this kind are common and routine for newly public companies," CoreWeave's Davis wrote. "The recent selling has primarily come from early investors. Any founder sales were made under non-discretionary pre-scheduled 10b5-1 trading plans for standard tax and wealth planning purposes and represent a small portion of their overall holdings." Despite their sales, CoreWeave founders retain control of the company through Class B shares. Much like Meta, the dual-class structure means that founders effectively get extra votes through these shares, which then convert to regular Class A shares for sales. Davis declined to say how much voting power the founders had after the most recent sales. So some people have gotten very rich from AI, largely from selling shares in speculative investments. "There's a bubble, and CoreWeave is likely to fall in the very near future," says Saari. "I think it is retail that will be left holding the bag." 
I started kicking the tires on CoreWeave because I thought it might be a good place to think about the economics of AI data centers. But the company seemed like such a weird outlier that I, uh, got kind of derailed. It seems to me like CoreWeave is a great idea for everyone except CoreWeave. AI's actual end users, the real clients, are coming to Microsoft, OpenAI, and the other big tech companies. CoreWeave lets these companies take on more demand without splashing out on depreciating assets by supplying extra data center capacity until their new centers get built. Once that happens, they may not need CoreWeave at all. CoreWeave doesn't have meaningful lines of business besides selling compute. It is dependent on a few big contracts. People who are bullish on CoreWeave are essentially saying that we are going to have to build data centers forever and ever, amen. That seems unlikely, even if there is massive demand for AI, which isn't necessarily a given. While all of CoreWeave's clients benefit from CoreWeave eating their risk, there's a company that benefits most: Nvidia. It is, after all, CoreWeave's investor, customer, and vendor. Two emails requesting comment to Nvidia's press contact -- and two more to Ken Brown, its head of corporate communications -- were not returned. CoreWeave's dependence on Nvidia isn't a secret -- Intrator has bragged about CoreWeave's relationship with Nvidia, saying he is "not bashful about reaching out" to Nvidia CEO Jensen Huang. CoreWeave's Davis declined to comment on how often the two talked, or what about. It makes a certain kind of cynical sense to view CoreWeave itself as, effectively, a special purpose vehicle for Nvidia. Here's a stylized way of looking at it: Nvidia can spend one dollar -- in investment or other payments -- to prop up an entity that promptly buys five dollars of Nvidia chips, using money borrowed from other companies. 
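That stylized "one dollar in, five dollars of chips out" framing can be sketched in a few lines. This is a hypothetical back-of-the-envelope model, not CoreWeave's actual books: the `chip_purchases` helper and the 4x outside-leverage assumption are mine, chosen only to reproduce the article's ratio.

```python
# Back-of-the-envelope sketch of the stylized "Nvidia SPV" framing.
# All figures are illustrative, not any company's actual financials.

def chip_purchases(nvidia_dollars_in: float, leverage: float = 4.0) -> float:
    """Dollars of Nvidia chips bought per dollar Nvidia puts in,
    assuming the entity borrows `leverage` dollars of outside money
    for each Nvidia dollar and spends the whole pile on chips."""
    return nvidia_dollars_in * (1 + leverage)

# One Nvidia dollar of support -> five dollars of Nvidia chip sales.
print(chip_purchases(1.0))  # 5.0
```

The point of the sketch is only that the multiplier comes from other people's borrowed money, which is why the arrangement amplifies both Nvidia's sales and everyone's downside.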
Nvidia directly profits from the sale, and also creates competition for its chips, giving it leverage over the Microsofts and Amazons of the world. All without Nvidia taking on CoreWeave's debt, because CoreWeave is a separate company. "If Jensen can keep the competitive fire on, that might discourage hyperscalers from developing their own chips," says Saari. CoreWeave and other so-called neocloud companies help Nvidia keep its market share and push itself as the de facto infrastructure in the market. "CoreWeave and Nvidia have a long-standing partnership that's consistent with Nvidia's relationships across the industry and important for building out access to AI infrastructure to meet today's relentless demand," CoreWeave's Davis said in a statement. She added that CoreWeave is an independent company that doesn't receive preferential treatment from Nvidia. She's absolutely right about the relationships across the industry. Nvidia has invested in a number of neoclouds, such as Crusoe and Lambda. Both of those companies have taken on significant debt, sometimes using their Nvidia chips as collateral, to buy -- you guessed it! -- more Nvidia chips. In Europe, there's Nebius, running Nvidia chips with an Nvidia investment. It too has taken on debt to buy more Nvidia chips. "There is no neocloud that exists without Jensen," says Saari. That makes neoclouds, in effect, extensions of Nvidia, he says. And none of them make money, so to expand, they must take on debt. If we look at these as being, metaphorically, Nvidia's special purpose vehicles, then it doesn't really matter if the companies are any good or will survive in the long term. Their job is to boost Nvidia's sales. Even OpenAI, also an Nvidia investment, kind of falls into this category -- because the massive data center buildout that OpenAI wants the government to backstop sure involves an awful lot of Nvidia chips. 
If you are old enough, or possessed of a certain kind of disposition, you may be thinking, Wait a minute, aren't you describing Enron? And uh, in some sense, yes! Enron's whole thing was special purpose vehicles with extremely speculative valuations that were used to take on debt, Luria notes. But Enron lied about what it was doing, and that's fraud and illegal. (It also got up to other illegal stuff besides.) Nvidia's relationship with CoreWeave is all happening in plain sight. So are all the relationships with the other neocloud companies. It kind of seems like the tech company version of the GameStop open pump-and-dump. "It's not good behavior, and it's not healthy behavior," Luria says. "But it's legal. Any investor can see this. Many are just choosing not to." Nvidia's investments have been accelerating -- $3.8 billion in nonmarketable equity securities as of the end of July, a $2 billion increase from the year before. Nvidia made four investments in 2020; it has made 51 as of September 2025, according to CNBC. "The majority of Nvidia's portfolio companies have some strategic connection to the company's business," CNBC noted. Huang has been selling shares -- more than $1 billion since June. To put that in perspective, Huang has sold almost $3 billion of Nvidia shares since 2001, according to Bloomberg. It's not just Huang. Nvidia insider sellers have been unloading throughout the third quarter. With Huang's sales, the insider figures stand at $1.5 billion in that quarter. "While many companies have benefited from the AI gold rush, Nvidia stands alone in a virtually unheard-of feat of wealth creation with seven billionaires, including Huang, among its ranks," Bloomberg wrote of the recent sales. Maybe I am being unfair. Nvidia has been making a lot of money over the last few years, and generally investors expect companies to do something with that money. 
One could argue Nvidia's profits are so outsized it's exhausted all the usual methods, like share buybacks and acquisitions. But by investing in AI companies with ties to Nvidia itself, Nvidia has added more risk -- because if something goes wrong for Nvidia, it goes wrong for all these investments, too. I'll tell you something: diving into CoreWeave financials has made me feel fucking crazy. Like, here I am suggesting that a bunch of independent neocloud companies look like Nvidia SPVs. But the real thing I learned from CoreWeave has been that Nvidia is basically propping this company up, with some help from Microsoft, while Huang signs titties or whatever. If that's indeed what's happening, then -- as the song goes -- enjoy yourself, enjoy yourself, it's later than you think.
[2]
The 'S&P 493' reveals a very different U.S. economy
A trader works on the floor of the New York Stock Exchange on Nov. 19. (Charly Triballeau/AFP/Getty Images)

On its face, 2025 has been a good year for the stock market. The S&P 500 was dragged out of its tariff-induced springtime slump by a small subset of AI-forward power players whose spectacular gains defied an otherwise softening economy. Even now, despite a rocky November, the benchmark index is up more than 12 percent since the start of the year. A group of trillion-dollar brands known as the "Magnificent Seven" -- Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia and Tesla -- has been at the forefront of those gains, thanks in large part to corporate spending and intense interest in artificial intelligence. But economists and investors are raising concerns about the companies that aren't part of the AI investment boom -- in other words, most businesses in the United States. An index that leaves out the seven high-flying tech firms -- call it the S&P 493 -- reveals a far weaker picture, as smaller and lower-tech companies report lackluster sales and declining investment. "You have the headwind of de-globalization and tariffs, and the tailwind of AI ... those forces are battling to a draw, and in that crosswind you get winners and losers," said Moody's Analytics chief economist Mark Zandi. "Anything that is not connected to AI is throttled lower." When OpenAI unveiled its chatbot ChatGPT in late 2022, it sparked a surge in AI investment, and a handful of tech companies that provide infrastructure and support around the new algorithms -- the picks and shovels for an AI gold rush -- caught the wave. (The Washington Post has a content partnership with OpenAI.) AI spending has supercharged the valuations of the Magnificent Seven ever since. Shares of Nvidia, a longtime manufacturer of video game graphics cards that became the AI chipmaker of choice, have soared more than 1,000 percent since January 2023 as of Friday's close. 
The pace of growth cooled this year -- Nvidia is up 29 percent in 2025 -- and it's leveled off at Amazon, Meta and Tesla. (Amazon founder Jeff Bezos owns The Post.) The data company Palantir has carved out a niche helping large organizations apply AI to their operations, doubling its market value since the start of the year. Micron rose more than 130 percent on the strength of its memory chips, and Ohio-based Vertiv rose 35 percent for its data-center cooling systems. The chipmaker Intel rose 70 percent even as it laid off thousands of employees. Some experts are worried that the S&P 500, an index of large-company stocks that underpins the fortunes of millions of Americans with 401(k) and other retirement accounts, has become too reliant on the Magnificent Seven; they collectively account for about a third of its value, leaving the broader stock market heavily dependent on the continued success of "the AI trade," says Torsten Slok, chief economist at the private equity firm Apollo Global Management. "There is no diversification in the S&P 500 anymore in my view ... it is all the AI story now," Slok said. S&P Dow Jones Indices spokeswoman Alyssa Augustyn said the objective of the S&P 500 index is to track the performance of large companies. She added that the index is "consistent with the sector diversification for the broader market," referring to 11 industry sectors including health care, financials and information technology. Publicly traded small and midsize companies have taken a beating by comparison. The Russell 2000, an index made up of the smallest 2,000 companies on the public markets, lost 4.5 percent in the one-month period leading up to Friday, compared with a loss of around 2 percent for the S&P 500. Smaller companies have posted lackluster financial results recently, said Wells Fargo senior market strategist Scott Wren, who notes that a little more than a third of the companies in the Russell 2000 index either don't make money or are losing money. 
Smaller companies are being hit harder by a slowing economy, he said, as they have less of a cushion to absorb import price increases resulting from tariffs, and less flexibility to avoid the new duties by shifting their supply chains. One analysis from JPMorgan and Moody's shows capital expenditures -- a measure of how much businesses are spending on physical assets like buildings, machinery or patents -- is close to flat for companies not connected to AI, which economists see as a worrying sign that low-tech businesses aren't growing. Smaller companies are also more likely to rely on debt to fund their operations, Wren said, something that makes them more sensitive to changes in interest rates. Investors in recent weeks have become more skeptical that the Federal Reserve will cut rates again in December, driving small-company stocks lower, Wren said. In a sign of its sensitivity to interest rates, the Russell rallied 2.8 percent Friday after a Fed official hinted in a speech that "further adjustment in the near term" for interest rates could be needed, spurring a broader stock market rally. "If the concern also is that the employment picture is not very good, and inflation remains sticky, that could be extra bad for small caps," said CFRA chief investment strategist Sam Stovall, referring to companies that have smaller market capitalizations. Broader uncertainty in the markets has also taken a toll on stocks in recent weeks, analysts said, as some investors have rushed to the relative safety that larger companies can provide. Risky assets such as cryptocurrency have suffered in recent weeks, with bitcoin recently sinking below $90,000 for the first time since April. Gold has surged in value this year while the dollar sank, reflecting a lack of investor confidence in traditional safe havens. 
"Given the greater sensitivity of small companies to domestic economic trends, investors have been reducing exposure to small caps in favor of large cap companies who continue to benefit from global growth related to AI-based technologies," said Wayne Wicker, president of Opal Capital. The market's concentration in Big Tech has also given rise to concerns about what would be left if an AI bubble were to burst. Those fears have been amplified in recent weeks as Big Tech names suffered a modest sell-off, with some analysts raising concerns that the AI industry has overspent on infrastructure at a time when the technology's actual profit-generating potential is still nascent. Advanced Micro Devices, which manufactures a wide range of data-center electronics, lost 16 percent in a rough week of trading, trimming a rally that brought it up nearly 70 percent for the year. Michael Burry, the hedge fund manager who grew to prominence by betting against the housing market before its 2008 collapse, has accused leading AI companies of overstating the long-term value of certain assets. AI Icon New!Get more context or dive into the details with Ask The Post AI. Burry and others have argued that an inordinate amount of AI spending seems to come in the form of different corporate entities spending on one another, while the timing for a return on those investments is unclear. Tech stocks have endured a series of rocky sell-offs since late October, with the tech-heavy Nasdaq index falling around 7 percent from its Oct. 29 peak. Markets rebounded Friday, with the index trimming some of its losses from earlier in the week. Slok, the Apollo economist, says he is particularly worried about the recent AI losses because so much of the recent economic growth has been shored up by free-spending wealthy households. A deep correction in AI stocks, if it ever arrived, could threaten the "wealth effect" that is doing so much to prop up the economy, Slok warned. 
"Consumers and corporates alike have become vulnerable to whether the AI story continues or not," Slok said.
[3]
Nvidia's rise seemed unstoppable, but cracks may be appearing in the strategy that built its $4.5 trillion empire | Fortune
In late October, Nvidia cofounder and CEO Jensen Huang took the stage at the company's annual GTC conference to make a typically sweeping declaration. Nvidia, he pronounced, sits "at the epicenter of the largest industrial revolution in human history," eclipsing the advent of the steam engine and electricity. He went on to unveil a stunning array of partnerships: Nvidia plans to build 100,000 driverless cars alongside Uber and join Palantir to supply software and chips to accelerate the transfer of products from warehouses to doorsteps; it has also hatched a blueprint showing "hyperscalers" how to build "AI factories," giga-scale data centers that operate at their most potent and efficient only when deploying, guess what, Nvidia systems. Huang's speech so pumped Nvidia fans that they sent its market cap soaring 5%, a one-day, $250 billion jump that's 60% bigger than Boeing's entire worth. The move made Nvidia the world's first company to touch a $5 trillion valuation. According to MarketBeat, 46 out of 47 analysts have a "strong buy" or "buy" rating on Nvidia stock. A small chorus of extremely insightful skeptics, however, isn't so sure everything is quite as rosy as Huang and investors seem to believe. Their worries coalesce around how Nvidia is striving to maintain its supremacy, by assembling what we've never seen before on this scale: a complex superstructure encompassing investments and financing for its own customers designed to boost and perpetuate demand for its own products. The strategy -- centered mainly on OpenAI and "neocloud" CoreWeave -- is as much about financial engineering as accelerated compute engineering. As Jay Goldberg of Seaport Global Securities, who's issued the only Nvidia "sell" on the Street, puts it, "Nvidia is buying demand here." Lisa Shalett, chief investment officer at Morgan Stanley Wealth Management, says the issue is all about excessive debt, not Nvidia's but its customers'. 
"Nvidia is in a position to prop up customers so that it's able to grow," avows Shalett. "It's getting more and more complicated because the ones they're funding are weaker, and Nvidia's enabling them to take on borrowing." Nvidia declined to comment for this story. But to understand how this powerful symbol of the new American economy got here, you have to grasp three things: the power dynamics at play among the big hyperscalers like Amazon, Microsoft, Alphabet, and Meta that are erecting giant AI data centers; the neoclouds rising to challenge them; and Jensen Huang's most gripping fear. Nvidia's path to AI dominance During most of its 32-year existence, Nvidia focused on making 3D graphics chips for the PC gaming market -- and struggled in the early days, at one point running out of cash. In 1999, it unveiled a super-advanced GPU (graphics processing unit) that generated images by handling complex computations at a far higher speed than its predecessors. A series of breakthroughs in the 2000s enabled a kind of AI known as "deep learning" -- and allowed the technology to do things it had struggled to do previously -- first translation, then image recognition. By around 2012, Nvidia had gone all in on GPUs and other hardware optimized for AI, which it coupled with its custom CUDA software that empowers developers to write applications for its groundbreaking chips. Once OpenAI released ChatGPT in late 2022, the AI boom was starting, and Nvidia had brilliantly positioned itself to ride the wave. Its GPUs ranked as the industry's most powerful and, crucially, most efficient per unit of power consumption in running the stupendous number of mathematical calculations deployed in AI. In fiscal year 2023, it made a modest $4.4 billion in profits. That figure has soared to $86.6 billion over the past four quarters, making Nvidia the largest tech earner behind Alphabet ($124.3 billion), Apple ($112 billion), Microsoft ($104.9 billion), and ahead of Amazon and Meta. 
Unlike past high-fliers that got heavily touted and collapsed -- notably drivers of the telecom bubble of the late 1990s and early 2000s -- nobody is questioning that Nvidia makes tons of money. But it's apparent that the chip colossus is reliant on a few huge customers. In the second quarter of this year, Nvidia collected 52% of its sales from three customers it didn't disclose, but which analysts identify as Microsoft, Amazon, and Alphabet. Meta and Elon Musk (via xAI and Tesla) are also big buyers of its systems. Diversifying internationally is not easy, especially as the Chinese government cut off imports of Nvidia's lower-powered chips as a riposte in the trade war, and the Trump administration's export controls have blocked sales of its most coveted and sophisticated GPUs. Last year, the world's second-largest economy contributed more than one in eight dollars of sales via such clients as Baidu and Alibaba. That number has already been shrinking and is bound to fall further if not disappear, potentially heightening Nvidia's reliance on the biggest of big players. Huang believes that leaning so heavily on a few, superpowerful buyers is dangerous. According to a source who's been active in developing AI infrastructure, Nvidia's prime motive is avoiding the fate of almost all tech hardware suppliers: getting commoditized. "Hardware is a low-margin business, and Nvidia knows that you can only maintain a technological edge in hardware for so long," says this source. "They're afraid of getting commoditized like the makers of CPUs [the basic "brains" of today's computers]. They're afraid their bargaining power will decline over time and that their huge margins will collapse." Right now, Nvidia's a wondrous exception in tech hardware. In fiscal year 2025, it registered fantastic gross margins of nearly 80%, easily beating AMD, at just under 50%, not to mention Intel, at 30%. Until now, Nvidia has held the whip hand. 
"Its GPU is so superior that it has a near monopoly," says analyst Goldberg of Seaport. As a result, an Amazon or a Meta hasn't been able to deploy the tremendous negotiating clout their immense size gives them versus their hardware suppliers. The big hyperscalers are striving hard to regain the advantage. How? The major players don't just want to be Nvidia customers; their long-term goal is to compete with Nvidia by fashioning their own in-house alternatives. Microsoft is working on a version dubbed Maia, and Amazon has deployed two -- Trainium and Inferentia. Google has already largely shifted to its own silicon, which it calls a TPU. All this competition is likely to depress prices. "Most AI to date has been built on one chip provider. It's pricey," Amazon CEO Andy Jassy wrote in its 2024 annual report. "It doesn't have to be as expensive as it is today, and it won't be in the future." Microsoft's top executives have also stressed that they want their own silicon for powering their data centers. Gil Luria, head of technology research at D.A. Davidson, puts it simply: "Nvidia wants to diversify away from the Big Four, and the Big Four want to diversify away from them." ("Big Four" here meaning Microsoft, Amazon, Google, and Meta.) But plenty of other players want to diversify into Nvidia's world. AMD is becoming a big player in inference, where Nvidia dominates; Qualcomm is trying to get into the data center chip game, as are a slew of startup AI chip companies like Groq, which are beginning to get some traction, especially in the Middle East. Even OpenAI, which is forging tight bonds with Nvidia, is simultaneously going all in on making its own silicon as it tries to diversify away from Nvidia. Dylan Patel, who writes the widely respected SemiAnalysis newsletter, wrote in mid-November that OpenAI's entrant may be so good that Microsoft may choose to use it over its own product, Maia. 
So the relationships are increasingly complicated as these players strike deal after deal. CoreWeave's biggest customer by miles is Microsoft; it wants to broaden its reach to serve OpenAI and other AI startups. OpenAI meanwhile is a CoreWeave customer, a CoreWeave investor, and soon to be a CoreWeave competitor once the Stargate consortium gets up and running. But within this web of relationships, it's two that Nvidia has developed -- with OpenAI and CoreWeave -- that have caught the attention of the "what could go wrong" crowd. "Nvidia's been clever in propping up another ecosystem of their own so that they can compete with the big hyperscalers in data center capacity," says the person involved in AI infrastructure. To do so, Nvidia is using something akin to "vendor financing." It's the long-standing, totally legitimate practice of making loans to customers that spur them to buy your products -- it's how automakers boost sales by providing car loans. It works fine as long as there's real demand for your products. The Nvidia approach might be characterized as a push for "direct" or "curated demand." That formula's downside: The lavish financing can trigger excessive borrowing and push AI computing capacity way beyond the point where demand can catch up. However, it's important to note that Nvidia's campaign doesn't amount to the notorious "roundtripping" that sunk such telecoms as Nortel and Lucent, which made big loans to strapped telecom fiber-optic purveyors so they could buy their products, even though they were just piling up unused inventory. So far, Nvidia's boldest move in challenging the behemoths is its breathtaking deal with OpenAI. Huang considers the still-private inventor of ChatGPT as the spearhead of the new paradigm, and by implication, Nvidia's largest customer going forward. He recently predicted that OpenAI will become the "next multitrillion-dollar hyperscale company." 
Prior to the new tie-up, OpenAI was just getting started as a hyperscaler via its participation in Stargate. It didn't operate data centers on its own and rented capacity from the Big Four, notably Microsoft. The Nvidia partnership aims at making OpenAI a major competitor, in addition to its status as a customer, to the giant incumbents. Under the arrangement, announced in late September, Nvidia would purchase up to $100 billion in OpenAI equity, enabling it to outfit numerous data centers offering an astounding 10 gigawatts of capacity at an estimated total cost of roughly $500 billion. The build-out would happen in tiers, and for each one Nvidia provides $10 billion in financing, OpenAI throws in $40 billion, and an estimated $30 billion gets spent on Nvidia chips. But Luria cautions that a lot could go wrong. OpenAI needs to find the extra $40 billion required for each installment on its own. That's a high bar. To this point, OpenAI has succeeded in funding its operations by raising equity. But it's now entering a new era of big capital investments at the same time it's bleeding loads of cash; in the first half of this year, it lost $13.5 billion on just $4.3 billion in revenue. A recent report suggests the company will see operating losses of $74 billion in 2028 and not break even till 2030. Hence, it's likely that OpenAI will need to borrow most of the additional $400 billion. OpenAI has now made total commitments of around $1.4 trillion for its AI infrastructure projects, including $300 billion worth to be built by Oracle (which incorporates huge purchases of Nvidia GPUs). Given the frenzy of optimism surrounding OpenAI, it may succeed in borrowing hundreds of billions to amass all those GPUs. But as Nick Del Deo, analyst at MoffettNathanson, notes, "OpenAI is an unproven business model." Significantly, Nvidia is only an equity investor so far. It hasn't offered its balance sheet to backstop the immense debt OpenAI will need to push the program forward. 
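The tier arithmetic above checks out on a napkin. A minimal sketch, using only the article's per-tier figures; the ten-tier count is an assumption implied by the roughly $500 billion total, not something the article states outright, and the variable names are mine.

```python
# Rough arithmetic behind the Nvidia/OpenAI build-out tiers described above.
# Per-tier figures are the article's; the ten-tier count is inferred.

NVIDIA_PER_TIER = 10   # $B of Nvidia financing per tier
OPENAI_PER_TIER = 40   # $B OpenAI must raise on its own per tier
CHIPS_PER_TIER = 30    # $B of each tier estimated to go to Nvidia chips
TOTAL_COST = 500       # $B estimated cost of the full 10-gigawatt build-out

tiers = TOTAL_COST // (NVIDIA_PER_TIER + OPENAI_PER_TIER)  # 10 tiers
openai_total = tiers * OPENAI_PER_TIER      # 400 -> the "$400 billion" OpenAI must find
chip_sales_total = tiers * CHIPS_PER_TIER   # 300 -> chip purchases across all tiers

print(tiers, openai_total, chip_sales_total)  # 10 400 300
```

The $400 billion line is why the financing question matters: Nvidia's $100 billion is equity, while the rest has to come from somewhere else, mostly debt.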
Adding one more wrinkle to the competitive landscape: OpenAI doesn't want to be completely beholden to Nvidia, either; it is talking to other chip providers and has even begun efforts to design its own bespoke chips. Nvidia secured its second marquee partnership with CoreWeave. It's the largest by far of the "neoclouds" that are rapidly installing AI infrastructure. The former crypto miner, however, is struggling under a mountain of debt that analysts are skeptical it can "scale" out of. So Nvidia is giving the company lots of help to buy its chips -- in far larger quantities than would be possible without those billions in support. Nvidia invested $250 million to bolster CoreWeave's troubled IPO this year and holds an ownership stake of over 6%. CoreWeave, like OpenAI, is so far an "Nvidia-only" house. It already operates 33 data centers, all holding its big backer's GPUs, and boasts a $56 billion backlog of contracts for AI infrastructure. Nvidia is also using its muscle to help CoreWeave attract promising startups that, without the chipmaker's backing, couldn't raise the financing to rent space from the rising hyperscaler. In late September, Nvidia signed an agreement guaranteeing to purchase $6.3 billion in computing capacity if CoreWeave can't lease or sell it. "It's like cosigning on a loan," says Del Deo. According to a source familiar with CoreWeave's thinking, that support will empower the operator to provide computing capacity to such outfits as Mistral and Cohere that could evolve into big AI success stories. For Nvidia, CoreWeave fits the OpenAI mold: attract purchasers that used Microsoft or Amazon centers -- powered by GPUs they bought from Nvidia -- to CoreWeave facilities. In an excellent 80-page report on CoreWeave released in March, Del Deo praises its technical expertise and customer service. 
The potential problem: When CoreWeave signs a contract with a financially sound major hyperscaler that needs lots of capacity beyond what it can handle in-house, those rental payments are "money good." CoreWeave can raise financing, since it's certain the client (say, Microsoft) will pay. But as part of growing its own and the new Nvidia complex, CoreWeave is taking on cash-burning newcomers that lack credit ratings. The biggest case in point, in fact, is its arrangement with OpenAI itself -- another indication of how deeply the Nvidia deal-grid is intertwined. The terms call for CoreWeave to supply $22.4 billion in AI infrastructure capacity, certain to run on Nvidia GPUs, over about five years. Based on the great expectations for OpenAI, CoreWeave already financed part of the arrangement through bank loans. But the borrowings were expensive at over 8%. "The deal with OpenAI and other non-investment-grade clients raises the risks for CoreWeave," concludes Del Deo. In addition, CoreWeave, like OpenAI, harbors big cash flow deficits and is borrowing heavily to finance its mushrooming footprint. Its cofounder and CEO, Michael Intrator, claims he's driving a "race car, not minivan," and that "debt is the fuel for this company." But could it also be its downfall?

A debt debacle?

To see where the dangers lie, it's important to grasp who owes all this debt, and to what parties. The brick-and-mortar data center shells are generally built by real estate companies, chiefly REITs. They finance the projects with construction loans provided by, say, banks or private equity firms, and lease the space to the hyperscalers. The biggest among them may use their own cash flow to buy the systems that fill the campuses, but a CoreWeave or OpenAI doesn't have the resources and must borrow the funds from a variety of lenders. 
Once again, if demand from Microsoft's enterprise and app customers proves much lower than anticipated, it will simply keep paying the lease on the building to the REIT and interest and principal on the loans for the tech gear to that lender. No risk there. The new, self-assembled Nvidia galaxy is heavily populated with the likes of OpenAI and startups that haven't proved they can make money based on their AI offerings, but are poised to shoulder big borrowings for creating and expanding AI infrastructure empires. CoreWeave and OpenAI are indeed signing leases with their AI customers to rent that forthcoming deluge of computing capacity as it comes online. But to keep renting, these customers need to mint rich profits from what's advertised as the greatest technological leap and force multiplier for profitability ever. If that doesn't happen, the market could get flooded by partly vacant data centers and by GPUs that securitized the borrowings and get reclaimed by the lenders, sending all those unused Nvidia chips back to the market. In this scenario, the new hyperscalers Nvidia backed won't be able to pay the rent on the campus buildings, nor the interest and principal on the debt that bought the GPUs. That could wallop lending for data centers and demand for Nvidia's GPUs. That scenario isn't dissuading true believers. "Nvidia's stayed ahead time and time again," says one executive from a large Nvidia customer who was not authorized to speak publicly. "They keep shortening the product cycles. Everyone is playing catch-up with their continued innovation." But any clearheaded industry watcher has to at least raise an eyebrow about how much money is flooding into AI, and how little payoff there has been so far. All told, the four big players will spend around $330 billion on AI infrastructure in 2025, according to their own forecasts, and all are pledging significant increases next year. 
Citigroup forecasts that the figure for all hyperscalers next year will reach $490 billion. So let's take the conservative view that the Big Four will spend $700 billion over the two-year span. To garner even a decent 15% return on those expenditures alone, they'd need to pocket an extra $105 billion a year in AI profits. That additional $100 billion-plus amounts to nearly one-third of the $350 billion in total GAAP net profits that they generated in their past four quarters. Notes one executive who's raised large sums for AI infrastructure in the past: "I share the concern that such massive amounts of capital are going into it without a clear line of sight on profitability. We have a sense for productivity savings and research breakthroughs. But what are the killer apps that are going to drive massive value? We don't know yet." This executive asks the question that Huang may ponder when he stops selling and thinks deep thoughts: "If the return on all that capital is disappointing, where does the pain go?" As long as the AI hype narrative continues, Nvidia's money-spinning machine will keep whirring. But if the story changes, Nvidia will feel the pain -- and so will its investors. This article appears in the December 2025/January 2026 issue of Fortune with the headline "Nvidia is invincible. Unless it isn't."
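The spending-versus-return gap described in this piece can be checked with back-of-the-envelope arithmetic. The sketch below uses only the article's own estimates (the $700 billion two-year figure, the 15% hurdle, and the $350 billion in trailing profits); the calculation itself is illustrative, not a forecast.

```python
# Back-of-the-envelope check of the return hurdle on Big Four AI capex.
two_year_capex = 700e9   # conservative Big Four AI spend, 2025-2026 (article's figure)
target_return = 0.15     # a "decent" annual return on that capex

required_annual_profit = two_year_capex * target_return
print(f"Extra AI profit needed: ${required_annual_profit / 1e9:.0f}B per year")

trailing_profit = 350e9  # Big Four GAAP net profit, trailing four quarters
share = required_annual_profit / trailing_profit
print(f"As a share of trailing profits: {share:.0%}")
```

Running this reproduces the article's figures: roughly $105 billion of incremental annual AI profit, or about 30% of what the four companies earned in total over their past four quarters.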
[4]
Beating the AI bubble
You can't help but feel uneasy when looking at market concentration. Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia, and Tesla now make up more than a third of the S&P 500, more than twice the level seen before the dot-com bust. AI-related capital spending has outpaced the U.S. consumer as the main driver of gross domestic product growth. OpenAI alone plans trillions in data-center investments while exiting 2025 with about $20 billion in annualized revenue. Of course, there are physical limitations to how fast we can build. Data centers require enormous energy, land, and skilled labor -- more than trade schools produce today -- a concern raised in the Trump administration's U.S. AI Action Plan. On top of this is a web of circular financing among major players. Companies are using complex structures to fuel the investment wave, adding opacity and risk. Investors like Masayoshi Son and Michael Burry are heading for the exits. In a new Bank of America survey, 45% of investors cite an AI bubble as the top tail risk for the economy and markets. Many believe AI stocks are already in bubble territory. When a bubble bursts, it is like a balloon losing air. Prices fall, investors pull back, and companies that depended on constant capital inflows often fail. The slowdown can ripple across the industry. But a burst forces a reset, where work with real value continues and the rest falls away.
[5]
Inside the AI Bubble
On Wednesday evening, Nvidia, the chip firm at the center of the world, reported its quarterly earnings. It was by any measure a blowout for the world's largest company: profits were 65 percent higher than in the same quarter last year, sales came in even higher than analysts expected, and leadership is forecasting at least $500 billion in AI chip sales by the end of 2026. Permanently pumped CEO Jensen Huang bragged that the company was "sold out" before going oracular: "We've entered the virtuous cycle of AI. The AI ecosystem is scaling fast -- with more new foundation model makers, more AI startups, across more industries, and in more countries. AI is going everywhere, doing everything, all at once." Things couldn't be going much better for Nvidia, which is one of the few large companies making serious profits that are primarily and unambiguously attributable to AI. The response from investors, though, was strange. The next morning, the stock popped a few percent but remained below recent highs, and ended the day slightly down. For many analysts and industry watchers, this wasn't a story about the greatest quarter for the greatest company of all time. It was merely a relief. The "AI trade" was still alive, and the party could continue; more broadly, the anomalous sector propping up economic indicators would, for at least another quarter, and maybe even a bunch of quarters, continue to do so. It was, above all, an assurance and occasion to talk about it. You know. The bubble. In late 2025, AI bubble talk isn't just for outsiders, skeptics, and short-sellers. Increasingly, it's the frame through which the industry's most important figures, and biggest boosters, talk about their technology, their companies, and the industry around them. "When bubbles happen, smart people get overexcited about a kernel of truth," OpenAI's Sam Altman told a group of reporters in August. "Are we in a phase where investors as a whole are overexcited about AI? My opinion is yes." 
Mark Zuckerberg, while suggesting there were "compelling arguments" that AI could be an "outlier," drew parallels to bubbles past. "I do think there's definitely a possibility, at least empirically, based on past large infrastructure buildouts and how they led to bubbles, that something like that would happen here," he said on the ACCESS podcast in September. There's some transparent positioning here, of course -- both Altman and Zuckerberg were implying that their companies were unique and would be fine either way -- but inside-the-bubble bubble talk has since morphed into an odd strain of conventional wisdom, a premise from which high-level conversations about AI now proceed, or at least a possibility that has to be acknowledged. Google CEO Sundar Pichai invoked the dot-com crash. "I expect AI to be the same. So I think it's both rational and there are elements of irrationality through a moment like this," he said this month. In the event of a major correction, he said, "I think no company is going to be immune, including us." The CEO of Google DeepMind, Demis Hassabis, emphasized Google's particular strength but conceded on Hard Fork that there are "some parts of the AI industry that are probably in a bubble." Jeff Bezos has said that while AI is "real," and "is going to change every industry," it's also showing signs of an "industrial bubble." Against the backdrop of all this hedging and narrower speculation about markets, the remaining practitioners of wide-open AI CEO futurism -- that is, tech leaders still speaking the way most of them did as recently as last year -- suddenly sound like outliers. At the Saudi Investment Forum, onstage with Huang, Elon Musk confidently stated that AI, with humanoid robots, will "eliminate poverty" and "make everyone wealthy." In the future, he added on X, the "most likely outcome is that AI and robots make everyone wealthy. In fact, far wealthier than the richest person on Earth." 
For the last few years, the public has been left to interpret competitively extreme visions of the future floated by strangely cavalier tech executives, who agreed on little but the inevitability of total change: mass unemployment; luxurious post-scarcity; human obsolescence; hyper-accelerated scientific progress; and, perhaps, total annihilation. Now, markets are concerned with narrower questions, with more specific answers, and more immediate consequences: How many GPUs has Nvidia sold? How many can it make? (Or, rather, how many can Taiwan Semiconductor Manufacturing Company manufacture for it?) There are plenty of theories about how generative AI might diffuse into the economy and change the world, and as more people use it, and companies start to deploy it, a few of them are snapping into focus (buy a drink for any young programmers in your life). But after years of boosterish warnings about the extraordinary and esoteric risks posed by mysterious and profound technology -- we're creating software so powerful even we can't control it -- tech executives are instead trying to get out in front of a profound non-technological risk that may be manifesting much sooner: that if they lose even a little bit of momentum, they might end up tanking the American economy. If Huang's everything, everywhere, "all at once" line was a reference to the 2022 absurdist multiverse movie, it's a funny one: the film opens with its protagonist shuffling through a pile of papers, anxiously preparing for a financial audit (and features a villain who "got bored one day" and decided to collapse the entirety of creation into a bagel-shaped singularity). As the AI boom has sprawled into a larger and more complicated financial story, scrutiny of the businesses behind the models has become as intense as scrutiny of the models themselves. 
To raise money and finance data center deals, OpenAI, which is both the leading consumer AI company and one of the industry's most aggressive and, let's say, inventive dealmakers, has manifested some truly dizzying arrangements, many of which involve Nvidia, a circular deal innovator in its own right. Take CoreWeave, a crypto-mining company that pivoted to AI data centers in 2022. CoreWeave rents access to Nvidia chips to firms that need them for AI inference and training. OpenAI is a CoreWeave customer, but also a CoreWeave investor. Nvidia is a CoreWeave vendor -- it supplies the GPUs -- but also an investor and, somehow, a customer. CoreWeave also loses a lot of money, and its stock price has, after peaking in July, collapsed. Lately, the deals are getting more brazen and less convoluted. In September, Nvidia announced it would invest $100 billion in OpenAI, which OpenAI said it would use to build data centers full of Nvidia hardware. This month, alongside Microsoft -- OpenAI's biggest early investor and primary partner -- Nvidia announced the companies would invest up to $15 billion in OpenAI competitor Anthropic in exchange for a $30 billion commitment from the company to buy computing capacity from Microsoft, powered, naturally, by Nvidia hardware. Altman's moments of candor about a possible bubble have been scattered between more defensive messaging from the company, which may be losing as much as $12 billion per quarter. In a recent podcast interview, investor Brad Gerstner asked Altman, "How can a company with $13 billion in revenues make $1.4 trillion of spend commitments?" Altman shot back: "If you want to sell your shares, I'll find you a buyer. Enough." 
That insiders seem to agree that we could be in a massive bubble is, counterintuitively, not very useful: whether or not they mean it, and whether or not they're right, their incentives as leaders of mega-scale startups and public tech companies are such that raising, spending, and committing as much money as possible for as long as possible is probably the rational, self-interested choice either way. Anxious, skeptical, or merely satisfied investors looking for excuses to pull back or harvest gains don't have to look hard, and there's evidence some are; before its earnings report, Peter Thiel's investment firm unloaded its position in Nvidia, and SoftBank cashed out of the chipmaker at around the same time. Similarly, OpenAI's ability to send public companies' stocks soaring by announcing massive "commitments" seems to be fading -- Oracle's recent $300 billion valuation bump, based on some shockingly optimistic guidance it offered investors in September, has since been given back entirely. But focusing on the flagrant circularity of AI financing can feed the impression that the risks are contained within Silicon Valley. The bigger problem is the ways in which they're already not. If it exists, you might call it a load-bearing bubble. In the first half of 2025, "investment in information processing equipment and software" -- a sort of informal, private stimulus package -- accounted for 92 percent of GDP growth for the United States, while AI-related tech stocks account for nearly all recent growth in the S&P 500. Early funding for companies like OpenAI came from venture capitalists and incumbent tech giants, while Google and Meta pushed into AI with their own massive revenue and cash, but multi-hundred-billion-dollar commitments mean they're getting more creative, both in how they raise money and how they distribute risk. 
Companies like Meta are funding data centers with "special purpose vehicles," which may sound familiar if you were reading the financial news in 2008, and with massive corporate bond sales. As the investor Paul Kedrosky has argued, the AI boom has traits, at least, of every major financial bubble in modern history: a narrative-driven tech bubble, a credit bubble, a real estate bubble, and an infrastructure bubble. To tie it all together, you've got OpenAI's CFO floating, then frantically backtracking on, the idea of a government backstop for financing AI expansion, almost instantly elevating the prospect of an AI bailout into fodder for conservative and progressive lawmakers. Huang has two typical responses to all this. One speaks for itself: look at all those GPUs we're selling. The other is more direct. "There's been a lot of talk about an AI bubble. From our vantage point," he said after earnings, "we see something very different." In other words: No it's not. The "virtuous cycle" is just beginning, and the accelerating potential of the most versatile technology the world has ever seen will one day expose complaints about incremental model updates and hand-wringing about data center deals as short-sighted and insignificant. Huang is still able to speak with authority and tell a story that, for investors, still has juice. For everyone else, though, neither side of this wildly polarized, high-stakes bet sounds ideal. If this really is a bubble, and it deflates even a little, it could send the American economy into a serious slump, with consequences for almost everyone, getting rid of plenty of jobs the old-fashioned way. If it doesn't -- and Huang's sanitized visions of mass automation rapidly start to spread across the economy, justifying all that CapEx, and all those strange deals, and then some -- well, aren't we getting laid off anyway?
[6]
Why Nvidia's Growth Is Now Tied To Debt-Loaded AI Customers - CoreWeave (NASDAQ:CRWV), NVIDIA (NASDAQ:NVDA)
Nvidia (NASDAQ:NVDA) just turned in another strong quarter. Revenue reached $57 billion, up 62% from a year earlier, and the stock rallied as investors leaned back into the AI trade. What those numbers don't show is how much that growth now depends on a small set of AI customers that are piling on debt and vendor financing to afford Nvidia's chips. For shareholders, that raises a different question than earlier in the AI boom: not whether demand exists, but whether highly leveraged customers like CoreWeave (NASDAQ:CRWV), OpenAI and xAI can keep refinancing the obligations that Nvidia is counting on. The stock jumped 2.85% to close at $186, then surged another 5.08% in after-hours trading.

The Problem: Nvidia Is Becoming a Bank (And It Knows It)

Think about a car purchase. Most buyers do not pay the full amount in cash. They finance it and spread the cost across several years. Nvidia is now in a similar position with GPUs. The difference is that we are no longer talking about tens of thousands of dollars, but about multibillion-dollar data center builds. Many of Nvidia's biggest customers cannot pay for those chips upfront (a single top-end GPU can cost $10,000 per unit, and a full rack or data center build can cost hundreds of millions or even billions). So, to keep orders flowing, Nvidia is helping them finance the hardware instead of demanding cash on delivery. The company has put about $110 billion into direct investments and more than $15 billion into GPU-backed SPV debt. That combined exposure is roughly 67 percent of Nvidia's annual revenue.

There is a clear historical parallel. Around the dot-com bubble, Lucent Technologies leaned heavily on vendor financing. At its peak, about 24 percent of Lucent's revenue was tied to loans and guarantees extended to customers. When those customers could not pay, Lucent took large write-downs and the stock collapsed from $80 to $2. 
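The Lucent comparison can be checked directly from the figures just quoted. A quick sketch, using only the article's own numbers (the combined-exposure sum is an illustrative aggregation, not a disclosed line item):

```python
# Compare Nvidia's financing exposure to Lucent's peak vendor-financing share.
direct_investments = 110e9   # Nvidia's direct investments in customers
spv_debt = 15e9              # GPU-backed SPV debt (lower bound per the article)
exposure = direct_investments + spv_debt   # ~$125B combined
print(f"Combined exposure: ${exposure / 1e9:.0f}B")

nvidia_share = 0.67          # exposure as a share of annual revenue (article's figure)
lucent_share = 0.24          # Lucent's peak share of revenue tied to customer loans
print(f"Nvidia vs. Lucent, as a share of revenue: {nvidia_share / lucent_share:.1f}x")
```

Dividing the two revenue shares (67% vs. 24%) gives roughly 2.8x, matching the multiple cited below.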
Nvidia's financing exposure, as a share of revenue, is now about 2.8 times larger than what helped sink Lucent.

How The SPV Structure Works

Nvidia does not usually lend directly to CoreWeave, xAI, OpenAI, or similar customers. Instead, it works through special purpose vehicles (SPVs). The structure looks roughly like this: An AI infrastructure company needs around $12.5 billion of GPUs. A separate SPV is created, which raises equity and takes on a large amount of debt. The SPV uses that capital to buy Nvidia hardware and then leases the GPUs to the AI company over about five years. Under ASC 842, these leases show up on the SPV balance sheet and in Nvidia's disclosures. Nvidia recognizes most or all of the lease contract value as revenue when the transaction closes, not over the full term of the lease. The company may not receive the cash for several years, but the revenue is booked upfront. The AI company records lease payments as operating expense, and the SPV debt does not sit directly on the AI customer's balance sheet.

The benefit for the startup is cleaner reported leverage. The benefit for Nvidia is larger reported revenue today. The tradeoff is that more credit risk and duration risk now sits with Nvidia and the SPVs that finance its hardware. There is even another layer. Nvidia's cost of capital ranges from roughly 9 to 18 percent depending on the instrument. To make these leases attractive, the effective rate charged to customers is often close to zero or meaningfully subsidized. That spread, on a financing book of about $110 billion, can produce an 8 to 18 percent annual margin drag that does not look like a traditional cost-of-goods line item.

The Maturity Wall Is Close

This is not a distant risk. Several large exposures are approaching key refinancing and funding milestones over the next few quarters.

Event 1: CoreWeave

CoreWeave is the largest single beneficiary of Nvidia-linked financing. 
It has already drawn roughly $8 billion of its $12.9 billion in committed facilities. The company faces more than $1.5 billion of debt payments by October 2025. CoreWeave's Q3 2025 S-1 filing shows that interest expense reached about $311 million in the quarter, roughly triple the prior year. The credit agreements include a clause that any new debt raised must first reduce existing tranches rather than fund growth. That makes it harder to both refinance and continue to scale capacity. Covenants also require that contracted future revenues cover debt repayments. If customers delay deployments or cancel contracts, CoreWeave can quickly fall out of compliance with those covenants. If CoreWeave cannot secure an extension or new facilities by early 2026, the company will be pushed into restructuring talks, which is a big issue for the next several quarters. If CoreWeave fails, Nvidia's 7 percent equity stake is effectively written down to zero. On top of that, Nvidia could face pressure to support the value of GPU collateral across related SPVs, potentially by buying back hundreds of thousands of GPUs at weaker prices.

Event 2: OpenAI

OpenAI is burning cash at an estimated 57 percent of revenue: for every dollar of revenue, the company is spending around $1.57. Management expects 2025 losses in the range of $8 to $9 billion on about $13 billion of revenue. According to recent disclosures, OpenAI does not expect to turn cash flow positive until 2029 or 2030. That path assumes fundraising conditions remain supportive. The company projects cumulative cash burn of roughly $115 billion through 2029, which implies raising at least $120 billion from investors to bridge the gap. Nvidia has agreed to invest up to $100 billion in OpenAI in ten tranches of $10 billion each. 
OpenAI receives each new tranche only after it hits certain deployment milestones, and the price of each tranche depends on what the company is worth at that time. If OpenAI's next funding round comes in flat or lower than the March 2025 valuation, the economic value of later tranches falls. Down rounds dilute earlier investors and signal that the market is questioning the current growth and profitability assumptions. In that scenario, Nvidia has less incentive to deploy the full $100 billion commitment on the original terms, and may face write-downs on previously funded tranches.

Event 3: xAI

xAI is raising about $15 billion at a valuation near $230 billion, up from roughly $113 billion earlier in 2025. That is a doubling of valuation in under a year without a step-up in revenue. The company is spending heavily on the Memphis Project Colossus data center while still in an early stage of monetization. If AI funding cools and valuations reset by 30 to 40 percent, the tranches Nvidia has tied to those valuations take an immediate hit. A $20 billion commitment sized off a $230 billion valuation effectively shrinks if the next round prices the company at $160 billion. That can translate into several billion dollars of mark-to-market losses for Nvidia.

What Happens If Customers Cannot Pay

The risk is a real chain reaction across the SPVs and the customers tied to them. A simplified path looks like this: CoreWeave fails to refinance and defaults. Lenders seize the GPUs that sit as collateral in SPVs. Those GPUs are sold into secondary markets at discounts of 30 to 50 percent. SPV collateral values fall, and other AI infrastructure SPVs breach covenants. Those SPVs default in sequence. Nvidia reverses previously recognized revenue on undelivered or non-performing contracts and records write-downs on investments. When Lucent's vendor financing book deteriorated in 2001, the company took around $3.5 billion of loan loss provisions through 2002. 
The stock crashed from $80 to $2 in less than 18 months. The company merged with Alcatel in 2006, five years later, and never truly recovered. Nvidia's vendor financing exposure is about seven times larger in absolute dollars. If even a modest share of the $110 billion portfolio turns bad, Nvidia could easily face $10 to $15 billion of lost revenue and write-downs. That would likely compress the price-to-earnings multiple from the low 50s to the mid 30s or low 40s, which implies downside of roughly 25 to 35 percent from current levels.

What To Listen For From Management

Individual investors may not review detailed credit agreements, but they can still identify potential stress by paying close attention to the language Nvidia uses on earnings calls. Current language (2025 calls): "We have excellent visibility into $500B+ of Blackwell-Rubin revenue through 2026." Warning language (what to listen for in Q1 2026 calls): "Many customers are exploring financing options," or "Our lease portfolio performed in line with expectations," or vague language about "contract performance" without specifics on deployment velocity. Any shift toward financing and lease language signals that management is walking expectations down in response to customer stress. Once that language starts to appear, analysts and rating agencies will begin to model covenant risks, and the equity market will move ahead of the actual write-downs.

The Margin Trap Even Without Defaults

Besides the default risk and the timing of the maturity wall, there is a second problem that's even more dangerous: even if customers don't default, Nvidia is structurally losing money on a large slice of these deals. 
A simplified example: Nvidia finances a $1 billion GPU deployment at 0-2% (to make the deal attractive), while Nvidia's own cost of capital is roughly 9-18%, depending on the instrument. That's a 7-16% annual margin loss on every financed deal. On a $110 billion financing book, that implies $7.7-17.6 billion of annual margin drag, while revenue is recognized as if it were an all-cash sale. This doesn't show up cleanly in gross margin. The loss is buried in "investments," equity stakes, and SPVs, so Nvidia looks like it is selling extremely high-margin hardware; underneath, it is giving much of that margin back through subsidized financing. This gets worse if interest rates stay elevated: every 1 percentage point increase in effective financing costs widens the annual margin gap by roughly $1.1 billion.

Nvidia isn't financing these deals because it wants to; it's financing them because the alternative is worse. Without cheap vendor financing, CoreWeave, xAI, and OpenAI would buy fewer GPUs or delay deployments, shift part of their spend to cheaper AMD hardware or custom ASICs, and lean more on used or secondary-market equipment. That would mean tens of billions of dollars in lost near-term revenue for Nvidia. So instead, Nvidia chooses to finance the purchases on very generous terms, book massive upfront hardware revenue that pleases Wall Street, and hope to earn back the lost economics over time via multi-year lock-in to Nvidia hardware, equity appreciation in customers like CoreWeave, OpenAI, and xAI, and high-margin software, networking, and platform fees layered on top.

Nvidia is not running a clean, high-margin equipment business here. It is pulling forward revenue, absorbing hidden financing losses, and betting that future equity upside and software lock-in will bail out today's concessions. If those bets fail, the margin trap becomes visible fast, and the stock will have to reprice to a much lower, more "normal" hardware multiple. 
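The margin-trap arithmetic above can be laid out explicitly. The sketch below uses the article's rate ranges and book size; netting the 2% customer rate against both ends of the cost-of-capital range to get the 7-16% spread is an assumption about how those figures were derived.

```python
# Annual margin drag from subsidized vendor financing, per the figures above.
book = 110e9                    # financing and investment book
customer_rate = 0.02            # top of the 0-2% rate charged to customers
cost_of_capital = (0.09, 0.18)  # Nvidia's own cost of capital range

drag_low = book * (cost_of_capital[0] - customer_rate)   # 7% net spread
drag_high = book * (cost_of_capital[1] - customer_rate)  # 16% net spread
print(f"Annual margin drag: ${drag_low / 1e9:.1f}B to ${drag_high / 1e9:.1f}B")

# Rate sensitivity: each extra point of effective financing cost on the book.
print(f"Per +1pt of financing cost: ${book * 0.01 / 1e9:.1f}B")
```

This reproduces the $7.7-17.6 billion range and the roughly $1.1 billion sensitivity per percentage point cited in the text.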
How to Trade This: Three Specific Watch Points

Watch #1: CoreWeave's Debt Refinancing Announcement (Due: December 2025)

CoreWeave is scheduled to announce Q3 2025 earnings results in mid-November 2025. Any commentary hinting at refinancing challenges, delayed deployments, or "covenant discussions" is a red flag.

Watch #2: OpenAI Funding Round Valuation (Due: Q1 2026)

OpenAI's next Series E funding round will telegraph whether the market believes in the company's path to profitability. If OpenAI ends up taking capital at a valuation flat with or below March 2025 levels, it signals the market thinks AI startup burn rates are unsustainable.

Watch #3: Major AI Startup Layoffs or Pivot Announcements

When Anthropic, xAI, or OpenAI announce major cost-cutting or a shift away from "at all costs" scaling, that's a signal that AI infrastructure demand is cooling and customers are hitting monetization limits. Layoffs = cash burn concerns = covenant violations incoming.

If You Own NVDA

Avoid adding aggressively to positions until there is more clarity on CoreWeave's refinancing path. Consider stop losses or structured hedges that limit downside if a major AI customer or SPV shows signs of distress. Treat concentration risk carefully: a single vendor financing unwind could erase several quarters of earnings gains.

If You Are Hedging Or Bearish

One approach is to use 6 to 12 month put options slightly out of the money. For example, puts with strikes around 10 percent below spot can act as insurance against a sharp repricing if CoreWeave or another large customer trips covenants. A modest premium outlay, framed as a percentage of the underlying position, can create asymmetry if the stock gaps lower on financing news.

Final Thoughts

Nvidia has done extremely well on product and demand. The Blackwell ramp is real, and AI workloads are still growing quickly. At the same time, the company has leaned heavily on vendor financing and structured commitments to maintain that growth. 
The exposure is large relative to revenue and heavily concentrated in a small group of customers with aggressive burn profiles and significant leverage. Lucent's experience two decades ago shows how quickly vendor financing can flip from a growth tool to a balance sheet problem. Nvidia's $110 billion vendor financing and investment portfolio is roughly 2.8 times larger relative to revenue than Lucent's was at its peak. Watch CoreWeave. Watch OpenAI. Watch xAI. Those three companies are the time bombs sitting on Nvidia's balance sheet. The moment one defaults, the others follow. And NVDA follows after that.

Disclaimer: This article is for informational purposes only and should not be construed as financial advice. Always consult with a qualified financial advisor before making investment decisions. Past performance of margin debt levels and market crashes does not guarantee future results.

Benzinga Disclaimer: This article is from an unpaid external contributor. It does not represent Benzinga's reporting and has not been edited for content or accuracy.
Growing concerns about an AI infrastructure bubble emerge as CoreWeave's debt-heavy business model and circular financing schemes highlight potential risks in the sector. Market concentration in AI stocks and unsustainable spending patterns raise questions about the industry's long-term viability.
CoreWeave, a data center company that pivoted from cryptocurrency mining to AI infrastructure in 2022, has emerged as a potential flashpoint for broader concerns about the AI industry's financial stability [1]. The company, which went public in March at $40 per share and peaked at $187 before settling around $75, exemplifies what critics describe as unsustainable financial engineering masquerading as innovation.

The company has pioneered an unusual approach to financing its operations by using GPUs as collateral for massive loans. CoreWeave secured $2.3 billion in loans at 15% interest rates, followed by a $7.5 billion loan at 10% interest, and additional financing totaling $400 million at 9% rates [1]. This debt-heavy structure has raised alarm bells among analysts, with Kerrisdale Capital describing CoreWeave as "an undifferentiated, heavily levered GPU rental scheme" and assigning a fair value of just $10 per share [1].

Despite generating $1.4 billion in revenue during the third quarter—double the previous year's figure—CoreWeave faces significant challenges in achieving profitability outside of the most optimistic AI adoption scenarios [1]. The company's business model essentially involves acting as a middleman, taking on the risks and costs of building data centers that larger tech companies can rent while they construct their own competing facilities.

The AI boom has created unprecedented market concentration, with seven companies—Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia, and Tesla—now representing more than one-third of the S&P 500's value [2]. This concentration level is more than double what was seen before the dot-com crash, raising concerns about the broader market's stability and diversification [4].

The disparity between AI-connected companies and the broader economy has become stark. An analysis excluding the "Magnificent Seven" reveals a much weaker economic picture, with smaller and lower-tech companies reporting lackluster sales and declining investment [2]. The Russell 2000 index, representing smaller companies, lost 4.5% in a recent one-month period compared to just 2% for the S&P 500 [2].

Capital expenditures remain flat for companies not connected to AI, according to analysis from JPMorgan and Moody's, indicating that low-tech businesses aren't experiencing growth [2]. This bifurcation has led economists to describe the current environment as having "winners and losers" rather than broad-based economic strength.

Nvidia, despite its remarkable financial performance with $86.6 billion in profits over the past four quarters, faces growing scrutiny over its strategy of financing its own customers to maintain demand [3]. The company has assembled what analysts describe as a "complex superstructure encompassing investments and financing" designed to boost and perpetuate demand for its products.

Jay Goldberg of Seaport Global Securities, who issued the only "sell" rating on Nvidia among 47 analysts, argues that "Nvidia is buying demand here" [3]. Lisa Shalett of Morgan Stanley Wealth Management warns that "Nvidia is in a position to prop up customers so that it's able to grow," creating increasingly complex arrangements as the funded customers become weaker and take on more borrowing [3].

The company's customer concentration adds another layer of risk, with 52% of second-quarter sales coming from just three undisclosed customers that analysts identify as Microsoft, Amazon, and Alphabet [3]. This dependence on a few large buyers, combined with reduced Chinese market access due to trade restrictions, heightens Nvidia's reliance on the domestic AI infrastructure buildout.

Remarkably, the discussion of bubble conditions has moved from external critics to industry leaders themselves. OpenAI's Sam Altman admitted that "investors as a whole are overexcited about AI," while Meta's Mark Zuckerberg drew parallels to past infrastructure bubbles [5]. Google CEO Sundar Pichai invoked the dot-com crash, stating he expects AI to follow a similar pattern with "elements of irrationality" [5].

This acknowledgment from within the industry represents a significant shift from the previously unanimous optimism. Even Amazon's Jeff Bezos, while maintaining that AI is "real" and transformative, has noted signs of an "industrial bubble" [5]. A Bank of America survey found that 45% of investors now cite an AI bubble as the top tail risk for the economy and markets [4].

Summarized by Navi