Curated by THEOUTPOST
On Sat, 23 Nov, 12:01 AM UTC
3 Sources
[1]
36% of Nvidia's $35 Billion in Q3 Revenue Came From Just 3 Mystery Customers | The Motley Fool
Nvidia's growth potential might be in the hands of just a few of its top customers. Nvidia (NVDA) is the world's leading supplier of graphics processing units (GPUs) for data centers. Demand for the chips, which are used to develop artificial intelligence (AI), is significantly outpacing supply, and that trend is likely to continue for the foreseeable future. That's great news because it gives Nvidia incredible pricing power, which translates into massive profits and a higher stock price for investors. It's a key reason the company has added more than $3 trillion to its market value over the last two years alone.

However, there is a risk brewing beneath the surface of this cash-generating machine. During its fiscal 2025 third quarter (ended Oct. 27), more than one-third of Nvidia's total revenue came from just three customers. If those customers pull back on their AI spending, the company's powerful sales growth could grind to a halt.

Nvidia recently launched its new Blackwell GPU architecture, which offers a significant leap in performance over the previous Hopper architecture. Blackwell-based systems like the GB200 NVL72 can perform AI inference at 30 times the speed of the equivalent H100 system, which empowers developers to deploy bigger, more advanced large language models (LLMs). However, one GB200 GPU within the NVL72 system costs over $80,000, and the most powerful AI models could require over 100,000 of them.

Very few companies can afford to spend billions of dollars building data center infrastructure of that size. As a result, a handful of tech giants including Microsoft, Amazon, and Oracle are the biggest buyers of Nvidia's GPUs. They use them to develop their own AI models, but they also rent the computing power to developers at scale. It has become a very profitable business for them, and it benefits the developers who can access computing capacity without fronting billions to build it themselves. Naturally, this is also a big win for Nvidia.
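The scale of that spending can be sketched with rough arithmetic. The per-GPU price and GPU count below are the article's own figures; everything else is illustrative:

```python
# Rough cost sketch for a frontier-scale Blackwell cluster, using the
# article's figures: over $80,000 per GB200 GPU and over 100,000 GPUs
# for the most powerful AI models (both treated here as lower bounds).
gb200_price = 80_000      # USD per GB200 GPU (article's estimate)
gpus_needed = 100_000     # GPUs for a frontier-scale model

cluster_cost = gb200_price * gpus_needed
print(f"GPU spend alone: ${cluster_cost / 1e9:.0f} billion")  # $8 billion
```

Eight billion dollars before counting networking, power, and real estate, which is why only a handful of companies can play at this scale.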
With all of that said, affordability is improving with each new generation of GPU. A single H100 costs around $40,000, so even though a GB200 is twice as expensive, the 30-fold increase in inference performance makes it a huge net win for the end user. Still, it could be years before it's economically viable for the average company to build its own infrastructure.

Nvidia generated a record $35.1 billion in total revenue during its fiscal 2025 third quarter, a 94% increase from the year-ago period. The majority of that revenue ($30.8 billion) was attributable to the data center segment, which accounts for its AI GPU sales. According to the company's 10-Q filing for the third quarter, three unidentified customers accounted for 36% of its $35.1 billion in total revenue, with Customers A, B, and C each contributing 12% (data source: Nvidia).

In the fiscal 2025 second quarter three months earlier, four customers provided 46% of Nvidia's total revenue, so it appears sales were less concentrated in the third quarter. However, the company only singles out customers that account for 10% or more of its revenue, so there were likely other material buyers of its GPUs that didn't meet the reporting threshold in the recent quarter. For instance, we know there was a fourth customer that represented 12% of total revenue over the last nine months but fell below the 10% threshold for the third quarter specifically. Therefore, there's a chance that third-quarter revenue concentration only changed by a couple of percentage points compared to the second quarter.

Here's why that's a concern: Customers B and C have each spent $10 billion with Nvidia over the last three quarters alone, and only a small number of companies in the entire world can afford to sustain that level of spending on AI infrastructure and chips. If even one of them decides to cut back, Nvidia will suffer a loss in revenue that will be very difficult to replace, potentially denting the company's incredible run of growth.
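The 10% reporting threshold also lets us bound what that unnamed fourth customer spent. A quick sketch using the quarterly revenue figures cited across these articles (revenues are rounded, so the results are approximate):

```python
# Approximate quarterly revenues (USD billions) from the articles.
q1, q2, q3 = 26.0, 30.0, 35.1
nine_month = q1 + q2 + q3                 # ~91.1B over three quarters

# The fourth customer: 12% of nine-month revenue, but under 10% in Q3.
d_total = 0.12 * nine_month               # ~$10.9B over nine months
d_q3_max = 0.10 * q3                      # at most ~$3.5B in Q3
d_first_half_min = d_total - d_q3_max     # so at least ~$7.4B in Q1+Q2

print(f"nine-month spend: ~${d_total:.1f}B, "
      f"of which at least ~${d_first_half_min:.1f}B came in the first two quarters")
```

In other words, even a customer that dropped below the disclosure threshold in the latest quarter still appears to have spent billions earlier in the year, which is why the apparent drop in concentration may be smaller than it looks.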
Microsoft is a consistent buyer of Nvidia's GPUs, and it's reportedly the undisputed No. 1 buyer of the new Blackwell GB200, which just started shipping. Therefore, it's almost certainly one of the chipmaker's top three customers. Some combination of Amazon, Alphabet, Oracle, Meta Platforms, and OpenAI likely fills the other spots. Public filings show those companies allocating enormous sums to capital expenditures (capex), most of which relate to AI infrastructure. Morgan Stanley estimates that Microsoft, Amazon, Alphabet, and Meta alone will spend a combined $300 billion on AI infrastructure in 2025, so Nvidia's sales pipeline looks rock solid for at least the next year.

However, the longer-term picture is a little less clear, because it's unlikely those companies can maintain such a high rate of spending in perpetuity. In fact, all four of those companies have designed their own chips in an effort to reduce costs. Nvidia has a lengthy head start, so it could take years before those tech giants can build a comparable product, but in-house silicon represents a key long-term risk given how large a chunk of Nvidia's revenue those companies represent.

None of this is an immediate threat to the chipmaker, but investors should keep an eye on how its top customers plan to allocate their capex beyond 2025. That will be the earliest indication of whether this chip powerhouse can maintain its incredible run of growth.
[2]
Three mystery whales have each spent $10 billion-plus on Nvidia's AI chips so far this year
AI microchip supplier Nvidia, the world's most valuable company by market cap, remains heavily dependent on a few anonymous customers that collectively contribute tens of billions of dollars in revenue.

The AI chip darling once again warned investors in its quarterly 10-Q filing to the SEC that it has key accounts so crucial that their orders each crossed the threshold of 10% of Nvidia's global consolidated revenue. An elite trio of particularly deep-pocketed customers each purchased between $10 billion and $11 billion worth of goods and services across the first nine months of the fiscal year, which ended in late October.

Fortunately for Nvidia investors, this won't change any time soon. Mandeep Singh, global head of technology research at Bloomberg Intelligence, says he believes founder and CEO Jensen Huang's prediction that spending will not stop. "The data center training market could hit $1 trillion without any real pullback," he said. By that point, Nvidia's market share will almost certainly have dropped markedly from its current 90%, but the company could still be generating hundreds of billions of dollars in revenue annually.

Outside of defense contractors living off of the Pentagon, it's highly unusual for a company to have such a concentration of risk among a handful of customers -- let alone one poised to become the first worth the astronomical sum of $4 trillion. Strictly looking at Nvidia's accounts on a three-month basis, there were four anonymous whales that together comprised nearly every second dollar of sales in the second fiscal quarter. This time at least one of them has dropped out, since only three still meet that criterion.

Singh told Fortune the anonymous whales likely include Microsoft, Meta, and possibly Super Micro, but Nvidia declined to comment on the speculation. Nvidia refers to them only as Customers A, B, and C, and all told they purchased a collective $12.6 billion in goods and services.
This was more than a third of Nvidia's overall $35.1 billion recorded for the fiscal third quarter through late October. Their share was also divided up equally, with each accounting for 12%, suggesting they were likely receiving the maximum amount of chips allocated to them rather than as many as they might have ideally wanted. This would fit with comments from founder and CEO Jensen Huang that his company is supply constrained. Nvidia cannot simply pump out more chips, since it has outsourced wholesale fabrication of its industry-leading AI microchips to Taiwan's TSMC and has no production facilities of its own.

Importantly, Nvidia's designation of major anonymous customers as "Customer A," "Customer B," and so on is not fixed from one fiscal period to the next. They can and do change places, with Nvidia keeping their identities a trade secret for competitive reasons -- no doubt these customers would not like their investors, employees, critics, activists, and rivals to see exactly how much money they spend on Nvidia chips. For example, one party designated "Customer A" bought around $4.2 billion in goods and services over the past quarterly fiscal period, yet it appears to have accounted for less in earlier periods, since it does not exceed the 10% mark across the first nine months in total. Meanwhile, "Customer D" appears to have done the exact opposite, reducing purchases of Nvidia chips in the past fiscal quarter yet nevertheless representing 12% of revenue year-to-date.

Since their names are secret, it's difficult to say whether they are middlemen like the troubled Super Micro Computer, which supplies data center hardware, or end users like Elon Musk's xAI. The latter, for example, came out of nowhere to build up its new Memphis compute cluster in just three months' time. Ultimately, however, there are only a handful of companies with the capital to compete in the AI race, as training large language models can be exorbitantly costly.
Typically these are the cloud computing hyperscalers such as Microsoft. Oracle, for example, recently announced plans to build a zettascale data center with over 131,000 of Nvidia's state-of-the-art Blackwell AI training chips, which would be more powerful than any individual site existing today. It's estimated the electricity needed to run such a massive compute cluster would be equivalent to the output capacity of nearly two dozen nuclear power plants.

Bloomberg Intelligence analyst Singh sees only a few longer-term risks for Nvidia. For one, some hyperscalers will likely reduce orders eventually, diluting its market share. One likely candidate is Alphabet, which has its own training chips called TPUs. Secondly, Nvidia's dominance in training is not matched in inference, which runs generative AI models after they have already been trained. Here the technical requirements are not nearly as demanding, meaning there is much more competition, not just from rivals like AMD but also from companies with their own custom silicon like Tesla. Eventually, inference will be a much more meaningful business as more and more businesses utilize AI. "There are a lot of companies trying to focus on that inferencing opportunity, because you don't need the highest-end GPU accelerator chip for that," Singh said. Asked if this longer-term shift to inferencing was a bigger risk than eventually losing share of the training chip market, he replied: "Absolutely."
[3]
3 Nvidia customers have each spent $10 billion on chips this year
Earlier this week, the chipmaker reported $35.1 billion in revenue for its fiscal third quarter, surpassing Wall Street's expectations. According to the company's Form 10-Q, 36% of that revenue -- or $12.6 billion -- came from only three unnamed customers. For the three months ended in October, Customers A, B, and C each accounted for 12% of Nvidia's quarterly revenue, the filing said.

The filing also shows that, in the first nine months of fiscal year 2025, sales to Customers B and C each made up 11% of revenue, while a Customer D accounted for 12% of revenue. That means Customers B and C have each spent around $10 billion on Nvidia's tech so far this year, while Customer D has spent almost $11 billion. Nvidia reported revenues of $26 billion for the fiscal first quarter and $30 billion for the fiscal second quarter. The sales are primarily attributable to Nvidia's Compute & Networking segment, according to the filing.

In September, Nvidia's 10-Q regulatory filing showed four anonymous customers made up nearly half of its $30 billion in revenue for the second fiscal quarter. Sales from those four customers represented about 46% of total quarterly revenue, or $13.8 billion. "We have experienced periods where we receive a significant amount of our revenue from a limited number of customers, and this trend may continue," Nvidia said in both 10-Q filings.

Although Nvidia will not disclose the customers, its top buyers are likely to include Google parent Alphabet (GOOGL), Meta (META), Microsoft (MSFT), and Tesla (TSLA) -- all major players in the AI boom. Nvidia's record fiscal third-quarter revenue was up 94% from a year ago. The company set its fiscal fourth-quarter revenue guidance at $37.5 billion, plus or minus 2%, slightly above expectations.
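The per-customer dollar figures follow directly from the percentages and quarterly revenues quoted above; a quick check (figures rounded as in the article, so results are approximate):

```python
# Quarterly revenues (USD billions) quoted in the article.
revenues = [26.0, 30.0, 35.1]      # fiscal Q1, Q2, Q3
nine_month = sum(revenues)         # ~91.1B year-to-date

# Customers B and C: 11% of nine-month revenue each; Customer D: 12%.
spend_b_or_c = 0.11 * nine_month   # ~$10.0B -> "around $10 billion"
spend_d      = 0.12 * nine_month   # ~$10.9B -> "almost $11 billion"

print(f"B or C: ~${spend_b_or_c:.1f}B each, D: ~${spend_d:.1f}B")
```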
Nvidia's Q3 revenue soars to $35.1 billion, with 36% coming from just three unidentified customers. This concentration highlights both Nvidia's market dominance and potential risks in the AI chip industry.
Nvidia, the world's leading supplier of graphics processing units (GPUs) for data centers, reported a staggering $35.1 billion in revenue for its fiscal 2025 third quarter, marking a 94% increase from the previous year [1]. This unprecedented growth is primarily driven by surging demand for AI chips, with the data center segment accounting for $30.8 billion of the total revenue [1].

In a surprising revelation, Nvidia's 10-Q filing disclosed that 36% of its Q3 revenue, approximately $12.6 billion, came from just three unidentified customers [1][2]. Each of these customers, referred to as Customers A, B, and C, contributed 12% of the quarterly revenue [3]. Over the first nine months of the fiscal year, two of these customers (B and C) have each spent around $10 billion on Nvidia's technology [3].

While Nvidia keeps the identity of its top customers confidential, industry experts speculate that they likely include tech giants such as Microsoft, Meta, Amazon, Alphabet, and possibly Super Micro [2]. These companies are investing heavily in AI infrastructure, with Morgan Stanley estimating that Microsoft, Amazon, Alphabet, and Meta alone will spend a combined $300 billion on AI infrastructure in 2025 [1].

Nvidia recently launched its new Blackwell GPU architecture, offering a significant performance leap over the previous Hopper architecture. The GB200 NVL72 system, based on Blackwell, can perform AI inference 30 times faster than the equivalent H100 system [1]. However, this cutting-edge technology comes at a premium, with a single GB200 GPU costing over $80,000 [1].

Nvidia currently holds a dominant position in the AI chip market, with an estimated 90% share in the data center training segment [2]. Mandeep Singh, global head of technology research at Bloomberg Intelligence, predicts that the data center training market could reach $1 trillion without any significant pullback [2].
Despite Nvidia's impressive growth, the concentration of revenue from a few key customers presents potential risks:
Dependency on large tech companies: Any reduction in AI spending by these major customers could significantly impact Nvidia's revenue [1].
Supply constraints: Nvidia outsources chip fabrication to Taiwan's TSMC, limiting its ability to rapidly increase production [2].
Emerging competition: Some hyperscalers, like Alphabet, are developing their own training chips, which could dilute Nvidia's market share in the long term [2].
Shift to inference: As the AI market matures, there may be a greater focus on inference chips, where Nvidia faces more competition from companies like AMD and custom silicon providers [2].

The demand for AI chips is driving massive investments in data center infrastructure. Oracle, for example, has announced plans to build a zettascale data center with over 131,000 Nvidia Blackwell AI training chips [2]. This trend underscores the scale of AI adoption and the critical role Nvidia plays in powering the AI revolution.
© 2025 TheOutpost.AI All rights reserved