6 Sources
[1]
The A.I. Spending Frenzy Is Propping Up the Real Economy, Too
The trillions of dollars that tech companies are pouring into new data centers are starting to show up in economic growth. For now, at least.

It's no secret by now, as investors await an earnings report on Wednesday by the chip behemoth Nvidia, that optimism around the windfall that artificial intelligence may generate is pumping up the stock market. But in recent months, it has also become clear that A.I. spending is lifting the real economy, too.

It's not because of how companies are using the technology, at least not yet. Rather, the sheer amount of investment -- in data centers, semiconductor factories and power supply -- needed to build the computing power that A.I. demands is creating enough business activity to brighten readings on the entire domestic economy.

Companies will spend $375 billion globally in 2025 on A.I. infrastructure, the investment bank UBS estimates. That is projected to rise to $500 billion next year. Investment in software and computer equipment, not counting the data center buildings, accounted for a quarter of all economic growth this past year, data from the Commerce Department shows. (Even that probably doesn't reflect the whole picture. Government data collectors have long had trouble capturing the economic value of semiconductors and computer equipment that large tech companies like Meta and Alphabet install for their own use, rather than farming out to contractors, so the total impact is likely to be higher.)

The big tech companies are the largest financiers of the frenzy, but private equity firms have been pouring in capital, too. Brookfield Asset Management, which manages a vast real estate portfolio, estimates that A.I. infrastructure will sop up $7 trillion over the next 10 years.

The torrent of cash comes as the effects from Biden-era infrastructure subsidies fade, erratic tariffs freeze corporate decision making and high borrowing costs deter less lucrative real estate projects such as housing and warehouses.
In 2025, spending on data center construction -- not including the cost of all the technology they house -- will exceed investment in traditional office buildings, according to the Dodge Construction Network.

"The expectations of very high returns in this industry are trumping the high interest rates that we are facing today," said Eugenio Alemán, chief economist with the financial services company Raymond James. Companies are promising even more spending, but their ability to deliver, he noted, depends on whether their expectations are fulfilled. Most A.I. tools are not currently profitable, and they will have to generate huge cash flows over time for the tech companies to recoup their investments.

"There is always a risk that very little of what they say is going to pan out," Dr. Alemán said. "So whenever they figure out that it is not what they thought, there is going to be a large correction."

For now, everyone wants a piece of the spending. To understand the excitement around booming A.I. use, it's helpful to take a spin through corporate quarterly earnings calls. Publicly traded construction firms, electricity providers and electronics manufacturers are eagerly telling investors they can get a piece of the action:

* Duos Technologies, which provides analytics and imaging for railroads and other infrastructure, has recently expanded into building small data centers. "Our business is commercially and financially in a great position to take advantage of the superhot demand coming from the data center computing gold rush," said Charles Ferry, the company's chief executive.

* "These data center managers and big A.I. providers need energy and energy storage in an insatiable way," said Dennis Calvert, the chief executive of BioLargo, an environmental services company that sells a large-scale battery system.
* "With data center growth and climate mandates accelerating demand for clean, reliable baseload power, the opportunity for advanced nuclear has never been stronger," said James Walker, the chief executive of NANO Nuclear Energy, which makes small reactors.

Data centers are also attractive to traditional construction companies, which see the opportunity to shift from their typical real estate development projects into a new asset class with ample capital behind it. Skanska, a large contracting firm, forecasts that data center construction will average 13.2 percent annual growth through 2029, a far speedier rate than any other sector it tracks.

The American Cement Association, an industry group, estimated that the sector would require a million metric tons of cement over the next few years. One of North America's largest building materials suppliers, Amrize, developed an "A.I. optimized" concrete mix with Meta that has lower carbon emissions. Amrize said data center construction was a "bright spot" in its otherwise soft second quarter.

The boom has also been good for electricians, engineers and heavy-equipment operators. Although data centers that are up and running typically employ only a small number of people, the construction phase can put thousands to work. That's part of the reason that U.S. construction employment has remained steady even as housing, office and warehouse projects have dried up.

How long can the spending last?

The intensity of the A.I. investment wave has raised uncomfortable parallels to the last time the tech industry funneled billions of dollars into infrastructure to support a new technology with high expectations of future profits. In 2001, after the stock market crash brought on by the collapse of speculative dot-com companies, the telecommunications sector crumpled, too: Companies that had taken on debt to build out fiber-optic networks failed, creating an implosion that rippled through the global economy.
Already, there are a few signs of caution. The chief executive of OpenAI, Sam Altman, raised eyebrows this month with remarks that the sector is "overexcited" and that some players will lose a lot of money. UBS, while generally positive on the industry, wrote in a note to clients that there could be some "indigestion" over the capital expenditures underway.

At the moment, investors are reassuring themselves that a pullback would not be catastrophic. For one thing, data centers are financed by a diverse group of lenders, reducing the exposure of any one part of the banking system. Leases generally have long terms with hard-to-escape contracts, which could insulate landlords even if their deep-pocketed tenants had to walk away. For another, even if A.I. use doesn't live up to the hype, the internet is expanding quickly. The flood of data center capacity in the pipeline is still likely to be absorbed, even if more slowly than it is now.

Vacancy in leased data centers -- that is, those that aren't owned by their users -- is currently close to zero. Future developments are usually spoken for ahead of time, according to JLL, the real estate professional services firm. "If we think about just general data creation and data storage, that has been growing at a rapid pace for decades, and that will continue to grow," said Andrew Batson, JLL's head of data center research for the Americas. "At some point there will be some natural slowdown in demand, but that's not in our near-term forecast." He expects the sector will keep growing about 20 percent annually through at least 2030.

In the coming years, the most significant constraint on data center growth is more likely to be supply: The energy, water, workers and technical equipment required to construct and run them are all getting more expensive. At the same time, local communities, once eager to attract data centers, have occasionally soured on them. In the latest example, the City Council of St. Charles, Mo., placed a one-year moratorium on new facilities over concerns about drinking-water contamination.

"There's a ton of money going into it, but at some point the cost is going to bite," said Eric Gaus, chief economist with Dodge Construction Network, which closely tracks new developments. "You've got the locals who are saying, 'If you're going to put something here, you need to do more than just build it and walk away.'"

Cade Metz and Ben Casselman contributed reporting.
[2]
Nvidia's $46.7B Q2 proves the platform, but its next fight is ASIC economics on inference
Nvidia reported $46.7 billion in revenue for fiscal Q2 2026 in its earnings announcement and call yesterday, with data center revenue hitting $41.1 billion, up 56% year over year. The company also released guidance for Q3, predicting a $54 billion quarter. Behind these confirmed earnings call numbers lies a more complex story of how custom application-specific integrated circuits (ASICs) are gaining ground in key Nvidia segments and will challenge the company's growth in the quarters to come.

Bank of America's Vivek Arya asked Nvidia's president and CEO, Jensen Huang, if he saw any scenario where ASICs could take market share from Nvidia GPUs. ASICs continue to gain ground on Nvidia in performance and cost, and Broadcom projects 55% to 60% AI revenue growth next year.

Huang pushed back hard on the earnings call. He emphasized that building AI infrastructure is "really hard" and most ASIC projects fail to reach production. That's a fair point, but Nvidia has a competitor in Broadcom, which is seeing its AI revenue steadily ramp up, approaching a $20 billion annual run rate. Further underscoring the growing competitive fragmentation of the market is the fact that Google, Meta and Microsoft all deploy custom silicon at scale. The market has spoken.

ASICs are redefining the competitive landscape in real time

Nvidia is more than capable of competing with new ASIC providers. Where it is running into headwinds is in how effectively ASIC competitors are positioning the combination of their use cases, performance claims and cost positions. They're also looking to differentiate themselves in terms of the level of ecosystem lock-in they require, with Broadcom leading in this competitive dimension. The following table compares Nvidia Blackwell with its primary competitors.
Real-world results vary significantly depending on specific workloads and deployment configurations:

*Performance-per-watt improvements and cost savings depend on specific workload characteristics, model types, deployment configurations and vendor testing assumptions. Actual results vary significantly by use case.

Hyperscalers continue building their own paths

Every major cloud provider has adopted custom silicon to gain the performance, cost, ecosystem scale and extensive DevOps advantages of defining an ASIC from the ground up. Google operates TPU v6 in production through its partnership with Broadcom. Meta built MTIA chips specifically for ranking and recommendations. Microsoft develops Project Maia for sustainable AI workloads. Amazon Web Services encourages customers to use Trainium for training and Inferentia for inference. Add to that the fact that ByteDance runs TikTok recommendations on custom silicon despite geopolitical tensions. That's billions of inference requests running on ASICs daily, not GPUs.

CFO Colette Kress acknowledged the competitive reality during the call. She said China revenue had dropped to a low single-digit percentage of data center revenue, and current Q3 guidance excludes H20 shipments to China completely. While Huang's statements about China's extensive opportunities tried to steer the earnings call in a positive direction, it was clear that equity analysts weren't buying all of it. The general tone was that export controls create ongoing uncertainty for Nvidia in a market that arguably represents its second most significant growth opportunity. Huang said that 50% of all AI researchers are in China and that he is fully committed to serving that market.

Nvidia's platform advantage is one of its greatest strengths

Huang made a valid case for Nvidia's integrated approach during the earnings call.
Building modern AI requires six different chip types working together, he argued, and that complexity creates barriers competitors struggle to match. Nvidia doesn't just ship GPUs anymore, he emphasized multiple times on the earnings call. The company delivers a complete AI infrastructure that scales globally, he emphatically stated, returning to AI infrastructure as a core message of the call, citing it six times.

The platform's ubiquity makes it a default configuration supported by nearly every DevOps cycle of the cloud hyperscalers. Nvidia runs across AWS, Azure and Google Cloud. PyTorch and TensorFlow also optimize for CUDA by default. When Meta drops a new Llama model or Google updates Gemini, they target Nvidia hardware first because that's where millions of developers already work. The ecosystem creates its own gravity.

The networking business validates the AI infrastructure strategy. Revenue hit $7.3 billion in Q2, up 98% year over year. NVLink connects GPUs at speeds traditional networking can't touch. Huang revealed the real economics during the call: Nvidia captures about 35% of a typical gigawatt AI factory's budget. "Out of a gigawatt AI factory, which can go anywhere from 50 to, you know, plus or minus 10%, let's say, to $60 billion, we represent about 35% plus or minus of that. ... And of course, what you get for that is not a GPU. ... we've really transitioned to become an AI infrastructure company," Huang said. That's not just selling chips; that's owning the architecture and capturing a significant portion of the entire AI build-out, powered by leading-edge networking and compute platforms like NVLink rack-scale systems and Spectrum-X Ethernet.

Market dynamics are shifting quickly as Nvidia continues reporting strong results

Nvidia's revenue growth decelerated from triple digits to 56% year over year. While that's still impressive, it's clear the trajectory of the company's growth is changing.
Competition is starting to have an effect on Nvidia's growth, with this quarter seeing the most noticeable impact. In particular, China's strategic role in the global AI race drew pointed attention from analysts. As Joe Moore of Morgan Stanley probed late in the call, Huang estimated the 2025 China AI infrastructure opportunity at $50 billion. He communicated both optimism about the scale ("the second largest computing market in the world," with "about 50% of the world's AI researchers") and realism about regulatory friction.

A third pivotal force shaping Nvidia's trajectory is the expanding complexity and cost of AI infrastructure itself. As hyperscalers and long-standing Nvidia clients invest billions in next-generation build-outs, the demands on networking, compute and energy efficiency have intensified. Huang's comments highlighted how the "orders of magnitude speed up" from new platforms like Blackwell and innovations in NVLink, InfiniBand and Spectrum-XGS networking redefine the economic returns on customers' data center capital. Meanwhile, supply chain pressures and the need for constant technological reinvention mean Nvidia must maintain a relentless pace and adaptability to remain entrenched as the preferred architecture provider.

Nvidia's path forward is clear

Nvidia's Q3 guidance of $54 billion sends the signal that the core of the company's DNA is as strong as ever. Continually improving Blackwell while developing the Rubin architecture is evidence that its ability to innovate is undiminished. The question is whether the new kind of competitive challenge it now faces is one it can take on and win with the same intensity it has shown in the past. VentureBeat expects Broadcom to continue aggressively pursuing new hyperscaler partnerships and to strengthen its roadmap of optimizations aimed at inference workloads.
Every ASIC competitor will raise its competitive intensity to a new level, chasing design wins that create higher switching costs as well. Huang closed the earnings call by acknowledging the stakes: "A new industrial revolution has started. The AI race is on."

That race includes serious competitors Nvidia dismissed just two years ago. Broadcom, Google, Amazon and others are investing billions in custom silicon. They're not experimenting anymore. They're shipping at scale. Nvidia faces its strongest competition since CUDA's dominance began.

The company's $46.7 billion quarter proves its strength. However, custom silicon's momentum suggests that the game has changed. The next chapter will test whether Nvidia's platform advantages outweigh ASIC economics. VentureBeat expects technology buyers to follow the path of fund managers, betting both on Nvidia to sustain its lucrative customer base and on ASIC competitors to secure design wins as intensifying competition drives greater market fragmentation.
[3]
Is the tide turning on the AI boom's myth of destiny?
For the last year and a half, AI hasn't just been a technology -- it's been a worldview. Nvidia's stock tore through Wall Street expectations and crowned the company more valuable than Microsoft and Apple. Microsoft pledged to spend like a sovereign wealth fund to bulk up Azure. Google rewired its entire roadmap around Gemini. Meta, never shy about a grand narrative, promised that "superintelligence" was within reach -- and CEO Mark Zuckerberg spent like it. The numbers matched the rhetoric: a trillion of market cap here, a trillion there, tens of billions in quarterly capital expenditures. The refrain was simple and contagious: inevitability.

But inevitability can have a short shelf life in the world of technology. Over the past month, three jolts in particular have rattled the story. OpenAI CEO Sam Altman, who has made a career out of selling a version of the future, said the quiet part out loud: Asked if he thinks we're in an AI bubble, he said, "Yes." Meta, after months of splashy AI hiring and rhetoric, has reportedly frozen recruitment and chopped up its "super" lab. And MIT published research that went viral on LinkedIn, estimating that 95% of enterprise AI pilots return no business value. That trifecta -- a prophet hedging, a zealot pausing, and academics bringing receipts -- turned inevitability into a question.

Markets noticed. Nvidia, the totem of the AI boom, will be treated less like a stock and more like a stress test for the entire economy when it reports its quarterly earnings on Wednesday. Its earnings call isn't just another quarterly update -- it's the hinge on which the hype rests. Wall Street expects another record, roughly $46 billion in revenue. But at a $4 trillion valuation, "better than expected" may not be good enough. If the golden child stumbles, even slightly, the talk on whether the tide is turning on AI gets louder.

Already, cracks have shown.
After a mid-August surge, tech stocks, including Nvidia and other AI-heavy firms, pulled back by about 1.6%, even as energy and real estate rose. Analysts warn that the Nasdaq's 41% gain since April may have inflated valuations -- a frothiness that makes every earnings print feel like a cliffhanger. And that's why Nvidia's earnings call has been elevated into a kind of secular rite.

Take Humane's AI Pin. The $700 wearable was hyped as the next iPhone -- an AI-native device to liberate us from screens. It lasted less than a year before its assets were offloaded to HP in what amounted to a mercy sale. Or take Microsoft's Recall, the feature billed as a photographic memory for your PC. Privacy watchdogs called it a surveillance nightmare, and the company had to walk back its rollout plans. For an industry that loves to declare "this changes everything," the first wave of consumer products has changed very little, except investor patience.

The corporate numbers don't look much better. MIT's study put hard math on what many CIOs already suspected: Nearly all of those shiny AI pilots amount to little more than slideware. That finding rippled through boardrooms and social media feeds because it finally gave executives cover to say what they'd been whispering: AI demos are impressive, but they're not showing up in the P&L.

Even among developers, the ground feels shaky. Stack Overflow's 2025 survey found that while 84% of coders now use AI tools, only 3% say they "highly trust" the outputs. Adoption is skyrocketing, but confidence is collapsing. The result is a paradox: AI is everywhere, and yet no one quite depends on it.

Meta's reported recent pivot has only added to the sense of recalibration. After a year of Zuckerberg touting "superintelligence" and stuffing its payroll with AI hires for mind-boggling sums, the company suddenly froze recruitment, restricted transfers, and broke its mega-lab into four groups. The official line was focus.
Some analysts called it discipline. But to most people watching, it looked like fatigue. For an industry that treats "more" as a strategy, a pause from one of the biggest spenders was its own kind of confession.

Wall Street hasn't pulled the plug. Wedbush analyst (and raging tech bull) Dan Ives insists this is still "the second inning" of an AI bull market. But the market's edginess shows up in the tape: Palantir shares plunged more than 9% in a single session amid bubble chatter, while Nvidia dropped about 3.5% the same week. And Erik Gordon, a University of Michigan professor known for his bubble calls, warned Business Insider that the AI bust could prove even uglier than the dot-com collapse -- pointing to CoreWeave's stunning 33% valuation plunge, a $24 billion wipeout in just 48 hours, as a canary in the coal mine.

Why? Because the infrastructure race is the only part of the AI boom that still feels like a sure thing. Nvidia's GPUs are the scarcest resource in tech. CoreWeave, a cloud startup that barely existed three years ago, is now buying up data centers as if they're beachfront property. Analysts may debate the future of copilots and chatbots, but no one questions the future of compute. The line from Big Tech is consistent: The returns are there in cloud, ads, and developer services; the spending is the bottleneck. That's why the market can wobble on sentiment and still finance another data center.

But there's also concentration risk baked in. Tech giants now make up roughly 40% of the S&P 500. If AI sentiment turns, it's not just a few stocks at stake -- the entire market could feel it.

There's a macro-version of this paradox, too. A John Thornhill column in the Financial Times argues that we're in Carlota Perez's "installation" phase -- manic investment, messy results -- before a "golden age" can materialize.
Deutsche Bank analysts have echoed the concern, warning that the AI buildout mirrors past bubbles from 18th-century canals to the 1990s dot-com and telecom frenzies: vast overbuilding justified by the promise of transformation, only to pop when belief thinned. The New Yorker made a similar case this week, saying we're in an AI profit drought: vast spending, thin P&L evidence, long J-curve. All three narratives map cleanly onto what operators are saying privately.

But the physical costs are coming due. AI workloads demand so much power that Google signed a deal with the Tennessee Valley Authority and a nuclear startup just to keep its Southeast data centers running. Researchers have quantified the water consumption of model training, showing that every breakthrough comes with an invisible utility bill. The "cloud," it turns out, is built of concrete, copper, and cooling towers.

Regulators have woken up, too. On Aug. 2, the EU's AI Act began applying obligations to general-purpose models: transparency about training data, mandatory risk assessments, and new safety disclosures. The stricter rules will come in 2026, but the first bite is already here. In the U.S., agencies are circling corporate filings, probing whether companies are "AI-washing" their earnings calls. Copyright fights rage in the courts -- The New York Times is suing OpenAI -- even as other media groups cut licensing deals.

And then there's China. After Washington's on-again-off-again export bans, Beijing has reportedly discouraged domestic firms from buying Nvidia's China-compliant H20 chip. That's not trivial; Nvidia makes roughly a quarter of its revenue in China. That's the kind of geopolitical tremor that could ripple through Wednesday's earnings call.

So is the tide turning? In sentiment, maybe. Altman's bubble line broke the spell of inevitability. MIT's study turned hype into numbers, and the numbers were ugly. Meta's reported reorganization suggests that even the loudest boosters know when to pause.
Developers are adopting but not trusting. Artists are suing. Regulators are writing (and rewriting) the rules. The "AI will change everything" pitch no longer lands as gospel.

But in capital, not yet. The hyperscalers are still writing historic checks. NVDA is still the most important ticker in the market. If Wednesday's earnings blow past expectations again, the spending spree will look vindicated. But if they disappoint -- even slightly -- the bubble chorus will likely grow louder.

Silicon Valley has always run on myth as much as math. The myth of inevitability allowed companies to raise obscene sums, to spend like nation-states, to paper over the fact that 95% of AI pilots go nowhere. The myth was strong enough to make a $4 trillion company out of Nvidia, to have Microsoft reimagine itself as an infrastructure empire, to send Zuckerberg chasing "superintelligence" like it's a Marvel subplot. But myths don't last forever. Eventually, someone reads the balance sheet.

A year ago, AI was destiny. Chatbots were oracles, GPUs were holy relics, and anyone questioning the frenzy was accused of missing the future. Now? Its prophet mutters "bubble," and its supposed killer apps are retreating under the weight of their own demos.

The question isn't whether or not AI is important -- it obviously is. The question is whether or not importance is enough to sustain trillion-dollar valuations and multitrillion-dollar infrastructure bets. If inevitability is gone, then AI, like every industry before it, will have to survive the harder test: proving it.
[4]
Wall Street shrugs at Nvidia's record Q2 earnings report
Every frenzy has its moment -- a point where record-breaking stops feeling record-breaking and jaw-dropping starts to sound like last quarter's news. On Wednesday, the AI frenzy had one of those moments. Nvidia posted a huge quarter -- $46.7 billion in revenue, $26.4 billion in profit, gross margins back above 72%, and guidance for an even fatter $54 billion in the current quarter -- and Wall Street basically said: OK, and? Shares slipped about 1.5% through Thursday morning, as if the market had already grown a little bored with history.

The dissonance says everything. The AI economy is still expanding at a breakneck pace, but it's colliding with the law of diminishing returns: Each additional dollar of capex buys growth but less thrill and wonder. The spectacle is giving way to the grind -- a phase where power grids, licensing regimes, and productivity metrics matter as much as GPUs. Nvidia's print wasn't just a corporate milestone. It was a mirror for an industry learning that revolutions can also plateau.

Nvidia's earnings report is just about the cleanest mirror you could hold up to the industry right now. It shows breathtaking demand, yes, and it also shows that speed limits are creeping in. Data-center revenue -- the heart of the company's AI story -- hit $41.1 billion, up 56% from a year ago, massive by any measure, yet it was treated as routine. The company lifted its outlook to $54 billion and pointedly left China out. The market's response wasn't disappointment in the business; it was fatigue with the physics, the politics, and the tab.

Morgan Stanley analyst Joe Moore called the quarter "a clean beat and raise" in a Thursday note, observing that guidance of $54 billion excluding China topped his $52.5 billion estimate. But his conclusion cut to the bone: "Sentiment has largely caught up to the growth potential." Nvidia cleared the bar, but the bar is now moving faster than the numbers.
And at its altitude -- trading at roughly high-20s to high-30s times forward earnings, depending on the estimate -- even record-shattering results can look more like table stakes than a windfall. The ROI optics are dazzling, but at scale those economics look like a treadmill: Falling inference prices invite more usage, which drives more spend, which lowers costs again.

"A steady diet of beat and raise quarters should be enough for the stock to work at ~27x EPS," Morgan Stanley's Moore said. But he also admitted investors "don't have a wall of worry to climb" anymore. In other words, the marginal thrill is harder to deliver, and the economics are looping in on themselves. It's Jevons paradox in silicon: Every gain in efficiency drives more demand, which drives more spending. Add the slowdown of Moore's Law -- chips no longer double their performance on schedule -- and the treadmill comes into focus. Customers are sprinting harder just to stay in place.

CEO Jensen Huang, when asked about Nvidia's product cadence, said the company is on an annual product cycle to speed cost and performance gains so customers can better absorb soaring infrastructure and power costs. Each cycle delivers more compute but not necessarily more returns per dollar. And diminishing returns creep in even amid industrial scale.

Hyperscalers are throwing money at GPUs like never before. Microsoft is preparing to spend nearly $30 billion this quarter on data-center capex, Alphabet lifted its 2025 budget to $85 billion, and Meta nudged its plan to as much as $72 billion. That ballooning spend is both Nvidia's lifeblood and its risk factor. The company's growth is chained directly to hyperscalers' willingness to keep writing massive checks even as investors grow impatient for a payoff. Yet even if there's investor fatigue, Nvidia isn't slowing.
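The Jevons-paradox treadmill described above can be made concrete with a toy model. The numbers here are illustrative assumptions, not figures from the article: suppose each product cycle cuts the unit cost of inference by 30%, but cheaper inference lifts total usage by 60%. Total spend still rises every cycle.

```python
# Toy model of the inference-cost treadmill (Jevons paradox in silicon).
# The 30% cost cut and 60% usage growth per cycle are illustrative
# assumptions chosen to show the mechanism, not reported figures.
price = 1.00  # unit cost of inference (arbitrary units)
usage = 1.00  # inference volume (arbitrary units)

for cycle in range(1, 5):
    price *= 0.70  # each hardware cycle makes inference cheaper...
    usage *= 1.60  # ...which invites disproportionately more usage
    spend = price * usage
    print(f"cycle {cycle}: unit price {price:.2f}, usage {usage:.2f}, total spend {spend:.2f}")
```

Because usage grows faster than unit costs fall (1.6 x 0.7 = 1.12 > 1), total spend compounds by about 12% per cycle in this sketch, which is the sense in which efficiency gains fail to shrink the overall bill.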
The company has turned scale into strategy, matching ballooning hyperscaler budgets with the only supply chain that can keep up. That kind of dominance doesn't vanish in a quarter. "We're in every cloud for a good reason," Huang told analysts on the earnings call. "You've heard me say before that in a lot of ways, the more you buy, the more you grow. And because our performance per dollar is so incredible, you also have extremely great margins. So the growth opportunity with Nvidia's architecture and the gross margins opportunity with Nvidia's architecture is absolutely the best. And so there are a lot of reasons why Nvidia has been chosen by every cloud and every startup and every computer company."

On China, Huang told investors he's "working with the administration" to secure licenses, but analysts are treating the market as a ghost market or a phantom limb. Morgan Stanley calls it "impossible to forecast." Jefferies pegs the upside at $2-5 billion in a single quarter if approvals come through. Wedbush went further, telling clients that a reopened China could propel Nvidia to a $5 trillion market cap by early 2026. For now, it's all phantom math -- a $50 billion prize that inflates models without hitting income statements.

Beyond geopolitics, physics is the other governor. After years of flat consumption, U.S. electricity demand is surging toward record highs in 2025 and 2026, with AI data centers a primary culprit. PJM, the country's largest grid operator, says future capacity prices have jumped tenfold. Globally, the International Energy Agency expects data-center power use to more than double by 2030, equaling Japan's entire national consumption. But Nvidia has built its pitch around efficiency: "Not only are we the most energy-efficient, our perf per watt is the best of any computing platform," Huang said on the call. "And in a world of power-limited data centers, perf per watt drives directly to revenues."

Packaging is another bottleneck.
Advanced CoWoS capacity at TSMC is being doubled this year, but demand still outstrips supply. William Blair noted that Nvidia's Blackwell Ultra already delivered over $10 billion in second-quarter revenue -- faster than expected -- precisely because it cornered packaging slots early. Memory isn't faring any better. SK Hynix and Micron have said their high-bandwidth memory (HBM) supply is sold out well into 2025. Without HBM, racks of GPUs are stranded assets.

Meanwhile, networking -- once an afterthought -- has become Nvidia's unlikely breakout star. Revenue from interconnects surged 46% last quarter to $7.3 billion. On the earnings call, Huang waxed poetic about Nvidia's three-layer approach: "scale up" with NVLink, "scale out" with InfiniBand, "scale across" with Spectrum-XGS. Networking is now the glue that turns racks into supercomputers and clusters into "AI factories." It's also the latest reminder that bottlenecks don't vanish -- they just migrate.

Nvidia's pivot is to sell the system, not just the chips. Spectrum-X Ethernet, which CFO Colette Kress said is now at "an annualized revenue exceeding $10 billion," is the connective tissue. Sovereign AI revenue has doubled year over year and is on track for over $20 billion in 2025. RTX Pro servers have nearly 90 early adopters, from Disney to Hyundai to Eli Lilly. Nvidia isn't just selling GPUs; it's selling the factory.

Wall Street still loves Nvidia's story -- and for good reason. The company remains the crown jewel of the chip trade. Citi, JPMorgan, Oppenheimer, BNP Paribas, and Morgan Stanley all raised their price targets after the earnings report, clustering around $210-$225. Morgan Stanley's Moore still calls Nvidia his "most preferred name in large cap AI." William Blair highlighted traction beyond hyperscalers. Jefferies named Nvidia its "Top Pick," stressing that demand remains "rock solid."
And Citi analysts argued Nvidia's dominance in data-center AI "is not only intact but expanding," underscoring why most of Wall Street still leans bullish on the company.

So why does history-making growth feel so... ordinary? The irony of diminishing returns is that Nvidia is still hitting numbers that would make any other company blush. Gross margins are climbing. Product roadmaps are on time. Blackwell racks are shipping at an industrial pace. Networking is a profit center. The company just authorized another $60 billion in buybacks. This isn't a collapse. It's a normalization -- the shift from a frenzy that shocked the market to a business that has to satisfy it. Growth is still staggering, but each additional record buys less awe.

Nvidia remains the metronome of the boom. It's still the only company that can sell out entire generations of chips before they ship. It's still the name that moves the Nasdaq. It's still the company that analysts pile superlatives on. It's still the must-own name whose CEO and CFO can frame servers as "factories of intelligence" and be taken literally.

But Wednesday's report showed the beginnings of a new reality: Diminishing returns aren't a hypothetical -- they're visible in the market's reaction. The AI revolution hasn't slowed; it's entered the part of the curve where physics, politics, and investor patience set the tempo. And that's when the thrill of "more" starts to sound a lot like "enough."
[5]
NVIDIA's 56% revenue growth isn't enough for greedy short termists on Wall Street. But is there genuine reason for concern?
The world's most valued company turned in revenue growth in excess of 50% year-on-year yesterday - and its share price went down. Have we finally passed through the Looking Glass into an AI-powered realm where nothing can be taken for granted?

The firm, of course, was NVIDIA, which turned in Q2 revenue of $46.7 billion, up 56% year-on-year, with net income of $26.4 billion. According to CEO Jensen Huang:

This year is obviously a record-breaking year. I expect next year to be a record-breaking year. And while we continue to increase the performance of AI capabilities as we race towards Artificial Superintelligence on the one hand and continue to increase the revenue generation capabilities of our hyperscalers on the other hand.

So, why was Wall Street disappointed in the kind of numbers that other firms would kill for? Well, in large part, let's do the eternal short termists no favors and call it for what it is - utter greed! It's AI, it's NVIDIA, give us more, more, more!!!! But the more conscientious among them may also have some concerns moving forward - revenue growth is projected to fall, albeit to a mere 52% year-on-year, but that's still a cooling off.

Then there is the question of where NVIDIA's money is coming from, and particularly how dependent it is on certain sources. The firm is, of course, benefitting from the much-needed infrastructure spend and rollout by hyperscaler companies, as Huang acknowledges:

The CapEx of just the top four hyperscalers has doubled in two years. As the AI revolution went into full steam, as the AI race is now on, the CapEx spend has doubled to $600 billion per year. There's five years between now and the end of the decade, and $600 billion only represents the top four hyperscalers. We still have the rest of the enterprise companies building on-prem. You have cloud service providers building around the world. The United States represents about 60% of the world's compute.
And over time, you would think that Artificial Intelligence would reflect GDP scale and growth and would be, of course, accelerating GDP growth. So our contribution to that is a large part of the AI infrastructure. Out of a gigawatt AI factory, which can cost anywhere from $50 billion to $60 billion, plus or minus 10%, we represent about $35 billion of that - $35 billion out of $50 billion per gigawatt data center.

Fair enough - and the willingness of governments all around the world to open their doors to NVIDIA in 'partnership', in the hope of playing catch-up with their own neglected national infrastructures - hi, UK, yes, we're looking at you! - suggests that this is a scenario that isn't changing soon. But the company is still currently being bankrolled by the needs of a small number of clients - Microsoft and Meta are estimated to provide nearly a third of revenue. According to Nigel Green, CEO of global financial advisory giant deVere Group, that's a dangerous vulnerability:

The concentration of earnings is too high. When almost a third of revenue depends on just two of Nvidia's clients, the vulnerability is obvious. If either reins in spending, the shockwaves would hit immediately. Markets don't like that level of exposure.

That said, neither Meta nor Microsoft - or other firms with need, such as Oracle - are going away any time soon, and all have spoken of ongoing untapped AI demand that they need to satiate, so the vulnerability may be more academic than pragmatically threatening at present.

More immediately worrying is the uncertainty caused by relations between the US and China. The Trump administration previously blocked exports of NVIDIA's H20 chip to China, although its qualms were seemingly overcome by being handed 15% of the revenues. That deal is struck, but has still not been codified. Assuming it is, that could add a few billion extra to revenue in coming quarters.
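Huang's per-gigawatt arithmetic can be checked directly from the figures he quotes: a 1 GW AI factory costs roughly $50-60 billion, of which Nvidia's content is about $35 billion. A quick sketch of the implied share (inputs are only the numbers from the call):

```python
# Back-of-the-envelope check of Huang's per-gigawatt economics, using the
# figures quoted on the call: a 1 GW AI factory costs ~$50-60B, of which
# Nvidia's content is ~$35B. All figures in billions of dollars.

factory_cost_low, factory_cost_high = 50.0, 60.0
nvidia_content = 35.0

share_high = nvidia_content / factory_cost_low   # Nvidia's share at the low-cost end
share_low = nvidia_content / factory_cost_high   # ...and at the high-cost end

print(f"Nvidia's share of a 1 GW build: {share_low:.0%} to {share_high:.0%}")
# -> roughly 58% to 70% of total factory cost
```

In other words, Huang is claiming Nvidia captures well over half of every dollar spent on a gigawatt-scale AI data center, which is why hyperscaler capex maps so directly onto its revenue line.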
If it isn't and Intel's latest shareholder continues to ramp up tensions with China, who knows what comes next? Huang knows the importance of clarity here: The China market, I've estimated to be about $50 billion of opportunity for us this year if we were able to address it with competitive products. And if it's $50 billion this year, you would expect it to grow, say, 50% per year. As the rest of the world's AI market is growing as well. It is the second largest computing market in the world, and it is also the home of AI researchers. About 50% of the world's AI researchers are in China. The vast majority of the leading open source models are created in China. And so it's fairly important, I think, for the American technology companies to be able to address that market. The open source models that have come out of China are really excellent. DeepSeek, of course, gained global notoriety. Qwen is excellent. Kimi's excellent. There's a whole bunch of new models that are coming out. They're multi-modal. They're great language models. And it's really fueled the adoption of AI in enterprises around the world because enterprises want to build their own custom proprietary software stacks. And so open source model's really important for enterprise. It's really important for SaaS who also would like to build proprietary systems. It has been really incredible for robotics around the world. So open source is really important, and it's important that the American companies are able to address it. It's going to be a very large market. We're talking to the Administration about the importance of American companies to be able to address the Chinese market. And as you know, H20 has been approved for companies that are not on the entities list, and many licenses have been approved. And so I think the opportunity for us to bring [our] Blackwell [GPU] to the China market is a real possibility. 
We just have to keep advocating the sensibility of and the importance of American tech companies to be able to lead and win the AI race and help make the American tech stack the global standard. The buzz is, everything sold out... A new industrial revolution has started. The AI race is on.

Wall Street's wobbles aren't likely to be giving Huang any sleepless nights, I suspect. The numbers being reported here are just astonishing - and no, it can't last. There will be a cooling off, hopefully a measured one that the markets can adapt to, rather than an almighty crash that rips economies apart. But for now, the message to Wall Street ought to be - stick your snouts in the trough and get your spoons in the gravy right up to the elbow.
[6]
Nvidia Just Sounded the Silent Alarm -- but Are Investors Paying Attention? | The Motley Fool
Two subtle warnings in Nvidia's latest report point to possible trouble for Wall Street's most important artificial intelligence (AI) stock.

Over the past three years, nothing has moved markets more than the rise of artificial intelligence (AI), and no stock has benefited more from the evolution of AI than Nvidia (NVDA -3.38%). The prospect of having software and systems empowered with AI making split-second decisions is a potential game-changer in most industries around the globe. Nvidia has found its niche as the leading supplier of graphics processing units (GPUs) used in enterprise data centers. GPUs are the "brains" that fuel split-second decision-making, as well as the training of large language models.

No earnings report tends to garner more attention than Nvidia's. While surpassing Wall Street's consensus revenue and profit forecasts has been something of the norm, Nvidia's latest report sounded a silent alarm that should serve as a warning to shareholders, as well as temper artificial intelligence hype.

If there's one thing Nvidia has on its side, it's consistency. Its $46.7 billion in net sales (up 56% from the year-ago period) and its $1.05 in earnings per share (EPS) both handily leaped over the bars laid by Wall Street analysts. This was the 11th consecutive quarter that Nvidia's EPS surpassed expectations. The star of the show continues to be its data center segment, which accounted for more than 88% of reported sales. Exceptionally strong sales of Blackwell, coupled with "extraordinary" demand for next-gen chip Blackwell Ultra, according to CEO Jensen Huang, have powered the charge.

Perhaps the most impressive operating metric in Nvidia's fiscal second-quarter report (ended July 27) is its generally accepted accounting principles (GAAP) gross margin, which clocked in at 72.4%. Though this was down 270 basis points from the prior-year period, it represents the first sequential quarterly improvement in gross margin in more than a year.
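The margin comparison above implies a specific year-ago figure, which basis-point arithmetic recovers (both inputs are from the report as quoted; one basis point is 0.01 percentage point):

```python
# Basis-point arithmetic for the gross-margin comparison: a 72.4% GAAP
# gross margin, down 270 basis points from the year-ago quarter.

current_margin_pct = 72.4
decline_bps = 270

# 1 basis point = 0.01 percentage point
prior_year_margin_pct = current_margin_pct + decline_bps / 100

print(f"implied year-ago gross margin: {prior_year_margin_pct:.1f}%")  # -> 75.1%
```

So the year-ago quarter carried a roughly 75.1% GAAP gross margin, which puts the "decline" in context: margins have come off a historic peak, not fallen off a cliff.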
Huang hasn't been shy about his desire to maintain Nvidia's compute advantages, and he aims to bring a new advanced AI chip to market each year. The ramp-up of Blackwell Ultra will be followed by the expected debuts of Vera Rubin and Vera Rubin Ultra in the latter halves of 2026 and 2027, respectively. The uptick we're seeing in GAAP gross margin indicates Nvidia is maintaining strong pricing on its AI hardware, at least for now. In addition, Nvidia's about-face with its gross margin may indicate ongoing AI-GPU scarcity. Enterprise demand outweighing supply has been one of Nvidia's core catalysts to this point.

However, this earnings report isn't the slam dunk it's made out to be, with two silent warnings alerting investors to potential trouble on the horizon for Wall Street's most valuable company and most-hyped innovation.

The first issue has to do with Nvidia's concentration of revenue, which can be found in its quarterly 10-Q filing with the Securities and Exchange Commission. Per the company, "For the second quarter of fiscal year 2026, sales to one direct customer, Customer A, represented 23% of total revenue; and sales to a second direct customer, Customer B, represented 16% of total revenue, respectively, both of which were attributable to the Compute & Networking segment."

In other words, approximately $18.2 billion of Nvidia's $46.7 billion in total sales came from just two companies during the latest quarter. Within its data center segment ("Compute & Networking"), this dynamic duo accounted for more than 44% of sales. Over the last year, Nvidia has become increasingly reliant on a narrower group of companies for its net sales and profit growth. On one hand, optimists can argue that it's a good thing Nvidia is intricately tied to these dominant businesses.
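The 10-Q percentages reproduce those dollar figures exactly. A quick check, using only numbers from the report (total revenue, the two customer shares, and the "more than 88%" data-center share):

```python
# Recomputing the customer-concentration figures from the 10-Q percentages.
# Inputs are the reported Q2 FY2026 numbers, in billions of dollars.

total_revenue = 46.7
customer_a_share = 0.23   # "Customer A" share of total revenue
customer_b_share = 0.16   # "Customer B" share of total revenue
data_center_share = 0.88  # data center segment, "more than 88%" of sales

top_two = total_revenue * (customer_a_share + customer_b_share)
segment_revenue = total_revenue * data_center_share
top_two_of_segment = top_two / segment_revenue

print(f"top two customers: ${top_two:.1f}B of ${total_revenue}B total")  # ~$18.2B
print(f"share of the data center segment: {top_two_of_segment:.0%}")     # ~44%
```

The 44% segment figure follows because the same $18.2 billion is measured against a smaller base (the roughly $41 billion data center segment rather than total revenue).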
Though Nvidia doesn't spill the beans on the names of its top clients, customers A and B are (in no particular order) probably Meta Platforms (META -1.69%) and Microsoft (MSFT -0.65%), with both "Magnificent Seven" members aggressively investing in AI-data center infrastructure.

On the other hand, both Meta and Microsoft are internally developing AI-GPUs to use in their data centers. Even though these chips present no external threat to Nvidia, they're considerably cheaper and more readily accessible (not backlogged), and they can easily cost Nvidia valuable data center real estate in future quarters. Furthermore, Huang's emphasis on bringing a new advanced AI chip to market each year has the potential to rapidly depreciate prior-generation GPUs. This could quickly devalue the GPUs that companies such as Meta and Microsoft have already purchased, which in turn may delay upgrade cycles and/or weigh on Nvidia's gross margin in the future if buyers opt for cheaper hardware.

The other silent and subtle warning investors will find in Nvidia's latest earnings release is its "reward" for long-term shareholders. With $14.7 billion remaining under its prior share repurchase authorization, Nvidia's board approved an additional $60 billion worth of buybacks. Share buybacks and dividends are two of the most direct ways public companies create incentives for long-term investing. For businesses with steady or growing net income (and Nvidia certainly qualifies), share buybacks have the ability to increase EPS as the outstanding share count declines. In short, share repurchases can make a company's stock more fundamentally attractive to investors.

So why is a $60 billion share-repurchase allotment bad for Nvidia? For starters, the last thing investors of a company that just delivered 56% year-over-year sales growth should want to see is it spending its capital on buybacks.
Nvidia stock has gained more than 1,100% since 2023 began and is sporting a historically high price-to-sales (P/S) ratio. Aside from the fact that its shares aren't particularly cheap, a $60 billion share buyback authorization suggests management is struggling to find other high-growth initiatives to invest its capital into. It's also incredibly odd to see Nvidia's board promote such a sizable repurchase program when no insider has purchased shares of the company since early December 2020. Over the trailing-five-year period, more than $4.7 billion of Nvidia stock has been sold by executives and board members. Between Nvidia's dangerous revenue concentration and its head-scratching buyback allotment, things may not be as perfect for the company or the AI revolution as they appear.
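The buyback mechanics behind that argument are simple arithmetic: with net income held fixed, retiring shares raises EPS roughly in proportion to the share of the float repurchased. A hedged sketch follows; the share price and diluted share count below are round illustrative assumptions, not current market figures.

```python
# Illustrative EPS-accretion math for a fixed-dollar buyback. The share
# price and share count are assumed round numbers for illustration only;
# net income is the reported quarterly GAAP figure.

net_income = 26.4e9          # reported quarterly GAAP net income, dollars
shares_out = 24.4e9          # assumed diluted share count (illustrative)
share_price = 180.0          # assumed share price (illustrative)
buyback_dollars = 60e9       # the newly authorized repurchase program

shares_retired = buyback_dollars / share_price
new_shares = shares_out - shares_retired

eps_before = net_income / shares_out
eps_after = net_income / new_shares
accretion = eps_after / eps_before - 1

print(f"shares retired: {shares_retired / 1e9:.2f}B "
      f"({shares_retired / shares_out:.1%} of the float)")
print(f"EPS accretion from the full $60B program: {accretion:.1%}")
```

Under these assumptions, the entire $60 billion retires only around 1-2% of the float, which is the skeptics' point: at Nvidia's valuation, a buyback of this size moves EPS far less than it would at a cheaper company.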
Nvidia reports record Q2 earnings, showcasing the ongoing AI boom. However, concerns arise about market saturation, geopolitical tensions, and the sustainability of AI investments.
Nvidia, the world's most valued chip company, reported staggering Q2 fiscal 2026 earnings, with revenue of $46.7 billion, up 56% year-over-year, and net income of $26.4 billion [1][5]. Despite these impressive figures, Nvidia's stock price dipped slightly, indicating a potential shift in market sentiment.

The AI spending frenzy is not just inflating stock prices but is also boosting the real economy. Companies are projected to spend $375 billion globally on AI infrastructure in 2025, rising to $500 billion in 2026 [1]. This massive investment in data centers, semiconductor factories, and power supply is creating significant business activity, accounting for a quarter of all economic growth in the past year, according to Commerce Department data [1].
The rapid expansion of AI infrastructure is putting pressure on existing systems. U.S. electricity demand is surging toward record highs, with AI data centers being a primary driver. The International Energy Agency expects data-center power use to more than double by 2030, equaling Japan's entire national consumption [4]. This has led to concerns about the sustainability and scalability of AI investments.
Nvidia's growth is partially constrained by geopolitical factors, particularly U.S.-China relations. The company estimates the Chinese market represents a $50 billion opportunity this year, but export restrictions have limited access [5]. Huang emphasized the importance of addressing the Chinese market, noting that "about 50% of the world's AI researchers are in China" [5].
As the AI boom matures, there are signs of market saturation and increased competition. Custom application-specific integrated circuits (ASICs) are gaining ground on performance and cost advantages over Nvidia GPUs [2]. Major tech companies like Google, Meta, and Microsoft are deploying their own custom silicon at scale, potentially challenging Nvidia's dominance in the long term [2].
While Nvidia's growth remains impressive, there's a sense that the market has "largely caught up to the growth potential," according to Morgan Stanley analyst Joe Moore [4]. The company's high valuation (trading at 27-37 times forward earnings) means that even record-breaking results may be seen as merely meeting expectations rather than exceeding them [4].
Nvidia's Q2 earnings report showcases the ongoing strength of the AI boom and its impact on the broader economy. However, it also highlights emerging challenges such as infrastructure limitations, geopolitical tensions, and potential market saturation. As the AI industry matures, companies like Nvidia will need to navigate these complexities to maintain their growth trajectory and market position.
Summarized by Navi