10 Sources
[1]
How Broadcom's big OpenAI deal fits into the data center boom and what it means for the AI trade
Kicking off the week with a bang, OpenAI announced another massive data center buildout partnership, this time with Club name and custom semiconductor industry leader Broadcom. Shares of Broadcom soared as much as 10.7% on Monday after the two companies revealed that they have been working for 18 months on custom-designed chips optimized for inference, the process of running AI models on a day-to-day basis. Deployment of 10 gigawatts of the co-designed chips, which will be networked through Broadcom's Ethernet stack, is set to start late next year. Broadcom stock has gained roughly 54% year to date.

The news is the latest in a series of monster deals over the past few weeks. On Sept. 22, OpenAI struck a deal for 10GW of Nvidia-based data centers. Two weeks later, OpenAI made a 6GW deal with Nvidia competitor Advanced Micro Devices. Both of those deals, along with Monday's Broadcom announcement, follow the massive $500 billion Stargate Project announced by OpenAI, SoftBank, and Oracle. Stargate is targeting 10GW of capacity but may exceed that initial goal over time as the initiative expands internationally, starting with Stargate UAE.

Adding it all up, OpenAI has announced roughly 36GW of data center capacity since the beginning of the year, with 26GW worth of deals coming in the past three weeks alone. To put that into perspective, the Hoover Dam's power-generating capacity is about 2GW, which is enough to power nearly 1.8 million homes for an entire year. So, what we're talking about here is the equivalent of about 18 Hoover Dams, or enough energy to power more than 31 million homes for an entire year.

Though the build-out of all these projects will take several years, OpenAI co-founder and president Greg Brockman, speaking with Jim Cramer on CNBC shortly after the news was announced, made clear that these projects, which combined represent the largest industrial undertaking in history, are needed to get ahead of a clear "avalanche" of demand. Asked if he meant that OpenAI was hoping to see immense demand, Brockman stressed it is not about hope but rather that the company is "currently being swept along by the avalanche." As an example, Brockman said that while ChatGPT was the fastest-growing consumer app in history, the company's new video generation app, Sora, is now growing even faster. ChatGPT, which was launched by Microsoft-backed OpenAI in late 2022, quickly went viral. The chatbot brought generative AI to the masses and launched the current boom in developing and deploying the technology.

According to Broadcom's release, "By designing its own chips and systems, OpenAI can embed what it's learned from developing frontier [or cutting edge] models and products directly into the hardware, unlocking new levels of capability and intelligence." Brockman said on CNBC that it will also reduce the cost of running AI models.

OpenAI's Brockman appeared on CNBC alongside Charlie Kawwas, president of Broadcom's semiconductor solutions group. CNBC's David Faber asked Kawwas during Monday's interview if OpenAI was the fourth new major custom-chip partner that Broadcom disclosed but didn't name during last month's post-earnings conference call. Kawwas said, "I would love to take a $10 billion PO [purchase order] from my good friend Greg [Brockman]; he has not given me that PO yet, so I hope that answers the question." So, this means that there is another major buyer out there looking to stand up significant data center computing power.
While we can't say for sure who it is, analyst Ben Reitzes of Melius Research speculated last week that the fourth Broadcom customer may also be Amazon-backed Anthropic. It's one of the things Jim will ask Broadcom CEO Hock Tan about during Monday's interview for "Mad Money."

The Broadcom-OpenAI news is just another example of how early we still are in the AI trade. While we are certainly seeing areas of froth and bubble-like activity in some parts of the market - Jim cautioned against some of those names in his Sunday column - we don't think that is the case for names like Broadcom and Nvidia, nor is it the case for the highly profitable megacaps driving much of the money behind the trade. Even at these levels, we think names like Broadcom and Nvidia still trade at reasonable valuations given their growth outlooks.

In terms of Nvidia's deal with OpenAI, Nvidia CEO Jensen Huang, at the time, said this one deal was equal to the entirety of the company's 2025 revenue. Jensen told Jim last week at the Club's October Monthly Meeting that the "partnership with OpenAI is really incremental to all the work that we have done with Azure, OCI, and CoreWeave." Azure is Microsoft's cloud. OCI stands for Oracle Cloud Infrastructure. CoreWeave, another Nvidia-backed company, rents out computing power run on Nvidia chips.

Lastly, we would be remiss if we didn't call out that with great power demand comes the need for great companies that enable power generation. Jensen told Jim last week that without meeting America's increasing energy demand, driven in large part by AI, "there is no industrial growth, without industrial growth, there's no stock price growth, there's no economic growth, there's no national security." For the Club, that means GE Vernova, up more than 7% on Monday, and Eaton, up roughly 2.5%. GE Vernova makes natural gas turbines that can be, and are being, hooked up to data centers to make electricity when the power grid is overtaxed. The company also builds small modular nuclear reactors, though meaningful additional nuclear power is still a few years away. Eaton, on the other hand, makes power management systems and components that enable data centers to run more efficiently, with greater uptimes and scalability. Eaton's cooling and airflow management solutions are also at the heart of maintaining environmental conditions in the data center.

(Jim Cramer's Charitable Trust is long AVGO, NVDA, MSFT, AMZN. See here for a full list of the stocks.)
[2]
Broadcom's 10GW AI Deal With OpenAI Could Generate Up To $300 Billion Revenue, Analysts Highlight Growth From Custom Silicon And Networking Solutions - Broadcom (NASDAQ:AVGO)
Broadcom (NASDAQ:AVGO) and OpenAI have launched a multi-year partnership to deploy 10 gigawatts of custom artificial intelligence silicon and rack-level systems, integrating compute Application-Specific Integrated Circuits (ASICs) along with Broadcom's Ethernet, Peripheral Component Interconnect Express (PCIe), and optical connectivity solutions. The deal expands beyond compute chips to networking silicon and server racks, positioning Broadcom to optimize performance and scale next-generation technologies. JP Morgan analyst Harlan Sur maintained an Overweight rating on Broadcom with a price forecast of $400. Goldman Sachs analyst James Schneider reiterated a Buy on Broadcom and raised the price forecast from $360 to $380.

JP Morgan: Sur noted that Broadcom and OpenAI have partnered on a multi-year plan to deploy 10 gigawatts of custom AI silicon and rack-level systems, integrating compute XPU ASICs along with Broadcom's full portfolio of Ethernet, PCIe, and optical connectivity solutions. The analyst emphasized that the collaboration extends beyond compute chips to networking silicon and server racks, allowing Broadcom to optimize the stack for efficiency gains and maximize intelligence per watt. He said that next-generation technologies, including 3D chip stacking and co-packaged optics, could further scale computing capabilities. Sur estimated that each gigawatt of capacity could generate $25-$30 billion in revenue for Broadcom, implying a total potential revenue of $250-$300 billion for the 10 GW infrastructure. The analyst stated that deployment will commence in the second half of 2026 and is expected to be completed by 2029, with initial annual revenue projected to range from $70 billion to $90 billion over the four years. He also said that the previously announced $10 billion order from Broadcom's last earnings call was likely for Anthropic, not OpenAI, involving Tensor Processing Unit (TPU) ASICs co-designed with Google and Broadcom for networking and rack-scale deployment. Sur reiterated Broadcom as his top semiconductor pick due to its diversified business model, strong margin profile, and exposure to growth trends in AI, data centers, and infrastructure.

Goldman Sachs: Schneider highlighted Broadcom's strategic partnership with OpenAI to deploy 10GW of custom-designed AI accelerators and networking products. The analyst noted that the deployments reinforce Broadcom's technology leadership in custom silicon. He also highlighted the company's strong position relative to its competitors, including Advanced Micro Devices' (NASDAQ:AMD) 6GW partnership with OpenAI, which involves a significant equity contribution. Schneider estimated that each gigawatt of AI datacenter deployment could generate $10-$15 billion in incremental revenue and contribute roughly $1.00-$1.50 in EPS for Broadcom. The analyst noted that the announcement signals growing adoption of custom silicon in the accelerator market and strengthens Broadcom's leadership in custom ASIC and Ethernet networking. He highlighted that AI networking and rack-level solutions for OpenAI present an underappreciated upside. Schneider reiterated Broadcom as a Buy, citing strong visibility into fiscal 2026/27, industry-leading margins, and expected stock outperformance.

Price Action: AVGO stock was trading lower by 3.49% to $344.26 at last check on Tuesday.
[3]
Nvidia, Broadcom, and AMD Each Won Deals With OpenAI. Here's the Biggest Winner of the Bunch. | The Motley Fool
These chip rivals each aim to win as AI spending reaches into the trillions of dollars. Recent weeks have been rife with deals among artificial intelligence (AI) players. Words from Nvidia's (NVDA 0.86%) Jensen Huang during the company's earnings report in August set the stage for this: At the time, Huang predicted that AI infrastructure spending would reach as much as $4 trillion by 2030. This signals that demand for compute is strong and that in the coming quarters and years, orders for AI chips may explode higher. That would benefit designers of chips, including Nvidia and rivals like Broadcom (AVGO -1.24%) and Advanced Micro Devices (AMD -0.63%). And this has already begun, as each of these players has struck a deal with one of today's biggest AI players: OpenAI. The AI research lab, creator of popular chatbot ChatGPT, needs enormous amounts of compute to power its platforms -- and so it recently turned to these top chip companies to prepare itself for the years to come. Let's check out the three deals and find out which chip designer is the biggest winner of the bunch.

So, first, a quick note about these chip rivals. Nvidia, first to market with its graphics processing units (GPUs), dominates with these very powerful AI chips. AMD entered the market later but has been innovating to rival Nvidia and now offers chips that some analysts say can compete with those of the market leader. Finally, Broadcom is known as an expert in networking, and along with this, designs chips called XPUs -- these are custom accelerators, so they're made to excel at specific tasks. Nvidia and AMD chips are general purpose, meant to be used in a wide range of AI contexts. All three companies have seen demand for their chips climb and have reported AI revenue growth in the double digits in recent quarters.

Now, let's consider the OpenAI deals, and we'll start with Nvidia. Last month, Nvidia said it would invest as much as $100 billion in OpenAI as the research lab builds out 10 gigawatts of Nvidia systems over the coming years -- deployment will begin in the second half of next year, based on Nvidia's upcoming Vera Rubin system. Nvidia will progressively invest in OpenAI as this deployment unfolds. For reference, one gigawatt is the equivalent of 294 utility-scale wind turbines, according to the Department of Energy.

Then, earlier this month, AMD and OpenAI struck a deal. OpenAI will deploy six gigawatts of AMD chips over a period of years, and as in the Nvidia agreement, this will begin in the second half of next year. As part of the plan, AMD has issued OpenAI a warrant for as many as 160 million shares -- representing 10% of AMD. The shares will vest according to specific milestones achieved. So, it's possible that OpenAI, as part of this deal, will eventually hold 10% of AMD.

Finally, in recent days, OpenAI said it would co-develop systems that include Broadcom's chips and networking solutions. This is for 10 gigawatts of Broadcom's XPUs, and like the other aforementioned deals, rollout will begin in the second half of next year. The companies didn't release financial terms of the deal.

Now, let's return to our question: Which chip designer may be the biggest winner? The opportunity to power OpenAI's projects is great news for all of these players, but I think Nvidia has scored the best deal of the bunch, and here's why.
Nvidia, thanks to its pledge to invest in OpenAI, has ensured that deployment of its GPUs will happen -- as the company progressively invests in OpenAI, it gives the research lab the ability to pay for its infrastructure scale-up. So, this is a great investment for Nvidia, as it ensures its GPUs will be central to this next stage of AI growth. Also, Nvidia, with more than $56 billion in cash today, has the financial resources to commit to billion-dollar investments over time. All of this should continue to fuel revenue growth at Nvidia and push the stock price higher during this exciting phase of the AI story.
[4]
Morgan Stanley revamps Broadcom's price target with a twist
Broadcom (AVGO) just signed on to perhaps the biggest AI infrastructure push in recent memory. On October 13, OpenAI and Broadcom announced a mind-boggling new deal that has them co-developing and co-deploying a 10-gigawatt fleet of custom accelerator racks, starting in late 2026 and continuing through 2029. That's far from a standard chip order. The deal involves long-haul infrastructure, including custom chips, Ethernet fabrics, and power-hungry racks tailor-made for handling AI compute. Broadcom's role runs deep, and the stakes have only gotten bigger.

Key facts on the OpenAI-Broadcom deal:
* Rack rollouts are expected to be phased and milestone-based, rather than a single mega-shipment.
* Broadcom's accelerators will link up with its powerful Ethernet switches and optics.
* Delivery is linked to packaging capacity, fab supply, and site-level power.

Wall Street clearly took notice, with Broadcom stock jumping almost 10% on the announcement of the deal. Now, Morgan Stanley just revamped its target on AVGO stock, but its bullish note comes with a catch. The firm's model introduces a twist that can potentially temper investors' expectations for the upside to materialize.

Morgan Stanley raises Broadcom target but throws in a grain of salt

Morgan Stanley is bullish on Broadcom's OpenAI deal, but it isn't exactly buying the hype wholesale. In a new note, analysts bumped their price target on AVGO to $409 from $382 while maintaining an overweight rating, hailing the power-packed collaboration with OpenAI as "a clear positive" for long-term AI visibility. Still, Morgan Stanley is taking "the scale with a grain of salt." The firm acknowledges that the potential topline numbers may excite, but it's also imperative to avoid stacking expectations too high, since the plan is "clearly distinct" from the unnamed customer mentioned during Broadcom's earnings call. Accordingly, Morgan Stanley is aiming to keep its models clean and avoid double-counting demand that the Street initially incorporated, even as it raises its 2027 AI data-center sales outlook by roughly $8 billion to a substantial $73.2 billion.

The firm still sees a "constructive" path ahead. AVGO's upside story isn't linked to just one megadeal, and there's plenty of depth in its broader funnel, including custom accelerator ASICs, Ethernet switches, and optics, positioning it to benefit across multiple layers of infrastructure.

Quick takeaways:
* Morgan Stanley lifted Broadcom's price target to $409 but is taking the OpenAI deal projections "with a grain of salt."
* The new 10-gigawatt AI rack plan is separate from the "new customer" mentioned on Broadcom's earnings call.
* AVGO's AI upside is diversified, with growth expected from custom accelerator chips and Ethernet components, rather than relying on a single big-ticket deal.

Broadcom's AI buildout finally hit the bottom line

Broadcom stepped up its game this year in AI, earnings, and investor returns. While many of its peers chased AI headlines, Broadcom built the hardware stack to make it all work, delivering the chips and networking gear that's critical in running AI at scale. That foundation paid off immensely. AI sales jumped 77% year-over-year in Q1 to about $4.1 billion. In Q2, AI sales grew 46% to more than $4.4 billion. By Q3, Broadcom's total sales jumped to a whopping $15.95 billion, up 22% from last year, with its adjusted EBITDA coming in at $10.7 billion, or 67% of sales.
Management expects AI semiconductor sales to reach nearly $6.2 billion in Q4, a massive 66% jump, which marks 11 consecutive quarters of AI growth. Broadcom's product lineup also leveled up in a big way. It recently launched the Thor Ultra, an 800G Ethernet NIC (network interface card) that's custom-built for AI, and the Tomahawk 6 "Davisson," a lightning-fast 102.4-Tbps Ethernet switch, along with co-packaged optics cutting power and improving reliability in heavy AI data flows. Additionally, its software side (primarily VMware) transitioned to subscriptions, laying the foundation for long-term recurring sales, even as legal disputes persisted in Europe over the VMware merger. Meanwhile, Broadcom handsomely rewarded its shareholders with a $10 billion stock buyback, approved through the end of 2025, and announced a dividend per share of $0.59 in Q3. All that momentum helped take AVGO stock up by nearly 50% year-to-date by mid-October, and almost 90% in the past six months alone.
[5]
Why Broadcom Stock Surged This Week | The Motley Fool
Shares of Broadcom (AVGO 0.72%) were up this week following a deal between the semiconductor giant and OpenAI to deploy a large quantity of custom artificial intelligence (AI) accelerators over a multiyear time frame. As of early Thursday afternoon, Broadcom stock was up about 8.6% for the week, according to data provided by S&P Global Market Intelligence. On Monday, Broadcom and OpenAI announced a partnership that will see OpenAI design custom AI accelerators that will be developed and deployed in conjunction with Broadcom. Connectivity between server racks will be handled by Ethernet and other connectivity solutions from Broadcom. Over multiple years, the plan is to deploy AI accelerators that will consume 10 gigawatts of power in aggregate. This deal rivals other recent deals that OpenAI has made with Nvidia and AMD for GPUs. Broadcom didn't disclose the revenue impact from this deal, but one analyst estimated that it could generate up to $100 billion in additional revenue for Broadcom over the next four years. While the megadeal with OpenAI is a huge win for Broadcom, the company is taking a meaningful risk by partnering with the AI start-up. OpenAI has struck multiple megadeals in recent months for AI infrastructure, but as of today, it doesn't have the cash or the revenue to pay for them. OpenAI is going to need to raise an unprecedented amount of capital to fund its ambitions. If it fails to do so, or if the AI boom fizzles out, Broadcom could be left holding the bag.
[6]
Opinion: Say Goodbye to Nvidia's Biggest Competitive Edge in 2026 | The Motley Fool
Investors should monitor Nvidia's competitive landscape closely. Nvidia (NVDA -4.33%) continues to be a key enabler of the global artificial intelligence (AI) infrastructure buildout, with over 94% share of the discrete GPU market in the second quarter of 2025. Its Blackwell architecture chips, Compute Unified Device Architecture (CUDA) software stack, and AI-optimized networking solutions together form a formidable competitive moat. The company's commitment to innovation -- releasing new hardware architectures annually while maintaining backward compatibility -- has increased customer loyalty. Unsurprisingly, demand for the company's GPUs from hyperscalers and enterprise AI giants has consistently outpaced available supply. Nvidia's dominance, however, may face serious challenges in 2026. Increasing competitive and geopolitical pressures, along with a rising focus on cost-effectiveness, may affect the company's topline and bottom-line growth prospects in the coming year. Here's how I believe these problems could evolve in 2026. The biggest challenge for Nvidia is the rapid emergence of alternatives to its GPUs, both from competitors offering chips with superior price performance and large clients developing proprietary silicon for specialized AI workloads. While still far behind in the discrete GPU market share, competitor Advanced Micro Devices (AMD 0.80%) is gearing up for the launch of Instinct MI450 series GPUs in 2026. These GPUs are based on CDNA 5 architecture and are built using Taiwan Semiconductor Manufacturing's (also known as TSMC) advanced 2-nanometer process technology. The MI450 series is expected to emerge as direct competition not only to Nvidia's Hopper and Blackwell GPUs, but also to the upcoming Rubin architecture GPUs built on 3-nanometer process technology. AMD's recent strategic partnership with OpenAI underlines the confidence in this new AI chip. According to this multi-year and multi-generation partnership, OpenAI will deploy 6 gigawatts of AMD Instinct GPUs. The first 1 gigawatt deployment based on MI450 GPUs will commence in the second half of 2026. With the deal positioning AMD as a core compute supplier for OpenAI's next-generation frontier models, AMD CEO Lisa Su expects to generate tens of billions of dollars in annual AI data-center revenue starting in 2027. AMD estimates this collaboration, along with other large customer deployments, could eventually generate over $100 billion in revenue in the next few years. CEO Lisa Su also claimed that AMD's chiplet-based GPU architecture (a processor made of several small chips) offers substantial advantages in memory capacity and bandwidth, which can be crucial for inference workloads. As hyperscalers push for unified infrastructure that can handle both training and inference, AMD's upcoming MI450 GPUs are being designed to serve both workloads efficiently. AMD's increasing technological prominence in the GPU market poses a significant challenge to Nvidia's supremacy. Broadcom's custom application-specific integrated circuits (ASICs) and other accelerators are also being increasingly adopted at hyperscaler data centers. Major cloud players such as Meta Platforms, Microsoft, Amazon, and Alphabet have also developed custom silicon, which reduces their reliance on Nvidia. Alphabet's Tensor Processing Units (TPUs) and Amazon's Inferentia chips already deliver better performance at a lower cost in specific training and inferencing tasks. 
As more hyperscalers scale these in-house solutions and partner with other semiconductor players, it could adversely impact Nvidia's share of AI compute spending. AMD's competitive pricing may soon become a key differentiator, especially since the target addressable market for AI accelerators is now projected to surpass $500 billion by 2028. AMD claims that its MI355 accelerator (from the MI350 series accelerators) has demonstrated matching or better performance than Nvidia's Blackwell architecture-based GB200 chips for specific key training and inference workloads. The MI355 was also said to deliver performance matching that of the GB200 for certain other workloads at a lower cost and capacity. According to Dell'Oro Group, global data center capex is estimated to reach $1.2 trillion by 2029. Hyperscalers are expected to account for nearly half of this spend. Faced with escalating infrastructure and energy costs, cloud giants are exploring lower-cost accelerators to reduce the total cost of ownership while ensuring high performance. In this backdrop, AMD's competitively priced Instinct accelerators could prove to be an appealing alternative for hyperscalers. This may even pressure Nvidia to cut prices to protect its market share. Nvidia's excessive reliance on TSMC's foundries has exposed it to significant geopolitical and supply chain disruption risk, considering that Taiwan is just roughly 100 miles from mainland China. The escalating U.S.-China tensions have already negatively impacted the company's chip exports to China. In July 2025, China's internet regulator, the Cyberspace Administration of China, summoned Nvidia to explain the alleged security vulnerabilities in its H20 chips. As of October 2025, Chinese authorities have also intensified customs inspections of Nvidia's AI chip imports, part of an effort to reduce reliance on U.S. imports. According to Reuters, China's crackdown was initially focused on China-specific models like the H20 and RTX Pro 6000D. However, it has now been expanded to include all advanced semiconductor products that could fall under U.S. export restrictions. These events have negatively impacted the company's sales in the key Chinese market. The heightened geopolitical tensions have also spurred countries around the world to focus on localizing the semiconductor supply chain. Several incentives are being offered to semiconductor manufacturers under the U.S. CHIPS Act and similar programs in Europe and Japan. TSMC, Samsung, and Intel are building new foundries in the U.S., Europe, and Asia. While these foundries are not Nvidia's direct competitors, the expansion of manufacturing capacity will help competitors such as AMD, Intel, and Broadcom, as well as hyperscalers designing custom AI silicon, to scale production efficiently. This may erode Nvidia's supply advantage in the long run. Nvidia trades at a premium valuation of 28.5 times forward earnings. However, in the face of increasing adoption of open hardware ecosystems and alternative AI chips, the company may witness compression in its valuation multiples. Coupled with potential margin compression and slower topline growth, these factors may weigh on the company's share price in 2026. While none of these risks are certain to materialize, investors should remain vigilant about market share shifts and cost-sensitive deployments across the AI landscape. These are tangible risks, and Nvidia has to navigate them carefully to sustain its growth trajectory beyond 2026.
[7]
Broadcom's OpenAI Bet: Big Revenue Optionality, Bigger Balance-Sheet Questions | Investing.com UK
Investors love a clean AI story, and Broadcom (NASDAQ:AVGO) just handed them one. A multi-year pact to co-develop OpenAI-designed accelerators and wire up next-gen clusters promises double-digit gigawatts of compute and a fresh lane beside Nvidia's (NASDAQ:NVDA) dominance. The market reaction was swift, sending Broadcom shares nearly 10% higher in a single session. Beneath the pop sits a tougher question: does the funding math behind OpenAI's buildout support Broadcom's margin structure and valuation over a full cycle, or are investors underwriting a capital stack as much as a product roadmap? The timing of the announcement intersects with a broader re-rating of AI-linked equities, as investors rotate back into growth themes amid expectations of a 2025 Fed easing cycle. AI infrastructure spending has become one of the few secular narratives capable of offsetting tighter liquidity conditions and slowing global demand, amplifying market sensitivity to any new data-center commitment. The headline is scale. OpenAI and Broadcom outlined deployments targeting roughly 10 gigawatts of accelerator capacity through 2029, with Broadcom supplying the networking backbone that competes with Nvidia's InfiniBand. The agreement begins ramping in the second half of 2026, placing the revenue curve squarely into the out-years of this cycle. Importantly, this is not an isolated order -- it fits into OpenAI's broader procurement spree aiming for about 26 gigawatts across multiple chip partners, a footprint that dwarfs New York City's peak summer demand. Broadcom gains privileged exposure to that spend and a showcase for its Ethernet-centric fabrics in very large clusters. The strategic catch is design specificity. Custom silicon and tightly integrated systems are powerful moats when the customer scales; they're also less transferable if the customer falters. Broadcom's CEO has already acknowledged that very large AI systems lift earnings while diluting gross margins -- a reminder that scale and mix effects can cut in opposite directions. If competitive dynamics intensify around pricing or qualification windows, the earnings leverage investors expect could soften even as revenue prints look impressive. OpenAI's ambitions are not subtle. The company is layering deals with AMD (NASDAQ:AMD) and now Broadcom while keeping Nvidia in the tent, and it has floated infrastructure plans requiring unprecedented power and capital. Reports peg 2025 revenue around $13 billion, with profitability not expected before 2029 and cumulative cash burn potentially exceeding $100 billion through that date. That is not a problem if revenue compounds, capital remains cheap, and partners execute; it becomes a risk if rates stay elevated, energy buildouts slip, or the AI demand curve normalizes. For markets, the power variable matters as much as capex. Ten gigawatts for one partnership and mid-20s gigawatts across arrangements imply grid, permitting, and generation constraints that sit outside semiconductor control. Any delay in energy availability becomes a de facto push-out of Broadcom's revenue recognition profile and a drag on expected return on invested capital. That's not a thesis killer, but it adds real timing risk to the bullish glide path investors have internalized. Broadcom isn't bidding into a vacuum. Nvidia continues to define the frontier, AMD is tying its own multi-gigawatt roadmap to OpenAI with financial incentives, and hyperscale buyers are experimenting with alternative custom paths. 
Google's work with MediaTek on TPU roadmaps signals that major customers will cultivate multiple supply lines. The net effect is healthy for OpenAI's bargaining power, but less so for any supplier's sustained pricing umbrella. Broadcom's operational discipline is an asset here, yet the structure of this customer set argues for persistent margin negotiation rather than one-way operating leverage. Investors have been willing to pay a premium multiple for Broadcom, reflecting execution across chips and software and consolidated gross margins above 70% during the VMware integration phase. The OpenAI tie-up extends that narrative but doesn't eliminate macro discipline. If long yields grind higher or credit spreads widen, the cost of OpenAI's external capital rises just as Broadcom leans into a long-dated, custom buildout. In that scenario, even strong backlog could be discounted more heavily. Conversely, an easing cycle that re-flattens the curve, combined with steady AI demand, would support both OpenAI's financing and Broadcom's multiple. Today's premium -- at times above Nvidia's forward ratio -- suggests the execution bar remains elevated. The bull case is straightforward: Broadcom converts OpenAI's scale into multi-year revenue, deepens its networking moat in AI clusters, and taps an incremental custom-silicon vector that diversifies exposure beyond Nvidia's cycle. The bear case is equally clear: timelines slip with power and permitting, funding tightens, competitive pricing trims margins, and design specificity limits reuse if OpenAI retrenches. Positioning-wise, the setup favors holders who can tolerate timing variance and monitor three leading indicators. For everyone else, the stock now embeds a lot of things going right at once. That's not a reason to sell -- but it's a reason to size with respect for the denominator. Over the next 12 months, Broadcom's trajectory will likely mirror sentiment around AI capex durability and U.S. monetary conditions more than quarterly earnings cadence. For now, the trade is less about profit delivery and more about conviction in the infrastructure super-cycle itself.
[8]
Broadcom stock price target raised to $430 from $410 at Mizuho on OpenAI deal By Investing.com
Investing.com - Mizuho raised its price target on Broadcom Limited (NASDAQ:AVGO) to $430.00 from $410.00 on Monday, while maintaining an Outperform rating following OpenAI's announcement of a custom ASIC partnership with the chipmaker. The semiconductor giant, currently valued at $1.68 trillion, has shown impressive momentum with 26 analysts recently revising their earnings estimates upward, according to InvestingPro data. The partnership represents OpenAI's third gigawatt-scale deal in the past four weeks, with the AI company planning to deploy 10 gigawatts of its custom ASIC, code-named Titan. Mizuho estimates this could potentially be worth $150-200 billion for Broadcom over multiple years. Initial production ramps for the custom chips are expected to begin in the second half of 2026, according to Mizuho's research note. The firm has conservatively raised its Broadcom AI revenue estimates to $40.4 billion for fiscal 2026, $64.5 billion for fiscal 2027, and $78 billion for fiscal 2028. Mizuho also highlighted additional tailwinds from Broadcom's networking business with SUE/Tomahawk products, conservatively estimating $15-20 billion in incremental revenue per gigawatt from combined AI ASIC and networking sales. OpenAI has announced approximately 26 gigawatts worth of deals in the past four weeks with Broadcom, NVIDIA, and AMD. Mizuho continues to list Broadcom as its "TOP PICK" in the semiconductor sector. In other recent news, Broadcom Inc. has announced a multi-year partnership with OpenAI to supply custom silicon and networking chips. This collaboration, which involves the development of 10 gigawatts of AI accelerators, is expected to begin deployment in the second half of 2026 and continue through the end of 2029. CFRA has praised this partnership, noting Broadcom's "extraordinary" visibility and pipeline. Wolfe Research has reiterated its Peerperform rating for Broadcom, highlighting the company's provision of Ethernet-based solutions for scale-up and scale-out capabilities in these rack systems. Melius Research has also reiterated a Buy rating, emphasizing the significant revenue potential from the OpenAI deal. Aletheia Capital has initiated coverage with a Buy rating and a $400 price target, projecting that Broadcom's AI revenue could double year-over-year in fiscal years 2026 and 2027. These developments underscore Broadcom's strategic focus on AI technology and its potential impact on future revenue streams.
[9]
Why Broadcom Stock Skyrocketed Monday Morning | The Motley Fool
The catalyst that sent the networking specialist higher was word of a new partnership for its custom artificial intelligence (AI) chips. In a joint press release that dropped Monday morning, Broadcom and OpenAI announced a strategic partnership to deploy 10 gigawatts of Broadcom's application-specific integrated circuits (ASICs). These AI accelerators, called XPUs, can be customized to handle specific tasks, making them more energy efficient. In some cases, ASICs are being used as a viable alternative to graphics processing units (GPUs) to provide the computational horsepower needed to fuel AI development. The announcement notes that OpenAI "will design the accelerators and systems, which will be developed and deployed in partnership with Broadcom." The pair also plan to deploy rack systems that incorporate the custom AI chips. This is the latest in a flurry of deals between OpenAI and high-profile AI chipmakers. In late September, the company announced a 10-gigawatt deal with Nvidia (NVDA 2.63%), which included a $100 billion investment in OpenAI. Just weeks later, the AI specialist announced a 6-gigawatt deal with Advanced Micro Devices (AMD 0.60%), which included an agreement that gave OpenAI the right to purchase up to 160 million shares of AMD, representing a roughly 10% stake in the company. By inking deals with all the biggest chipmakers, OpenAI will have a steady stream of cutting-edge AI chips to power its large language models. In conjunction with its fiscal 2025 third-quarter financial report, Broadcom announced the addition of a fourth large hyperscale customer, which many analysts believed was OpenAI. This deal helps illustrate Broadcom's growing influence in the AI chip space. Melius Research analyst Ben Reitzes has gone on record saying he believes Broadcom will eventually capture about 30% of the AI chip market. Yet for all that opportunity, Broadcom stock is attractively priced, with a price/earnings-to-growth (PEG) ratio of 0.38, when any number less than 1 is the standard for an undervalued stock.
[10]
Broadcom stock rises as CFRA praises OpenAI partnership and AI pipeline By Investing.com
Investing.com - Broadcom Inc (NASDAQ:AVGO), a prominent player in the semiconductor industry with impressive gross profit margins of 77%, saw its shares gain Monday after CFRA highlighted the company's "extraordinary" visibility and pipeline following a multi-year partnership announcement with OpenAI. According to InvestingPro data, the company maintains strong financial health with 26 analysts recently revising earnings expectations upward. The newly announced partnership will involve Broadcom supplying custom silicon and networking chips to OpenAI, with deployment expected to begin in the second half of 2026 and continue through the end of 2029. With revenue growth of 28% over the last twelve months and an EBITDA of $32.75 billion, Broadcom appears well-positioned to execute this significant contract. According to CFRA's analysis, the two companies will collaborate to build 10 gigawatts of AI infrastructure, contributing to OpenAI's broader pursuit of over 30 gigawatts of new capacity in the coming years -- an initiative that could represent more than $1 trillion in spending. CFRA noted that Broadcom's previously announced $10 billion new customer was not OpenAI as many had speculated, with Broadcom's President of Semiconductor Solutions suggesting this distinction during a CNBC appearance. The research firm maintained its positive outlook on Broadcom, stating the company is "well-positioned to benefit from the massive AI infrastructure build from existing/new clients, with upside from additional prospects to place large-scale orders in the coming years." In other recent news, Broadcom Limited has announced a significant multi-year collaboration with OpenAI to develop and deploy 10 gigawatts of custom AI accelerators. This partnership is expected to begin deployment in the second half of 2026 and continue through the end of 2029. Wolfe Research reiterated its Peerperform rating for Broadcom, acknowledging the company's substantial deal with OpenAI. Melius Research maintained its Buy rating and set a price target of $415, emphasizing the potential revenue boost from the OpenAI agreement. Aletheia Capital initiated coverage on Broadcom with a Buy rating and a $400 price target, predicting that the company's AI revenue will double year-over-year in 2026 and 2027. Bernstein SocGen Group also reiterated an Outperform rating and a $400 price target, citing strong demand in the compute sector and confidence in Broadcom's growth trajectory. These developments reflect growing interest and investment in AI technologies, underscored by Broadcom's strategic partnerships and analyst support.
Broadcom and OpenAI have announced a multi-year partnership to deploy 10 gigawatts of custom artificial intelligence (AI) silicon and rack-level systems, marking a significant milestone in the AI infrastructure race [1][2].
This collaboration extends beyond compute chips to networking silicon and server racks, positioning Broadcom to optimize performance and scale next-generation technologies.
The partnership involves the integration of compute Application-Specific Integrated Circuits (ASICs) along with Broadcom's Ethernet, Peripheral Component Interconnect Express (PCIe), and optical connectivity solutions [2]. Deployment is set to commence in the second half of 2026 and is expected to be completed by 2029 [2]. The announcement led to a surge in Broadcom's stock price, with shares rising as much as 10.7% following the news [1].

Analysts estimate that each gigawatt of capacity could generate $25-$30 billion in revenue for Broadcom, implying a total potential revenue of $250-$300 billion for the 10 GW infrastructure [2]. Initial annual revenue is projected to range from $70 billion to $90 billion over the four years of deployment [2].
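For readers who want to sanity-check the headline figures, here is a minimal back-of-the-envelope sketch of the arithmetic behind the analyst estimates above. The per-gigawatt revenue range and the four-year deployment window come from the JP Morgan note cited in [2]; the flat yearly split at the end is purely an illustrative assumption, not part of that note.

```python
# Back-of-the-envelope check of the analyst revenue math cited above.
# Inputs are the figures quoted in source [2]; nothing here is an official
# company or analyst model.

GIGAWATTS = 10                                   # size of the OpenAI deployment
REV_PER_GW_LOW, REV_PER_GW_HIGH = 25e9, 30e9     # $25B-$30B of revenue per GW (per [2])
DEPLOYMENT_YEARS = 4                             # second half of 2026 through 2029

total_low = GIGAWATTS * REV_PER_GW_LOW
total_high = GIGAWATTS * REV_PER_GW_HIGH
print(f"Implied total revenue: ${total_low / 1e9:.0f}B-${total_high / 1e9:.0f}B")
# -> $250B-$300B, matching the range quoted above

# Illustrative only: a flat split across the deployment window. The $70B-$90B
# annual projection cited in [2] follows the analyst's own ramp assumptions,
# which are not reproduced here.
print(f"Flat annual average: ${total_low / 1e9 / DEPLOYMENT_YEARS:.1f}B-"
      f"${total_high / 1e9 / DEPLOYMENT_YEARS:.1f}B per year")
```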
OpenAI's deal with Broadcom is part of a larger trend of massive AI infrastructure investments. The company has announced roughly 36GW of data center capacity since the beginning of the year, with 26GW worth of deals coming in the past three weeks alone [1]. This includes partnerships with Nvidia for 10GW and Advanced Micro Devices for 6GW of data center capacity [3].
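As a rough illustration of the scale, the sketch below tallies the recent deals and reworks the Hoover Dam comparison used in source [1]. All inputs are figures quoted in the articles above; the rounding is ours.

```python
# Rough tally of OpenAI's announced capacity, using figures quoted in [1] and [3].

deals_gw = {"Nvidia": 10, "AMD": 6, "Broadcom": 10}   # deals from the past three weeks
recent_gw = sum(deals_gw.values())                    # 26 GW
total_gw_ytd = 36                                     # announced year to date, per [1]

HOOVER_DAM_GW = 2        # approximate generating capacity, per [1]
HOMES_PER_DAM = 1.8e6    # homes powered for a year by one dam, per [1]

print(f"Recent deals: {recent_gw} GW; announced year to date: {total_gw_ytd} GW")
print(f"Hoover Dam equivalents: {total_gw_ytd / HOOVER_DAM_GW:.0f}")
print(f"Homes powered for a year: ~{total_gw_ytd / HOOVER_DAM_GW * HOMES_PER_DAM / 1e6:.0f} million")
# -> 18 dams and roughly 32 million homes, consistent with the
#    "more than 31 million homes" figure in [1]
```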
The collaboration aims to embed OpenAI's learnings from developing frontier models and products directly into the hardware, potentially unlocking new levels of capability and intelligence [1]. Broadcom's expertise in networking and custom chip design is expected to play a crucial role in optimizing the AI infrastructure stack [4].
This deal strengthens Broadcom's position in the AI chip market, competing with rivals like Nvidia and AMD [3][5]. The company's diversified business model, strong margin profile, and exposure to growth trends in AI, data centers, and infrastructure make it a top pick among analysts in the semiconductor sector [2].

The partnership between Broadcom and OpenAI signifies the growing adoption of custom silicon in the accelerator market and underscores the increasing demand for AI compute power [2]. As AI infrastructure spending is predicted to reach as much as $4 trillion by 2030, this deal positions Broadcom as a key player in the ongoing AI revolution [3].