14 Sources
[1]
Nvidia invests $2 billion in Marvell to deepen NVLink Fusion partnership -- signs deal with one of its biggest competitors
The deal pulls one of the biggest custom ASIC designers into Nvidia's proprietary interconnect ecosystem.

Nvidia announced today that it has invested $2 billion in Marvell Technology and entered a partnership connecting Marvell to Nvidia's AI factory and AI-RAN ecosystem through NVLink Fusion, the technology that allows third-party silicon to plug directly into Nvidia's proprietary interconnect fabric.

Marvell is one of the two dominant custom ASIC design houses alongside Broadcom. Its clients include AWS, for which it has helped develop the Trainium series of AI accelerators, as well as Microsoft and Google. These custom chips exist, in large part, to give hyperscalers an alternative to buying Nvidia GPUs, making Nvidia's investment in the company somewhat noteworthy.

Per the deal, Marvell will provide custom XPUs and NVLink Fusion-compatible scale-up networking, while Nvidia will supply Vera CPUs, ConnectX NICs, BlueField DPUs, NVLink interconnect, and Spectrum-X switches. The two companies will also collaborate on silicon photonics and AI-RAN infrastructure for 5G and 6G networks.

"The inference inflection has arrived. Token generation demand is surging, and the world is racing to build AI factories," said Jensen Huang, founder and CEO of Nvidia. "Together with Marvell, we are enabling customers to leverage Nvidia's AI infrastructure ecosystem and scale to build specialized AI compute."

NVLink Fusion, first announced in May last year, enables heterogeneous AI infrastructure in which non-Nvidia accelerators can communicate with Nvidia GPUs, CPUs, and networking hardware over NVLink's high-bandwidth, low-latency fabric. Platforms built through the program must include at least one Nvidia product, whether a CPU, GPU, or switch. Marvell's contribution, meanwhile, focuses on custom XPUs and high-speed optical interconnects.
The company reported $8.2 billion in revenue for its fiscal year 2026 (ended January 2026), with data center revenue accounting for more than 74% of the total. Marvell's Celestial AI acquisition late last year added photonic fabric technology to its portfolio, and this deal now places that capability inside Nvidia's ecosystem. "By connecting Marvell's leadership in high-performance analog, optical DSP, silicon photonics and custom silicon to Nvidia's expanding AI ecosystem through NVLink Fusion, we are enabling customers to build scalable, efficient AI infrastructure," said Matt Murphy, chairman and CEO of Marvell. By pulling Marvell into the NVLink Fusion ecosystem, Nvidia ensures that custom XPUs designed by Marvell remain compatible with, and dependent on, Nvidia's broader infrastructure. Every NVLink Fusion platform requires at least one Nvidia component, so Marvell-designed ASICs that use the fabric still generate Nvidia revenue. Marvell joins an NVLink Fusion ecosystem that has grown steadily since its launch. Samsung Foundry joined in October to offer design-to-manufacturing support for NVLink-compatible custom chips. Arm entered the program in November, enabling its licensees to build CPUs with native NVLink connectivity. Nvidia rivals AMD, Intel, and Broadcom remain absent and are instead backing the open UALink standard as a competing rack-scale interconnect.
[2]
Nvidia invests $2bn in chipmaker Marvell to boost AI networking
Nvidia is investing $2bn in Silicon Valley chipmaker Marvell to boost the networking technology that connects its AI chips into ever-larger data centres. The two semiconductor groups will work together on silicon photonics as part of an effort to upgrade data centre networking systems with optical technology to speed up data flows, they said on Tuesday. Nvidia's deal with Marvell could also make it easier for Big Tech companies' own custom AI chips to be put into Nvidia's data centre systems. Marvell helps US hyperscalers including Amazon to design their own specialised AI accelerator chips, providing an alternative to Nvidia's general-purpose graphics processing units. The two companies said their tie-up would enable "seamless integration" between these kinds of custom AI chips and Nvidia's GPUs, networking and storage systems. Nvidia is trying to leverage its leading position in AI processors to become a broader platform for AI data centres. That would help entrench it in the facilities that power Big Tech's multibillion-dollar race to dominate AI. Earlier this month Nvidia launched a new dedicated chip for AI inference -- when an AI model responds to users' queries and generates output such as text and computer code -- marking the first time it had broadened its core AI chip products beyond GPUs. Shares in Marvell jumped about 8 per cent in late morning trade in New York on Tuesday and have gained more than 50 per cent over the past 12 months, leaving it worth close to $83bn. Chief executive Matt Murphy said the alliance would help customers with scalability and efficiency: "Our expanded partnership with Nvidia reflects the growing importance of high-speed connectivity, optical interconnect and accelerated infrastructure in scaling AI." Both Nvidia and Marvell have been making acquisitions to consolidate their positions in the AI data centre market.
Last month, Marvell completed its $3.3bn acquisition of Celestial AI, which developed photonics technology that can join hundreds of thousands of AI accelerator chips together. Bringing AI chips together in vast clusters has become vital to building and deploying cutting-edge AI systems such as those that power Google's Gemini, Anthropic's Claude Code and OpenAI's ChatGPT. Nvidia has been gradually expanding its capabilities beyond GPUs since its $6.9bn acquisition of networking provider Mellanox in 2019. In December, it struck a roughly $20bn deal with AI chip start-up Groq, hiring its top talent and licensing its technology, which helped launch its new inference chip in March.
[3]
Nvidia invests $2 billion in Marvell, launches AI partnership
March 31 (Reuters) - Nvidia (NVDA.O) has invested $2 billion in Marvell Technology (MRVL.O), and Marvell will join the Nvidia AI ecosystem, the companies said on Tuesday. Shares of Marvell rose nearly 12%, while Nvidia shares were up 1.5% in premarket trading. The tie-up makes it easier for Marvell to design custom artificial intelligence chips that work with Nvidia's networking gear and processors, the companies said. Reporting by Kritika Lamba in Bengaluru; Editing by Tasim Zahid
[4]
Nvidia's $2 billion Marvell bet is not an investment. It is a toll booth.
Nvidia has invested $2 billion in Marvell Technology and folded the chipmaker into its NVLink Fusion ecosystem, creating a partnership that covers custom AI accelerators, silicon photonics, and 5G/6G infrastructure. The deal ensures that every custom chip Marvell designs for hyperscalers like Amazon, Google, and Microsoft still generates Nvidia revenue through mandatory platform components, turning what looked like a competitive threat into an ecosystem tax. Nvidia announced on Monday that it has invested $2 billion in Marvell Technology and entered a strategic partnership centred on NVLink Fusion, the rack-scale platform that allows third-party silicon to plug directly into Nvidia's proprietary interconnect fabric. Marvell's stock surged nearly 13 per cent on the news. Nvidia's rose 5.6 per cent. The market read it as a deal. The more accurate reading is that it is infrastructure policy, written in silicon. The partnership has Marvell supplying custom XPUs and NVLink Fusion-compatible scale-up networking, while Nvidia provides everything else: Vera CPUs, ConnectX network interface cards, BlueField data processing units, NVLink interconnect, and Spectrum-X switches. The two companies will also collaborate on silicon photonics, the technology that uses light instead of copper to move data between chips at the speeds that next-generation AI clusters demand. Jensen Huang framed it in characteristically expansive terms. "The inference inflection has arrived," the Nvidia chief executive said. "Token generation demand is surging, and the world is racing to build AI factories." The strategic subtlety sits in the architecture of NVLink Fusion itself. Every NVLink Fusion platform must include at least one Nvidia product, whether a CPU, GPU, or switch. Nvidia also controls which partners receive NVLink IP licences. 
This means that the custom AI accelerators Marvell designs for hyperscalers, the very chips these customers commission specifically to reduce their dependence on Nvidia GPUs, will still generate Nvidia revenue on every rack deployed. It is, as Tom's Hardware put it, a tax on custom ASICs. The deal deepens a pattern that has become unmistakable. Nvidia has made a series of $2 billion investments in recent months, including stakes in CoreWeave, Nebius, Synopsys, Coherent, and Lumentum. Each targets a different layer of the AI infrastructure stack that is being built at unprecedented speed: cloud providers, chip design tools, optical networking components, and now custom silicon. The common thread is that each investment makes the recipient more dependent on Nvidia's platform while Nvidia gains both financial exposure to and architectural influence over potential competitors. Marvell is a particularly interesting target because its fastest-growing business is designing the custom AI accelerators that hyperscalers use to displace Nvidia GPUs. The company's custom AI XPU business generated $1.5 billion in fiscal 2026 revenue and is expected to double by fiscal 2028. Marvell currently has 18 active custom silicon projects, including 12 devices for Amazon, Google, Microsoft, and Meta, and six for emerging AI customers. Amazon's Trainium chips, Microsoft's Maia accelerators, and Google's TPUs all flow through Marvell's design capabilities. By investing $2 billion and pulling Marvell into NVLink Fusion, Nvidia has effectively ensured that the company building its competitors' weapons is also paying Nvidia for the ammunition. NVLink Fusion's partner roster has expanded rapidly since its debut at Computex. Samsung Foundry joined in October to offer manufacturing support on its 3nm and 2nm nodes. Arm entered in November, enabling its licensees to build CPUs with native NVLink connectivity. SiFive joined in January, bringing RISC-V into the ecosystem. 
Fujitsu, Qualcomm, MediaTek, Alchip, Astera Labs, Synopsys, and Cadence were among the original partners. The breadth of the list is the point: NVLink Fusion is becoming the default interconnect standard for custom AI silicon, not because it is open, but because Nvidia's software ecosystem, particularly CUDA, makes it the path of least resistance for customers who need their hardware to work immediately. The open alternative, the Ultra Accelerator Link consortium backed by AMD, Intel, Broadcom, Cisco, Google, HPE, Meta, and Microsoft, is designed to break exactly this kind of lock-in. But UALink faces what analysts describe as a crisis of the commons: its members have competing priorities, its 128G specification launch trails the pace of accelerator deployment, and several of its key members now have Nvidia money on their balance sheets. Nvidia's financial stakes in companies nominally committed to an open standard raise legitimate questions about whether that standard can develop at the speed needed to offer a genuine alternative. For Marvell's chief executive Matt Murphy, the deal addresses a practical constraint. "By connecting Marvell's leadership in high-performance analog, optical DSP, silicon photonics, and custom silicon to Nvidia's expanding AI ecosystem through NVLink Fusion," Murphy said, "we are enabling customers to build scalable, efficient AI infrastructure." The translation: Marvell's hyperscaler customers want custom chips that work seamlessly with the Nvidia infrastructure already deployed in their data centres, and NVLink Fusion is how that happens. The silicon photonics component may prove the most consequential element of the partnership in the medium term. As AI clusters scale to hundreds of thousands of GPUs, the copper interconnects that have served the industry for decades are approaching fundamental bandwidth and energy limits. 
Optical interconnects can move data faster and more efficiently, but the technology remains expensive and difficult to manufacture at scale. Nvidia and Marvell collaborating on silicon photonics positions both companies at the centre of what could become the next critical bottleneck in AI infrastructure, after chips and after power. The 5G and 6G dimensions of the partnership, encompassing what Nvidia calls AI-RAN infrastructure, signal an ambition that extends beyond the data centre entirely. If wireless networks increasingly rely on AI for signal processing and resource allocation, the base station becomes another compute node in the Nvidia ecosystem, running on Nvidia platforms with Marvell connectivity. It is the kind of horizontal expansion that turns a chip company into an infrastructure company. Nvidia still commands roughly 90 per cent of the data centre GPU and AI accelerator market. The semiconductor industry generated $791.7 billion in sales in 2025 and is forecast to grow another 26 per cent in 2026. Against that backdrop, the commercial AI market is accelerating faster than anyone projected, and the companies racing to build it need hardware that works now, not hardware that might work when an open standard catches up. That urgency is Nvidia's greatest asset and NVLink Fusion's most effective sales pitch. The $2 billion is a rounding error on Nvidia's balance sheet. What it buys is something no amount of R&D spending can replicate: the architectural certainty that even the chips designed to replace Nvidia will be built inside an Nvidia-controlled ecosystem. It is not a partnership in any conventional sense. It is a toll booth on the only road that leads to the fastest-growing market in technology.
[5]
Nvidia pulls Marvell into proprietary interconnect ecosystem
* Nvidia invests $2 billion to bring Marvell into the NVLink Fusion ecosystem
* NVLink Fusion enables third-party accelerators to communicate with Nvidia GPUs efficiently
* Marvell provides custom XPUs and scale-up networking for heterogeneous AI infrastructure

Nvidia has invested $2 billion in Marvell Technology and entered a strategic partnership that connects the custom chip designer to Nvidia's AI factory ecosystem through NVLink Fusion. NVLink Fusion enables third-party accelerators to communicate with Nvidia components over a high-bandwidth, low-latency interconnect, while maintaining compatibility with Nvidia's rack-scale AI platforms. The move integrates Marvell's capabilities in high-performance analog, optical DSP, silicon photonics, and custom XPUs with Nvidia's GPU, CPU, and networking infrastructure.

Nvidia expands its AI ecosystem

"The inference inflection has arrived. Token generation demand is surging, and the world is racing to build AI factories," said Jensen Huang, founder and CEO of Nvidia. "Together with Marvell, we are enabling customers to leverage Nvidia's AI infrastructure ecosystem and scale to build specialized AI compute." NVLink Fusion was first launched in May 2025 as a platform for heterogeneous AI infrastructure. It allows non-Nvidia accelerators to communicate with Nvidia GPUs over a high-bandwidth fabric. Marvell will provide custom XPUs and NVLink Fusion-compatible scale-up networking for the partnership. Nvidia will supply Vera CPUs, ConnectX NICs, BlueField DPUs, and NVLink interconnect components. Every NVLink Fusion platform must include at least one Nvidia product, which means Marvell-designed ASICs still generate revenue for Nvidia despite using custom silicon.
"By connecting Marvell's leadership in high-performance analog, optical DSP, silicon photonics and custom silicon to Nvidia's expanding AI ecosystem through NVLink Fusion, we are enabling customers to build scalable, efficient AI infrastructure," said Matt Murphy, chairman and CEO of Marvell. Marvell reported $8.2 billion in revenue for its fiscal year 2026, which ended January 2026, with data center revenue accounting for more than 74% of the total. The company's acquisition of Celestial AI late last year added photonic fabric technology to its portfolio, and this deal now places that capability inside Nvidia's ecosystem. The two companies will also collaborate on silicon photonics technology and on transforming the world's telecommunication network into AI infrastructure using Nvidia's Aerial AI-RAN for 5G and 6G networks. Marvell is not the only company joining Nvidia's proprietary ecosystem: Samsung Foundry joined the NVLink Fusion program in October last year, and Arm followed in November, enabling its licensees to build NVLink-compatible CPUs. Not every major chipmaker has signed on, however. Nvidia rivals AMD, Intel, and Broadcom remain notably absent, instead backing the open UALink standard as a competing rack-scale interconnect. Their absence matters because Marvell already helps Amazon develop its Trainium series of AI accelerators, a relationship that predates the Nvidia partnership and creates potential tension between the two deals. The announcement does not say whether the deal will affect that Trainium work.
However, Nvidia's $2 billion investment successfully pulls a key custom silicon designer into a proprietary ecosystem where it controls the interconnect standard.
[6]
Nvidia invests $2 billion in Marvell as stock jumps 9%
Nvidia $NVDA has announced a $2 billion investment in semiconductor company Marvell Technology, adding another chip industry partner to its expanding AI ecosystem. Marvell's stock rose more than 11% following the announcement, according to CNBC. Through the partnership, Marvell will be integrated into Nvidia's AI ecosystem, lowering the barrier for customers who want to develop products on top of that infrastructure. Collaboration will extend to two additional technical areas: silicon photonics and telecom networking infrastructure. "The inference inflection has arrived. Token generation demand is surging, and the world is racing to build AI factories," Nvidia CEO Jensen Huang said in a statement. "Together with Marvell, we are enabling customers to leverage NVIDIA's AI infrastructure ecosystem and scale to build specialized AI compute." This is not an isolated move. Nvidia has been deploying $2 billion checks across the technology sector with notable regularity, with prior recipients including Synopsys $SNPS, CoreWeave, Coherent, Lumentum, and, most recently, Nebius Group. The Marvell deal is the latest example of what some analysts and investors have flagged as an increasingly circular AI economy, in which a small group of chip companies, cloud providers, and AI labs finance one another's buildouts and pre-sell infrastructure capacity among themselves. Goldman Sachs $GS has cited "the increasing circularity of the AI ecosystem" as a concern, and Morgan Stanley $MS has warned that such arrangements can inflate demand and valuations without generating new economic value. Nvidia has pushed back on that characterization, telling analysts that its cross-investments are small relative to its overall revenue and that the companies it backs earn most of their money from outside customers. In its most recent earnings, Nvidia said demand for its Blackwell chips was effectively sold out. 
Marvell describes itself as a provider of data infrastructure semiconductors serving enterprise, cloud, and carrier customers.
[7]
Nvidia invests $2B in Marvell as part of new interconnect partnership - SiliconANGLE
Nvidia Corp. today disclosed that it has invested $2 billion in Marvell Technology Inc., a publicly traded semiconductor designer. The cash infusion is part of a new partnership that will span several parts of the chip market. Santa Clara, California-based Marvell is a major supplier of data center chips. It makes processors optimized to power fiber optic networks and storage hardware. Additionally, Marvell has a business unit that helps other companies develop custom ASICs, or application-specific integrated circuits. That unit is the first focus of its partnership with Nvidia. Data center operators link Nvidia graphics processing units to the other chips in their server racks using an interconnect called NVLink. Until recently, the interconnect only worked with the GPU giant's own chips. Last year, Nvidia introduced a technology called NVLink Fusion that enables custom ASICs such as those made by Marvell to use NVLink. Marvell will "provide custom XPUs and NVLink Fusion-compatible scale-up networking" to joint customers as part of the partnership. The main component of NVLink Fusion is a chiplet that companies can integrate into their custom processor designs. The chiplet enables the host processor to connect to other chips via NVLink connections. NVLink is designed to link together chips installed in the same server. Nvidia also offers a second interconnect, NVLink Switch, that can move data across the servers in a rack. NVLink Fusion supports both interconnects. Nvidia's partnership with Marvell also extends to other areas. The companies will collaborate on Nvidia Aerial, a collection of hardware modules and software tools for powering 5G networks. The product suite lends itself to tasks such as creating a digital twin of a carrier network to find optimization opportunities.
In parallel, Nvidia will work with Marvell to "advance world-class networking for AI, including advanced optical interconnect solutions and silicon photonics technology." Last year, Marvell acquired an optical interconnect startup called Celestial AI Inc. in a deal worth up to $5.5 billion. The acquisition bought it an optical interposer that can be used to link together a processor's chiplets. An interposer is a silicon rectangle that functions as a processor's base layer. It contains tiny wires that transport data between chiplets. Marvell also competes in other parts of the optical networking market. It sells pluggable coherent transceivers for metro networks, long-distance fiber optic networks that link together remote data centers. Additionally, Marvell makes chips that can be used to move packets between servers installed in the same data center. "Token generation demand is surging, and the world is racing to build AI factories," said Nvidia Chief Executive Officer Jensen Huang. "Together with Marvell, we are enabling customers to leverage NVIDIA's AI infrastructure ecosystem and scale to build specialized AI compute."
[8]
US Stocks: Nvidia bets $2 billion on Marvell as rising AI adoption fuels competition; shares rise
Nvidia has invested $2 billion in Marvell Technology as part of efforts to make it easier for customers to use the custom artificial intelligence chips that the smaller company designs with Nvidia's networking gear and central processors. Shares of Marvell rose about 7% on Tuesday, while Nvidia shares were up 2.7%. Through the deal, Nvidia aims to ensure it remains central to meeting the growing computing needs required by AI tools at a time when some companies are opting for custom processors instead of its pricey processors. "Nvidia gains access to Marvell's semi-custom silicon and advanced optical interconnect capabilities to help scale data center-level AI systems where bandwidth and power efficiency are key bottlenecks," said Jacob Bourne, analyst at EMarketer. "It also broadens Nvidia's ecosystem to include more specialized silicon, which helps Nvidia remain a key access point for increasingly diverse AI workloads." "Investors will likely see this deal as reducing friction as it allows AI chips from other suppliers to operate within Nvidia-dominated data centers. So Nvidia can maintain its dominant position while also expanding the scope and utility of the AI semiconductor sector," Bourne added. The companies will work on advanced networking solutions for AI, focusing on optical interconnects and silicon photonics technology, which enables high-speed, energy-efficient data transmission. Marvell will contribute custom chips and networking solutions compatible with Nvidia's NVLink Fusion, while the AI chip bellwether will supply supporting technologies including central processing units, network interface cards and interconnects.
Big Tech firms including Alphabet and Meta are expected to spend at least $630 billion to build AI infrastructure this year, lifting demand for chips used in servers and networking equipment, and benefiting companies such as Marvell. Marvell has said it expects revenue to grow nearly 40% and approach $15 billion in fiscal 2028.
[9]
Marvell Stock Spikes On $2B Nvidia Deal To Build 'AI Factories' - Marvell Technology (NASDAQ:MRVL)
Strategic AI Infrastructure Partnership

The companies announced a strategic partnership to integrate Marvell into Nvidia's AI factory and AI-RAN ecosystem using NVLink Fusion, giving customers more flexibility to build next-generation AI infrastructure. Nvidia also invested $2 billion in Marvell to deepen the collaboration. Under the partnership, Marvell will supply custom XPUs and NVLink Fusion-compatible networking, while Nvidia will provide core technologies, including its CPU, networking, interconnect, and AI compute systems. Together, they will enable customers to build a scalable, heterogeneous AI infrastructure that integrates seamlessly with Nvidia's broader ecosystem.

Expanding Into Networking and Telecom AI

The companies will also collaborate on silicon photonics and advanced optical interconnects, while working to transform telecom networks into AI-driven infrastructure through Nvidia's Aerial AI-RAN platform for 5G and 6G.

Technical Analysis

Marvell is trading 6.9% above its 20-day simple moving average (SMA) and 11.7% above its 100-day SMA, maintaining an intermediate-term uptrend despite recent consolidation. Shares are up 42.62% over the past 12 months and are positioned closer to their 52-week highs than lows. RSI is at 49.61, which sits in neutral territory and suggests the stock isn't stretched after the latest run. Meanwhile, MACD is bullish with the MACD line at 2.9245 above the signal line at 2.8254, and a positive histogram of 0.0991 points to improving upside momentum. The combination of neutral RSI (around 50) and bullish MACD suggests mixed momentum.

Key Resistance: $103.00
Key Support: $85.00

Earnings & Analyst Outlook

Looking further out, the next major catalyst for the stock arrives with the May 28, 2026 (estimated) earnings report.
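For readers unfamiliar with the indicators cited in the technical analysis, a minimal Python sketch of the standard textbook formulas follows: the n-day simple moving average, a simplified RSI (without Wilder smoothing, so it will not exactly match charting services), and MACD(12, 26, 9). The price series in the test is made up for illustration.

```python
def sma(prices, n):
    """Simple moving average of the last n prices."""
    return sum(prices[-n:]) / n

def ema_series(prices, n):
    """Exponential moving average series, smoothing factor 2 / (n + 1)."""
    k = 2 / (n + 1)
    out = [prices[0]]  # seed with the first price
    for p in prices[1:]:
        out.append(p * k + out[-1] * (1 - k))
    return out

def rsi(prices, n=14):
    """Simplified RSI over the last n price changes (no Wilder smoothing)."""
    deltas = [b - a for a, b in zip(prices, prices[1:])][-n:]
    avg_gain = sum(d for d in deltas if d > 0) / n
    avg_loss = sum(-d for d in deltas if d < 0) / n
    if avg_loss == 0:
        return 100.0
    return 100 - 100 / (1 + avg_gain / avg_loss)

def macd(prices, fast=12, slow=26, signal=9):
    """Latest MACD line, signal line, and histogram values."""
    macd_line = [f - s for f, s in zip(ema_series(prices, fast),
                                       ema_series(prices, slow))]
    signal_line = ema_series(macd_line, signal)
    return macd_line[-1], signal_line[-1], macd_line[-1] - signal_line[-1]
```

On a monotonically rising price series this simplified RSI returns 100 and the MACD histogram is positive; real charting packages seed and smooth the averages slightly differently, so the article's exact readings (such as RSI 49.61) would not reproduce from the same closing prices.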
EPS Estimate: N/A
Revenue Estimate: N/A
Valuation: P/E of 28.6x (indicates premium valuation relative to peers)

Analyst Consensus & Recent Actions

The stock carries a Buy rating with an average price target of $121.64. Recent analyst moves include:
JP Morgan: Overweight, raises target to $135.00 (Mar. 6)
Craig-Hallum: Buy, raises target to $164.00 (Mar. 6)
B. Riley Securities: Buy, raises target to $135.00 (Mar. 6)

Top ETF Exposure

Because MRVL carries significant weight in these funds, any significant inflows or outflows will likely trigger automatic buying or selling of the stock.

Price Action

Marvell Technology shares were up 9.55% at $96.20 during premarket trading on Tuesday, according to Benzinga Pro data.
[10]
Nvidia doesn't stop: Invests $2 billion in AI hardware giant
According to an official report, NVIDIA and Marvell have signed a strategic partnership connecting Marvell to NVIDIA's AI system, focusing on advanced AI infrastructure and intelligent networks. The partnership is intended to give customers greater flexibility in developing next-generation computing infrastructure, using advanced connectivity and optical technologies. As part of the agreement, NVIDIA invested $2 billion in Marvell. The company will provide custom chips and communication networks, while NVIDIA will provide supporting technologies, including compute, storage, and networking components. The platform enables seamless integration of all components, allowing AI infrastructure to operate more efficiently and as a unified system. The two companies are also collaborating to transform global communication networks into AI infrastructure and advance next-generation networking solutions, including silicon photonics and optical interconnects. Jensen Huang, CEO and founder of NVIDIA, said: "Demand for token generation is rising, and the world is rushing to build AI factories. Together with Marvell, we enable customers to leverage NVIDIA's AI ecosystem and build advanced, customized AI computing." Matt Murphy, Chairman and CEO of Marvell, said: "The expanded partnership with NVIDIA highlights the importance of high-speed connectivity, optical interconnects, and large-scale accelerated AI infrastructure. The combination of Marvell's expertise in custom silicon with NVIDIA's AI ecosystem will enable customers to build efficient and scalable AI infrastructures." The report notes that the partnership will focus on providing solutions for enterprise customers and building large-scale customized AI computing infrastructure, while maintaining full compatibility with NVIDIA systems and expanding opportunities for innovation and development.
[11]
NVIDIA Invests $2B in Marvell Technology, Shares Jump 12% on AI Data Center Push
The main goal of this deal is to improve data center infrastructure, as AI systems require high-speed and smooth data flows. This partnership will help achieve that objective. Marvell will contribute through its networking tools and custom chips. NVIDIA will bring its powerful GPUs and platforms to the table. Together, they plan to create better and faster AI systems. A key focus area of this partnership is silicon photonics. This technology uses light to send data. It is faster than electrical methods and saves energy. This makes AI data centers more efficient. The partnership also includes NVIDIA's NVLink Fusion platform to help companies build custom AI systems.
[12]
Nvidia bets US$2 billion on Marvell as rising AI adoption fuels competition
Nvidia has invested US$2 billion in Marvell Technology as part of efforts to make it easier for customers to use the custom artificial intelligence chips that the smaller company designs with Nvidia's networking gear and central processors. Shares of Marvell rose more than nine per cent in premarket trading on Tuesday, while Nvidia shares were up 1.5 per cent. Through the deal, Nvidia aims to ensure it remains central to meeting the growing computing needs required by AI tools at a time when some companies are opting for custom processors instead of its pricey processors. "Together with Marvell, we are enabling customers to leverage Nvidia's AI infrastructure ecosystem and scale to build specialized AI compute," said Nvidia CEO Jensen Huang. The companies will work on advanced networking solutions for AI, focusing on optical interconnects and silicon photonics technology, which enables high-speed, energy-efficient data transmission. Marvell will contribute custom chips and networking solutions compatible with Nvidia's NVLink Fusion, while the AI chip bellwether will supply supporting technologies including central processing units, network interface cards and interconnects. Big Tech firms including Alphabet and Meta are expected to spend at least $630 billion to build AI infrastructure this year, lifting demand for chips used in servers and networking equipment from companies such as Marvell. Marvell has said it expects revenue to grow nearly 40 per cent and approach $15 billion in fiscal 2028.
[13]
Marvell stock surges 11% on Nvidia partnership, $2B investment By Investing.com
Investing.com -- Marvell Technology (NASDAQ:MRVL) shares jumped 11% on Tuesday following the announcement of a strategic partnership with NVIDIA and a $2 billion investment from the chipmaker. The partnership connects Marvell to NVIDIA's AI factory and AI-RAN ecosystem through NVIDIA NVLink Fusion, a rack-scale platform that enables customers to develop semi-custom AI infrastructure using the NVIDIA NVLink ecosystem. Under the agreement, Marvell will provide custom XPUs and NVLink Fusion-compatible scale-up networking, while NVIDIA will supply supporting technologies including Vera CPU, ConnectX NICs, Bluefield DPUs, NVLink interconnect and Spectrum-X switches, and rack-scale AI compute. The partnership also includes collaboration on silicon photonics technology and aims to transform telecommunications networks into AI infrastructure with NVIDIA Aerial AI-RAN for 5G/6G. "The inference inflection has arrived. Token generation demand is surging, and the world is racing to build AI factories," said Jensen Huang, founder and CEO of NVIDIA. "Together with Marvell, we are enabling customers to leverage NVIDIA's AI infrastructure ecosystem and scale to build specialized AI compute." Matt Murphy, chairman and CEO of Marvell, commented that the expanded partnership reflects the growing importance of high-speed connectivity, optical interconnect and accelerated infrastructure in scaling AI. The NVLink Fusion platform enables customers developing custom XPUs to build heterogeneous AI infrastructure fully compatible with NVIDIA systems, allowing integration with NVIDIA GPU, CPU, networking and storage platforms.
[14]
Nvidia announces $2bn investment in Marvell Technology
Marvell Technology shares surged over 9% in pre-market trading following the announcement of a $2bn investment from Nvidia. The agreement provides for Marvell's integration into Nvidia's AI ecosystem, facilitating the implementation of technological infrastructure for mutual clients. Both companies also announced collaborations in key areas such as silicon photonics and telecommunications networking. This transaction is part of a broader strategy by Nvidia, which has been ramping up investments in tech firms to strengthen its central position in AI. The group recently committed similar amounts to several companies, including Synopsys, CoreWeave, Coherent, Lumentum, and Nebius, thereby strengthening its influence across the entire value chain. Through this approach, Nvidia seeks to meet exponential growth in demand for computing power driven by AI inference and content generation. By leveraging strategic partnerships, the company intends to accelerate the rollout of specialized infrastructure and secure the essential technologies required for the sector's growth.
Nvidia has invested $2 billion in Marvell Technology, pulling one of the largest custom AI chip designers into its NVLink Fusion ecosystem. The deal ensures that Marvell's custom accelerators for hyperscalers like Amazon, Google, and Microsoft remain dependent on Nvidia's infrastructure, effectively turning potential competition into a revenue stream through mandatory platform components.
Nvidia announced a $2 billion investment in Marvell Technology, establishing a strategic partnership that integrates the custom chip designer into Nvidia's NVLink Fusion ecosystem [1][2]. The deal positions Nvidia to capture revenue from custom AI accelerators that hyperscalers commission specifically to reduce their dependence on Nvidia GPUs [4]. Marvell's stock surged nearly 13 percent following the announcement, while Nvidia shares rose 5.6 percent [4]. The move marks a significant expansion of Nvidia's influence across AI infrastructure, from silicon design to networking.
Source: Analytics Insight
NVLink Fusion, first announced in May 2025, enables heterogeneous AI infrastructure in which non-Nvidia accelerators can communicate with Nvidia GPUs, CPUs, and networking hardware over a high-bandwidth, low-latency fabric [1][5]. The architecture requires every NVLink Fusion platform to include at least one Nvidia product, whether a CPU, GPU, or switch [1]. This design ensures that Marvell-designed ASICs using the fabric generate Nvidia revenue regardless of their intended purpose [4]. Under the partnership, Marvell will provide custom XPUs and NVLink Fusion-compatible scale-up networking, while Nvidia supplies Vera CPUs, ConnectX NICs, BlueField DPUs, NVLink interconnect, and Spectrum-X switches [1].

Marvell is one of two dominant custom ASIC design houses alongside Broadcom, with clients including AWS, Microsoft, and Google [1]. The company helps these hyperscalers develop alternatives to Nvidia AI chips, including Amazon's Trainium series, making this investment particularly strategic [1][2]. Marvell's custom AI XPU business generated $1.5 billion in fiscal 2026 revenue and is expected to double by fiscal 2028, with 18 active custom silicon projects, including 12 devices for Amazon, Google, Microsoft, and Meta [4]. The company reported $8.2 billion in revenue for fiscal year 2026, with data center revenue accounting for more than 74 percent of the total [1][5].
Source: Benzinga
The two companies will collaborate on silicon photonics and AI-RAN infrastructure for 5G and 6G networks [1][2]. Marvell's acquisition of Celestial AI late last year added photonic fabric technology to its portfolio, and this deal places that capability inside Nvidia's ecosystem [1]. Silicon photonics uses light instead of copper to move data between chips at the speeds next-generation AI clusters demand [4]. Jensen Huang, founder and CEO of Nvidia, stated: "The inference inflection has arrived. Token generation demand is surging, and the world is racing to build AI factories" [5]. Matt Murphy, chairman and CEO of Marvell, said: "By connecting Marvell's leadership in high-performance analog, optical DSP, silicon photonics and custom silicon to Nvidia's expanding AI ecosystem through NVLink Fusion, we are enabling customers to build scalable, efficient AI infrastructure."
Source: Market Screener
Nvidia has made a series of $2 billion investments in recent months, including stakes in CoreWeave, Nebius, Synopsys, Coherent, and Lumentum [4]. Each targets a different layer of the AI infrastructure being built at unprecedented speed, from cloud providers to chip design tools and high-speed optical interconnects [4]. The common thread is that each investment makes the recipient more dependent on Nvidia's platform, while Nvidia gains both financial exposure to and architectural influence over potential competitors [4]. Analysts describe the approach as a toll booth on custom ASICs: Nvidia collects revenue even when customers build alternatives to its GPUs [4].

Marvell joins an NVLink Fusion ecosystem that has expanded steadily since launch. Samsung Foundry joined in October to offer design-to-manufacturing support for NVLink-compatible custom chips, while Arm entered in November, enabling its licensees to build CPUs with native NVLink connectivity [5]. Nvidia rivals AMD, Intel, and Broadcom remain absent and instead back the open UALink standard as a competing rack-scale interconnect [5]. However, UALink faces what analysts describe as a crisis of the commons, with competing priorities among members and a 128G specification launch that trails the pace of accelerator deployment [4]. NVLink Fusion is becoming the default interconnect standard for custom AI silicon, not because it is open, but because Nvidia's software ecosystem, particularly CUDA, makes it the path of least resistance for customers who need hardware to work immediately [4].