4 Sources
[1]
Samsung Targets SK Hynix AI Lead With $73 Billion Blitz - Samsung Electronics Co (OTC:SSNLF)
Heavy Investment And AI Partnerships Drive Push
The company plans to invest over $73 billion this year, more than Taiwan Semiconductor's budget, as it expands capacity and accelerates research into advanced chips. Samsung is also shifting its focus toward next-generation AI chips and advanced manufacturing processes to meet rising demand from AI workloads. Nvidia selected Samsung to manufacture its latest AI chips. Nvidia CEO Jensen Huang said the chips, built using Groq technology, are already in production and expected to ship in the second half of the year. AMD said it is expanding its partnership with Samsung to develop next-generation AI memory solutions, as both companies align with growing demand for AI infrastructure and higher memory performance.
Still Trails TSMC In Foundry Leadership
Despite these efforts, Samsung remains well behind Taiwan Semiconductor in the foundry market. Taiwan Semiconductor holds nearly 70% market share, driven by strong AI demand, while Samsung's share stands at about 7%, highlighting the significant gap it still needs to close. Through increased investment, new product rollouts, and deeper industry partnerships, Samsung is working to strengthen its position in the AI chip race and compete more effectively across both memory and manufacturing segments.
[2]
Samsung Stock Jumps As Nvidia CEO Jensen Huang Confirms New AI Chip Production Deal - Apple (NASDAQ:AAPL), Micron Technology (NASDAQ:MU)
Huang announced this at the GTC developer conference in California on Monday. The CEO launched Nvidia's new AI inference processor built with technology from Groq, crediting Samsung for manufacturing the chips, which are already in production and set to ship in the second half of the year. Nvidia acquired chip startup Groq for $20.6 billion in December. Huang doubled the AI demand outlook to $1 trillion at the GTC conference, reflecting surging demand for the tech giant's next-generation platforms, including Blackwell and Vera Rubin, as companies ramp up spending on AI infrastructure.
Samsung Gains Edge In AI Chip Race
Samsung has already started mass production of its HBM4 AI memory chips and is reportedly negotiating prices of about $700 per unit.
[3]
Samsung to Spend $73 Billion in 2026 to Expand AI Chip Capacity
Samsung plans a $73 billion 2026 chip investment to expand AI capacity and challenge SK Hynix
Samsung Electronics plans to spend more than 110 trillion won in 2026 to strengthen its position in AI chips. The budget equals about $73.3 billion and marks a 22% increase from a year earlier. The company said it will direct the money toward chip capacity, research, and advanced production as competition in AI semiconductors intensifies. The spending plan shows how sharply the race has escalated. Samsung wants to regain momentum against SK Hynix, which built a strong position in high-bandwidth memory, or HBM, for Nvidia systems. At the same time, Samsung is expanding both memory and foundry capabilities to capture more AI demand across the supply chain.
[4]
Samsung Shares Rise After Next-Generation Chip Showcase
Samsung Electronics shares rose after the South Korean technology giant showcased advances in next-generation high-bandwidth memory for artificial intelligence chips. The stock climbed as much as 4.9% in Seoul trading Tuesday after Samsung unveiled its HBM4E, its most advanced AI memory component, ahead of rivals at Nvidia's annual AI conference, which began Monday in San Jose, Calif. Samsung said its sixth-generation HBM4, designed for Nvidia's Vera Rubin platform, delivers speeds of 11.7 gigabits per second, with potential to reach 13 Gbps, above the industry standard of 8 Gbps. The upgraded HBM4E runs at 16 Gbps. Samsung shares outperformed the Kospi's 2.7% gain in Tuesday afternoon trading. The showcase is seen as reinforcing Samsung's position in the emerging HBM4 market amid the AI boom. Last month, it said it was the first to mass-produce and ship HBM4 products, and plans to provide HBM4E samples to clients in the second half. Despite being the world's largest memory-chip maker, Samsung had trailed smaller rivals such as SK Hynix and Micron Technology in supplying earlier-generation HBM3 and HBM3E chips to Nvidia. At Nvidia's GPU Technology Conference, Samsung said it would highlight its comprehensive AI computing technologies and capabilities as well as its partnership with the U.S. chipmaker. "As the industry's only semiconductor company offering a total AI solution spanning memory, logic, foundry and advanced packaging, Samsung will exhibit its full suite of products and solutions that enable customers to design and build groundbreaking AI systems," it said Tuesday. Nvidia Chief Executive Jensen Huang outlined a broad array of new hardware and software products at the conference and said the company expects to generate $1 trillion in sales from Blackwell and Rubin chips by the end of 2027.
Samsung Electronics unveils a $73 billion investment plan for 2026, marking a 22% increase from last year, as it accelerates AI chip production and high-bandwidth memory development. The company secured a manufacturing deal with Nvidia for new AI inference processors and showcased its HBM4E technology, but still faces significant challenges in closing the gap with rivals SK Hynix and Taiwan Semiconductor in the competitive AI semiconductor market.
Samsung Electronics plans to invest more than $73 billion in 2026, representing a 22% increase from the previous year and exceeding Taiwan Semiconductor's budget, as the company intensifies its push to capture a larger share of the AI chip market [1][3]. The massive investment will target production-capacity expansion, advanced research, and next-generation AI chips as competition in the AI semiconductor market reaches new heights. This strategic move comes as Samsung works to regain momentum against SK Hynix, which has established a dominant position in high-bandwidth memory (HBM) for Nvidia systems, and aims to strengthen both its memory and foundry capabilities across the AI infrastructure supply chain.
The company is shifting focus toward advanced manufacturing processes to meet surging demand from AI workloads, positioning itself to compete more effectively in both memory and manufacturing segments. Samsung's ambitious spending plan reflects how rapidly the race has escalated, with companies ramping up investments to secure their position in a market from which Nvidia CEO Jensen Huang expects $1 trillion in Blackwell and Rubin chip sales by the end of 2027 [4].

Nvidia selected Samsung to manufacture its latest AI inference processor built with Groq technology, a significant validation of Samsung's manufacturing capabilities [1][2]. Jensen Huang announced the partnership at the GTC conference in California, confirming that the chips are already in production and expected to ship in the second half of the year. Nvidia acquired chip startup Groq for $20.6 billion in December, and the new AI chip production deal marks an important milestone for Samsung as it seeks to expand its foundry business.
Additionally, AMD announced it is expanding its partnership with Samsung to develop AI memory solutions, aligning both companies with growing demand for AI infrastructure and higher memory performance [1]. These partnerships signal industry recognition of Samsung's capabilities, though the company still faces considerable challenges in closing the gap with established leaders.

Samsung unveiled its HBM4E, its most advanced AI memory component, at Nvidia's annual AI conference, showcasing technical capabilities ahead of rivals [4]. The sixth-generation HBM4, designed for Nvidia's Vera Rubin platform, delivers speeds of 11.7 gigabits per second with potential to reach 13 Gbps, significantly above the industry standard of 8 Gbps. The upgraded HBM4E runs at 16 Gbps, representing a substantial performance leap.

Samsung shares climbed as much as 4.9% in Seoul trading following the announcement, outperforming the Kospi's 2.7% gain [4]. The company said it was the first to mass-produce and ship HBM4 products and plans to provide HBM4E samples to clients in the second half of the year. Samsung is reportedly negotiating prices of about $700 per unit for its HBM4 AI memory chips [2].
Despite these strategic moves, Samsung remains well behind Taiwan Semiconductor in the foundry market: TSMC holds nearly 70% market share, driven by strong AI demand, while Samsung's share stands at about 7% [1]. This significant gap highlights the substantial ground Samsung still needs to cover to compete effectively in AI chip production. The company also trails smaller rivals such as SK Hynix and Micron Technology in supplying earlier-generation HBM3 and HBM3E chips to Nvidia, despite being the world's largest memory-chip maker [4].

Samsung emphasized its unique position as "the industry's only semiconductor company offering a total AI solution spanning memory, logic, foundry and advanced packaging," showcasing its comprehensive AI computing technologies and capabilities at the GTC conference [4]. Through increased investment, new product rollouts, and deeper industry partnerships with companies like Nvidia and AMD, Samsung is working to strengthen its position in the AI chip race. The question remains whether this massive capital deployment will be enough to close the gap with established leaders, or whether the company's late entry into certain segments will prove too difficult to overcome in a market where technological leadership and customer relationships are already entrenched.