2 Sources
[1]
Samsung Stock Jumps As Nvidia CEO Jensen Huang Confirms New AI Chip Production Deal - Apple (NASDAQ:AAPL), Micron Technology (NASDAQ:MU)
Huang announced this at the GTC developer conference in California on Monday. The CEO launched Nvidia's new AI inference processor built with technology from Groq, crediting Samsung for manufacturing the chips, which are already in production and set to ship in the second half of the year. Nvidia acquired chip startup Groq for $20.6 billion in December. Huang doubled the AI demand outlook to $1 trillion at the GTC conference, reflecting the surging demand for the tech giant's next-generation platforms, including Blackwell and Vera Rubin, as companies ramp up spending on AI infrastructure.

Samsung Gains Edge In AI Chip Race

Samsung has already started mass production of its HBM4 AI memory chips and is reportedly negotiating prices of about $700 per unit.
[2]
Samsung Shares Rise After Next-Generation Chip Showcase
Samsung Electronics shares rose after the South Korean technology giant showcased advances in next-generation high-bandwidth memory for artificial intelligence chips. The stock climbed as much as 4.9% in Seoul trading Tuesday after Samsung unveiled its HBM4E, its most advanced AI memory component, ahead of rivals at Nvidia's annual AI conference, which began Monday in San Jose, Calif. Samsung said its sixth-generation HBM4, designed for Nvidia's Vera Rubin platform, delivers speeds of 11.7 gigabits per second, with potential to reach 13 Gbps, above the industry standard of 8 Gbps. The upgraded HBM4E runs at 16 Gbps. Samsung shares outperformed the Kospi's 2.7% gain in Tuesday afternoon trading.

The showcase is seen as reinforcing Samsung's position in the emerging HBM4 market amid the AI boom. Last month, it said it was the first to mass-produce and ship HBM4 products, and plans to provide HBM4E samples to clients in the second half. Despite being the world's largest memory-chip maker, Samsung had trailed smaller rivals such as SK Hynix and Micron Technology in supplying earlier-generation HBM3 and HBM3E chips to Nvidia.

At Nvidia's GPU Technology Conference, Samsung said it would highlight its comprehensive AI computing technologies and capabilities as well as its partnership with the U.S. chipmaker. "As the industry's only semiconductor company offering a total AI solution spanning memory, logic, foundry and advanced packaging, Samsung will exhibit its full suite of products and solutions that enable customers to design and build groundbreaking AI systems," it said Tuesday. Nvidia Chief Executive Jensen Huang outlined a broad array of new hardware and software products at the conference and said the company expects to generate $1 trillion in sales from Blackwell and Rubin chips by the end of 2027.
Samsung shares jumped 4.9% after Nvidia CEO Jensen Huang confirmed the South Korean tech giant is manufacturing new AI inference processors at the GTC conference. Samsung also unveiled its most advanced HBM4E memory chips, positioning itself as a comprehensive AI solution provider amid intensifying competition from SK Hynix and Micron Technology.
Samsung shares climbed as much as 4.9% in Seoul trading after Nvidia CEO Jensen Huang publicly confirmed a new AI chip production deal with the South Korean semiconductor giant at the GTC conference in California on Monday [1]. Huang announced that Samsung is manufacturing Nvidia's new AI inference processors built with technology from Groq, which Nvidia acquired for $20.6 billion in December [1]. The chips are already in production and set to ship in the second half of the year, marking a significant milestone for Samsung's foundry business [1].
At the GTC conference, Huang doubled the AI demand outlook to $1 trillion, reflecting surging demand for next-generation platforms including Blackwell and Vera Rubin as companies ramp up spending on AI infrastructure [1]. Nvidia expects to generate $1 trillion in sales from Blackwell and Rubin chips by the end of 2027 [2].

Samsung showcased its most advanced next-generation high-bandwidth memory component, the HBM4E, ahead of rivals at Nvidia's annual AI conference [2]. The sixth-generation HBM4, designed for Nvidia's Vera Rubin platform, delivers speeds of 11.7 gigabits per second, with potential to reach 13 Gbps, above the industry standard of 8 Gbps [2]. The upgraded HBM4E runs at 16 Gbps, positioning Samsung at the forefront of the AI memory chip market [2].

Samsung has already started mass production of its HBM4 AI memory chips and is reportedly negotiating prices of about $700 per unit [1]. Last month, the company announced it was the first to mass-produce and ship HBM4 products, and plans to provide HBM4E samples to clients in the second half [2].
The showcase reinforces Samsung's position in the emerging HBM4 market amid the AI boom, particularly as the company had previously trailed smaller rivals such as SK Hynix and Micron Technology in supplying earlier-generation HBM3 and HBM3E chips to Nvidia [2]. Despite being the world's largest memory-chip maker, Samsung's earlier struggles in the AI chip space made this partnership and technological advancement particularly significant [2].

Samsung emphasized its unique position as "the industry's only semiconductor company offering a total AI solution spanning memory, logic, foundry and advanced packaging," showcasing its full suite of products and solutions that enable customers to design and build groundbreaking AI systems [2]. Samsung shares outperformed the Kospi's 2.7% gain in Tuesday afternoon trading, indicating strong investor confidence in the company's AI strategy [2]. The combination of confirmed AI chip production contracts and leadership in next-generation memory chips positions Samsung to capture significant market share as AI infrastructure spending accelerates globally.