Curated by THEOUTPOST
On Fri, 15 Nov, 8:01 AM UTC
2 Sources
[1]
Samsung working on 'custom HBM4' solutions for Meta and Microsoft, to fight TSMC and SK hynix
Samsung has reportedly started HBM4 development and is rumored to offer customized HBM4 memory solutions to tech giants Meta and Microsoft, according to Korean outlet MK. A semiconductor company official told MK that Microsoft "has its own artificial intelligence (AI) chips called Maia 100 and Meta Artemis," adding that "Samsung Electronics is the best partner for these big tech companies because it has a memory division and an LSI division that can design computational chips directly."

Most of the big tech companies run their own AI data centers and AI supercomputers, so there is strong pressure to reduce the number of chips they purchase, which saves a lot of money. This is why these companies design and deploy their own AI accelerator chips while also buying AI chips made by the likes of NVIDIA and AMD.

Samsung is expected to begin mass production immediately after completing HBM4 development, which is expected by the end of 2025. The company hasn't been liberal with specs just yet, but it did unveil early HBM4 specifications at the ISSCC 2024 semiconductor conference back in February 2024. Samsung's HBM4 is expected to deliver 2TB/sec of memory bandwidth, a 66% increase over HBM3E, and capacities of up to 48GB by stacking 16 DRAM dies, up 33% from HBM3E's 36GB.

NVIDIA's next-gen Rubin R100 AI GPU will debut with HBM4 memory, with TSMC and SK hynix partnering on the new memory's development. Samsung is behind, but with customized HBM4 solutions offered to companies like Meta and Microsoft, things could get interesting (though remember, Samsung has been stumbling over itself for a while now, especially in its semiconductor business).
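The generational gains quoted above are easy to sanity-check. The sketch below assumes a commonly cited HBM3E baseline of roughly 1.2TB/sec per stack (the article gives only the HBM4 figure and the percentage delta) alongside the reported 36GB capacity:

```python
# Sanity check of the HBM4 figures quoted in the article.
# Assumption: ~1.2 TB/s per-stack bandwidth for HBM3E (not stated in the
# article, which only gives the HBM4 number and the 66% delta).

hbm3e_bandwidth_tbs = 1.2   # TB/s per stack (assumed HBM3E baseline)
hbm4_bandwidth_tbs = 2.0    # TB/s per stack (reported for HBM4)

hbm3e_capacity_gb = 36      # GB per stack (reported for HBM3E)
hbm4_capacity_gb = 48       # GB per stack (reported, via 16-high stacking)

bandwidth_gain = (hbm4_bandwidth_tbs / hbm3e_bandwidth_tbs - 1) * 100
capacity_gain = (hbm4_capacity_gb / hbm3e_capacity_gb - 1) * 100
per_die_gb = hbm4_capacity_gb / 16  # capacity of each DRAM die in the stack

print(f"Bandwidth gain: {bandwidth_gain:.0f}%")  # ~67%, the article's "66%"
print(f"Capacity gain:  {capacity_gain:.0f}%")   # ~33%
print(f"Per-die size:   {per_die_gb:.0f} GB")    # 3 GB, i.e. 24Gbit dies
```

The bandwidth ratio actually works out to about 67%, so the article's "66%" figure is a slightly conservative rounding; the capacity math (48GB across 16 dies) implies 3GB, i.e. 24Gbit, DRAM dies.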
[2]
Samsung To Develop "Custom HBM4" Solutions For Meta & Microsoft, Signaling A Massive Breakthrough
Samsung has reportedly initiated HBM4 development, and the Korean giant is rumored to offer customized solutions to Microsoft and Meta.

Samsung's Sluggish HBM Business Might Finally Gain Momentum With Next-Gen HBM4, Rumored To Be Adopted By Top AI Giants

It looks like the Korean giant may have achieved a major breakthrough in its HBM business. According to local media outlet MK, Samsung is developing HBM4 products that will act as customized memory for Meta and Microsoft, two of the industry's largest AI players. HBM4 integration is reportedly a real possibility in Microsoft's and Meta's next-gen AI solutions, which would mark the first mainstream adoption of Samsung's HBM4 process.

As for Samsung's HBM4 specifications, the firm is said to employ logic dies alongside its memory dies, a route the industry has adopted to push HBM capabilities forward. The Korean giant will reportedly fabricate the logic die on its foundry division's 4nm process and use 10nm 6th-generation (1c) DRAM, regarded as one of the highest-end DRAM processes on the market. On paper, Samsung's HBM4 solution will be on par with what competitors like SK hynix offer, but we will have to wait and see.

The MK report suggests that Samsung's HBM4 could be used in Microsoft's custom AI chip, the "Maia 100", and that Meta's Artemis AI processor might also employ it, since the Korean giant's memory and LSI divisions make it well suited to companies seeking custom memory solutions. And since every major AI giant is looking to switch to custom-built AI chips, Samsung's HBM4 process and expertise could prove quite beneficial for the company in the long run.
Samsung's HBM business ambitions haven't been going well: the firm reportedly failed to secure NVIDIA as a customer in the timeframe investors expected, which is why it reported bearish earnings, at least in the HBM segment. With this latest development, Samsung could mount a comeback, but against competition from the likes of SK hynix, the Korean giant will need to do a lot more moving forward.
Samsung initiates development of customized HBM4 memory solutions for tech giants Meta and Microsoft, potentially marking a significant breakthrough in the company's HBM business and intensifying competition in the AI chip market.
Samsung has reportedly begun developing "custom HBM4" memory solutions, specifically tailored for tech giants Meta and Microsoft. This move signals a potential breakthrough for Samsung's High Bandwidth Memory (HBM) business, which has been facing challenges in recent times [1].
The upcoming HBM4 memory from Samsung is expected to offer significant improvements over its predecessor, HBM3E. Key enhancements include:
- 2TB/sec of memory bandwidth, a 66% increase over HBM3E
- Capacities of up to 48GB by stacking 16 DRAM dies, up 33% from HBM3E's 36GB [1]
Samsung plans to utilize its own 4nm process from its foundry division and employ 10nm 6th-generation 1c DRAM, considered one of the highest-end offerings in the market [2].
The customized HBM4 solutions are rumored to be integrated into:
- Microsoft's custom AI chip, the Maia 100
- Meta's Artemis AI processor [2]
This development could prove crucial for Samsung, as major tech companies increasingly shift towards custom-built AI chips for their data centers and supercomputers. The move aims to reduce chip purchases and cut costs for these tech giants [1].
Samsung's initiative comes as a response to fierce competition in the HBM market:
- TSMC and SK hynix are partnering on HBM4 development for NVIDIA's next-gen Rubin R100 AI GPU [1]
- Samsung reportedly failed to secure NVIDIA as an HBM customer in the timeframe investors expected, weighing on its HBM earnings [2]
Samsung is expected to complete HBM4 development by the end of 2025, with mass production commencing shortly after. While the company unveiled initial HBM4 specifications at the ISSCC 2024 conference in February, detailed specs are yet to be released [1].
This strategic move could potentially revitalize Samsung's position in the HBM market. However, the company will need to overcome its recent setbacks and face strong competition from established players like SK hynix to secure a significant market share in the rapidly evolving AI chip industry [2].
SK hynix has begun sampling its groundbreaking 12-layer HBM4 memory, offering unprecedented capacity and bandwidth for AI acceleration. This development marks a significant leap in memory technology for AI applications.
5 Sources
SK Hynix strengthens its position in the AI chip market by advancing HBM4 production and introducing new HBM3E technology, responding to Nvidia's request for faster delivery amid growing competition with Samsung.
12 Sources
SK Hynix and Micron are gearing up for the production of next-generation High Bandwidth Memory (HBM) technologies, with SK Hynix focusing on HBM3E for 2025 and Micron targeting HBM4 for 2026, driven by increasing demand in AI GPU components.
3 Sources
Samsung Electronics has successfully cleared Nvidia's tests for its 8-layer High Bandwidth Memory 3E (HBM3E) chips. This breakthrough could lead to significant advancements in AI chip technology and strengthen Samsung's position in the memory chip market.
4 Sources
NVIDIA is working rapidly to certify Samsung's HBM3E memory chips for its AI GPUs, potentially diversifying its supply chain beyond current major suppliers SK hynix and Micron.
2 Sources
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved