Curated by THEOUTPOST
On Fri, 27 Sept, 4:02 PM UTC
3 Sources
[1]
NVIDIA's Blackwell GB200 AI Servers Ready For Mass Deployment In December
NVIDIA's Blackwell GB200 AI servers are expected to be deployed by December as Team Green and its partners resolve the supply chain issues and bolster the sampling process. NVIDIA was plagued by rumors of a defect in its Blackwell AI products a few weeks ago, with the main issue lying in TSMC's CoWoS-L packaging technology. However, with Blackwell's immense popularity and NVIDIA's reputation on the line, Team Green quickly resolved the issue in cooperation with its partners, which is why NVIDIA announced in its recent earnings call that Blackwell is on track for Q4 2024. We weren't aware of the specific timelines until now, as analyst Tim Culpan has revealed what's going on behind the scenes:

"Mass production, from module to system assembly, will be in full swing with significant shipments by early December, sources tell me. Both Ariel boards (1x Grace, 1x Blackwell) and Bianca boards (1x Grace, 2x Blackwell) will be ready."

- via Tim Culpan

According to the analyst, NVIDIA's Blackwell GB200 AI servers will be shipped to customers by December. Both the NVL36 and NVL72 configurations are expected to ship by then, and the report states that Microsoft, Oracle, and Meta will be among the first customers to receive the latest architecture from NVIDIA. We can expect Blackwell to reach mainstream markets by Q1 2025, but supply will remain constrained for the first few months.

In terms of Blackwell suppliers, Foxconn's industrial server division (FII) is expected to be responsible for most of the initial Blackwell AI server shipments, with Quanta, Wistron, and Luxshare covering the remaining share. The suppliers mentioned above are all said to have secured one large-scale client each, which is why they are on NVIDIA's exclusive list. Demand for Blackwell is tremendous, and the architecture is on track to be the firm's "most successful" product, thanks to its performance.
It will be interesting to see how Blackwell shapes NVIDIA's future, since the product lineup will usher in the next era of AI computing and open new doors for the industry.
[2]
Nvidia will reportedly start Blackwell server deliveries in early December -- the first servers will go to Microsoft
Nvidia has resolved the yield-killing issues with the packaging of its B100 and B200 AI GPUs and, Tim Culpan reports, is set to begin shipping its GB200 servers to major cloud service providers in early December. According to the analyst, key clients like AWS, Meta, Microsoft, and Oracle are lined up to receive Nvidia's next-generation AI servers only a month behind schedule. However, the exact volumes are unclear.

As expected, Nvidia will first ship its NVL36 and NVL72 servers. The revised delivery schedule means that the GB200-NVL36 (with 36 Blackwell GPUs) and GB200-NVL72 (with 72 GPUs) server versions, initially planned for late October and early November, respectively, will now ship in the first week of December, Culpan claims. Unfortunately, the number of machines set to be shipped in December is unknown. According to the analyst, Microsoft is expected to receive one of the largest allocations of these servers. AWS, Meta, and Oracle are also among the alpha clients for Blackwell.

Despite the production hiccup, demand for the GB200 servers remains strong. In August, Nvidia acknowledged that it had to produce 'low-yielding Blackwell material' to meet demand, negatively affecting its profit margins. These low-yielding B200 GPUs will likely be used in the first GB200-NVL servers, which might affect their volumes.

Jensen Huang, Nvidia's chief executive, stated last month that all necessary design changes for the Blackwell B100 and B200 GPUs had been completed. The company is on schedule to start mass production in the fourth quarter, which, in Nvidia's case, begins in late October. Given TSMC's 4NP production cycle of around three months, the refined Blackwell GPUs will be ready in late January at the earliest, unless TSMC finds a way to expedite its production and packaging process. Meanwhile, Blackwell GPU module makers and system integrators already have qualification samples of the reworked GPUs (which is about time), according to Culpan.
Once they qualify the new silicon, they can assemble the corresponding products. Nvidia handles the sale of its Grace CPUs and Blackwell GPUs directly with clients like Microsoft or Oracle; however, those clients get the actual servers from manufacturers, who have to win contracts with customers that can allocate GPUs from Nvidia.

Here is how it works. TSMC handles silicon production and packaging using its CoWoS-L process, rumored to be the reason for Blackwell's production hiccup. Once the Grace CPU and GPU packages are ready, they are sent to companies like Foxconn Industrial Internet (FII) for module assembly. After that, modules can be installed on Ariel boards (one Grace CPU, one Blackwell GPU) or Bianca boards (one Grace CPU, two Blackwell GPUs). These components are sent directly to system integrators, who assemble and deliver the final server racks to the customers.

Foxconn and Quanta are the primary integrators that will deliver the first Blackwell-based servers to Nvidia's clients. Others, including brands like Asus, Gigabyte, Luxshare, and Wiwynn, can also assemble the reference NVL machines, but before they get components, they need to win a contract with a significant customer, who would negotiate GPU allocation with Nvidia.

Initially, all server assembly will take place in Taiwan. However, Nvidia is considering expanding production to other regions, including Mexico and Texas, in the first quarter of 2025. This would increase production capacity as demand (and the supply of B100 and B200 GPUs) grows.
[3]
NVIDIA Blackwell AI GPU issues addressed: GB200 AI servers to major cloud clients in December
NVIDIA's issues with its Blackwell AI GPUs have been "addressed" according to tech journalist Tim Culpan on X, with actual GB200 AI server shipments going out to major cloud service providers (CSPs) earlier than the previously feared delay had suggested. Microsoft is expected to take delivery of one of the largest allocations of NVIDIA's new Blackwell AI GPUs, as well as Oracle, AWS, Meta, and others.

NVIDIA's new GB200 NVL36 AI servers were originally slated for delivery at the end of this month, while the higher-end (and more expensive) GB200 NVL72 AI servers were scheduled for early November. Culpan took to his Substack to explain that delays had pushed the timeline back to January, but now both NVL36 and NVL72 AI servers will "ship around the first week of December".

Initially, reports suggested that production of NVIDIA's new Blackwell AI GPUs would be delayed after TSMC and NVIDIA found issues with the CoWoS-L advanced packaging, which connects the GPU and HBM together. NVIDIA's major roadblocks with the Blackwell AI GPU delays were rumored to persist into early 2025, but it seems that NVIDIA, TSMC, and their supply chain partners have moved heaven and earth to fix these issues, and now NVL36 and NVL72 AI server cabinets are shipping out to major cloud companies.
NVIDIA prepares to launch its next-generation Blackwell GB200 AI servers in December, with major cloud providers like Microsoft among the first recipients. This move aims to address supply issues and meet the growing demand for AI computing power.
NVIDIA, the leading graphics processing unit (GPU) manufacturer, is poised to make a significant leap in the artificial intelligence (AI) computing landscape. The company is reportedly gearing up to deploy its next-generation Blackwell GB200 AI servers as early as December 2024, marking a crucial milestone in the evolution of AI hardware [1].

The imminent release of the Blackwell GB200 servers comes in response to the supply issues that have challenged the tech industry. NVIDIA has been working diligently to overcome these challenges, with the company's CEO, Jensen Huang, emphasizing its commitment to ramping up production to meet the surging demand for AI computing power [2].

According to industry sources, Microsoft is slated to be among the first recipients of the new Blackwell GB200 servers. This strategic move aligns with Microsoft's ambitious plans to expand its AI capabilities and maintain its competitive edge in the cloud computing market [2].

While detailed specifications of the Blackwell GB200 servers remain under wraps, industry experts anticipate significant improvements over the current generation of AI hardware. The new servers are expected to offer enhanced performance, improved energy efficiency, and increased memory bandwidth, all crucial factors in advancing AI and machine learning capabilities [3].

The deployment of NVIDIA's Blackwell GB200 servers is poised to have far-reaching implications for the AI industry. With major cloud providers like Microsoft integrating these cutting-edge servers into their infrastructure, we can expect advancements in various AI applications, from natural language processing to computer vision and beyond [1].

As NVIDIA prepares for the mass deployment of its Blackwell GB200 servers, the tech industry eagerly anticipates the potential breakthroughs this new hardware might enable. The increased computing power and efficiency offered by these servers could accelerate AI research and development across various sectors, potentially leading to innovative solutions in fields such as healthcare, finance, and autonomous systems [3].