Curated by THEOUTPOST
On Sat, 12 Oct, 12:02 AM UTC
6 Sources
[1]
NVIDIA To Ship 150K-200K Blackwell GB200 AI Servers In Q4 2024 Alone & 500-550K Units In Q1 2025, Microsoft Being The Leading Buyer
NVIDIA's Blackwell GB200 AI servers are expected to see massive shipment volumes in Q4 2024, with Microsoft being the most "aggressive" acquirer of the new AI servers.

NVIDIA To Generate Billions From Blackwell AI Servers Alone, Quarterly Demand Expected To Rise By 250%

Team Green's Blackwell architecture is on its way to making history, both in the revenue NVIDIA is expected to generate from it and in the capabilities it will bring to the AI industry. Not only is Blackwell set to dethrone Hopper in terms of market hype, but every mainstream tech company is looking to acquire NVIDIA's new AI offerings. In a new report, the prominent analyst Ming-Chi Kuo expects NVIDIA to ship 150,000-200,000 units of its GB200 AI servers in Q4 alone, with QoQ growth projected at up to 250%, underscoring the aggressive industry demand.

"Blackwell chip production ramp-up begins in early 4Q24. Considering yield rates and testing efficiency, estimated shipments are about 150,000-200,000 units in 4Q24, with significant growth projected at 200-250% QoQ to 500,000-550,000 units in Q1 2025. Microsoft's 4Q24 GB200 orders have recently surged 3-4 times from the previous 300-500 racks (mainly NVL36) to about 1,400-1,500 racks (about 70% NVL72). Subsequent Microsoft orders will primarily focus on NVL72. Other CSP orders, such as Amazon's 300-400 racks of GB200 NVL36 in 4Q24 and Meta's architecture focusing on Ariel rather than Bianca, have significantly lower order volumes than Microsoft. It's important to note that this doesn't necessarily indicate conservatism from other CSPs but that Microsoft's current demand for GB200 is significantly higher than that of other CSPs." - Ming-Chi Kuo, via Medium

NVIDIA could ship 500,000-550,000 units of its Blackwell AI servers in Q1 2025, which means the million-unit mark could be crossed within a matter of quarters. Among the companies expected to acquire NVIDIA's Blackwell servers, Microsoft is said to be the largest beneficiary, buying both the NVL36 and NVL72 variants of the GB200 AI clusters, with the NVL36 models expected to be delivered this quarter. Microsoft has upscaled its orders by 3 to 4 times, with the NVL72 variant now making up most of its list. Foxconn and Quanta are responsible for delivering Microsoft's Blackwell systems, and based on an independent survey of both organizations, Microsoft is the leading CSP in terms of orders placed and is rapidly expanding its AI compute capabilities, especially with the debut of NVIDIA's Blackwell. Companies like Amazon and Meta have significantly lower order volumes, which shows that Microsoft is indeed determined to capitalize on Blackwell's capabilities. NVIDIA's Blackwell is expected to become mainstream in Q4 2024, when we will likely see the architecture in action, delivering computing power to the AI companies that will ramp up the technology's development.
[2]
NVIDIA Blackwell GPUs Fully Sold Throughout Next Year Amid High Demand from Major Tech Firms
NVIDIA's latest Blackwell series GPUs are fully sold out for the next twelve months. In a recent investor meeting, NVIDIA CEO Jensen Huang revealed that the Blackwell GPUs have experienced unprecedented demand from major technology companies such as Amazon, Google, Microsoft, and Meta. These GPUs are designed to improve artificial intelligence (AI) capabilities in servers and data centers, highlighting the ongoing growth of AI applications across various industries. The swift adoption of Blackwell GPUs by leading tech companies demonstrates the essential role these advanced processors play in handling complex AI and machine learning tasks.

Market analysts expect NVIDIA to further strengthen its leadership in the AI accelerator market into 2025. The company's comprehensive range of GPU solutions remains preferred for performance and reliability, even as competitors like AMD launch alternatives such as the Instinct MI300X cards. NVIDIA's ability to maintain a top market position is due to its continuous innovation and the scalability of its GPU architecture, which effectively meets the evolving needs of data-intensive AI tasks. This advantage is likely to increase NVIDIA's market share, reinforcing its position as a major player in the AI hardware industry.

"Any new Blackwell orders now that aren't already in queue will be shipped late next year, as they are booked out 12 months, which continues to drive strong short-term demand for Hopper, which will still be a major factor through the year." - Morgan Stanley's Joseph Moore

Despite reports of potential production issues, including component shortages and design problems, demand for NVIDIA's Blackwell GPUs remains strong. NVIDIA has assured that there will be no delays in delivering its B100 and B200 chip models, citing high pre-order numbers as evidence of customer confidence in its supply chain management. This commitment to timely delivery is vital for maintaining trust among enterprise clients who rely on NVIDIA's GPUs for critical AI operations.

Source: WCCFTech
[3]
NVIDIA Blackwell GPUs for AI are effectively 'sold out' for the next 12 months
NVIDIA's customers, which include Google, Meta, Microsoft, AWS, Oracle, and others, have already bought every Blackwell GPU that NVIDIA (and TSMC) will produce, to the point where Blackwell is effectively "sold out" for the next 12 months. This comes directly from NVIDIA management, including CEO Jensen Huang, who recently sat down with Morgan Stanley analysts.

Blackwell was recently delayed due to NVIDIA B100 and B200 GPU packaging issues that required a redesign, but production is now well underway. Recently, OpenAI received one of the first engineering builds of the NVIDIA DGX B200 AI server, which includes eight B200 AI GPUs with up to 1.4TB of high-speed HBM3E memory. With Blackwell delivering "massive performance leaps" in next-gen AI training and inference, it's at the top of the list for all large-scale AI projects. Demand has reached the point where new customers ordering Blackwell GPUs will have to wait until late next year to receive their orders.

According to Morgan Stanley analysts, this demand will see NVIDIA gain market share in 2025, even though it will compete with more AI chips and GPU hardware from AMD, Intel, and other companies. "Our view continues to be that NVIDIA is likely to actually gain share of AI processors in 2025," Joseph Moore, an analyst with Morgan Stanley, wrote. "The biggest users of custom silicon are seeing very steep ramps with NVIDIA solutions next year, [and] everything we heard this week reinforced that."

Blackwell represents NVIDIA's most cutting-edge AI GPU technology to date, utilizing TSMC's CoWoS-L packaging and next-gen HBM3E memory. Over the next few quarters, NVIDIA will produce as many Blackwell GPUs as possible while dealing with capacity limitations and potential memory supply issues. The demand for AI GPU hardware is not going anywhere, and Blackwell shows that it's only intensifying.
[4]
Nvidia Blackwell GPUs sold out for the next 12 months as AI market boom continues
Selling like hotcakes: The extraordinary demand for Blackwell GPUs illustrates the need for robust, energy-efficient processors as companies race to implement more sophisticated AI models and applications. The coming months will be critical for Nvidia as the company works to ramp up production and meet the overwhelming requests for its latest product.

Nvidia's latest Blackwell GPUs are experiencing unprecedented demand, with the company reporting that it has sold out of these next-gen processors. Nvidia CEO Jensen Huang revealed the news during an investor meeting hosted by Morgan Stanley. Morgan Stanley analyst Joe Moore notes that Nvidia executives disclosed that their Blackwell GPU products have a 12-month backlog, echoing a similar situation with Hopper GPUs several quarters ago.

The overwhelming demand for Blackwell GPUs comes from Nvidia's traditional customers, including major tech giants like AWS, CoreWeave, Google, Meta, Microsoft, and Oracle. These companies have purchased every Blackwell GPU Nvidia and its manufacturing partner TSMC can produce for the next four quarters. The extreme demand indicates that Nvidia's already considerable footprint in the AI processor market should continue to grow next year, even as rivals such as AMD, Intel, and various cloud service providers grab their share. "Our view continues to be that Nvidia is likely to gain share of AI processors in 2025, as the biggest users of custom silicon are seeing very steep ramps with Nvidia solutions next year," Moore said in a client note.

Nvidia unveiled the Blackwell GPU platform in March. It includes the B200 GPU and the GB200 Grace "superchip." These processors can handle the demanding workloads of large language model (LLM) inference while significantly reducing energy consumption, a growing concern in the industry.

Nvidia has overcome initial packaging issues with its B100 and B200 GPUs, allowing the company to ramp up production. Both GPUs utilize TSMC's advanced CoWoS-L packaging technology. However, questions remain about whether TSMC has sufficient CoWoS-L capacity to meet the skyrocketing demand. Another potential bottleneck in the production process is the supply of HBM3E memory, which is crucial for high-performance GPUs like Blackwell. Tom's Hardware pointed out that Nvidia has yet to qualify Samsung's HBM3E memory for its Blackwell GPUs, adding another layer of complexity to the supply chain.

In August, Nvidia acknowledged that its Blackwell-based products were experiencing low yields, necessitating a re-spin of some layers of the B200 processor to improve production efficiency. Despite these challenges, Nvidia remains confident in its ability to ramp up Blackwell production in Q4 2024, and it expects to ship several billion dollars' worth of Blackwell GPUs in the last quarter of this year.
[5]
NVIDIA's Entire Blackwell AI GPU Supply Sold Out For The Next 12 Months, AI Firms Gobble Up 100K Units In A Single Order Highlighting Insane Demand
NVIDIA's Blackwell AI portfolio is sold out for the next 12 months as Team Green witnesses massive demand for its upcoming products.

It won't be wrong to say that NVIDIA is on track to make the Blackwell generation the most successful product in the firm's history, mainly because demand is now said to have skyrocketed to an unprecedented level. With the capabilities and computing power brought by Blackwell, the architecture has managed to attract the attention of major market players such as Microsoft, Meta, Oracle, and OpenAI, and now Morgan Stanley (via Barron's) claims that Team Green has sold out its supply of Blackwell products moving into 2025, creating a massive supply-demand bottleneck.

In a meeting with NVIDIA's CEO Jensen Huang and CFO Colette Kress, Morgan Stanley analyst Joseph Moore concluded that AI is a long-term investment and a technology poised for sustainable growth into the future. Not only has Morgan Stanley raised its ratings on NVIDIA's shares, but it also believes that Blackwell will play a crucial role in the firm's growth from here on: the products are not only sold out for the next year but will also be pivotal in setting the tone of the AI market.

"Any new Blackwell orders now that aren't already in queue will be shipped late next year, as they are booked out 12 months, which continues to drive strong short-term demand for Hopper, which will still be a major factor through the year." - Morgan Stanley's Joseph Moore

The reports we are getting on Blackwell are indeed remarkable, and there is no doubt that the industry will benefit tremendously from the debut of NVIDIA's Blackwell products once they hit the mainstream market. Based on the figures we have seen so far, demand and anticipation for Blackwell are unmatched compared to the Hopper generation, and it's important to note that NVIDIA's Hopper architecture initially fueled the AI hype and led NVIDIA into the trillion-dollar club, so you can only imagine what Blackwell brings onboard. Beyond raw computing power, NVIDIA is also said to be shifting its focus toward the inferencing market, which has recently seen massive growth with the debut of "reasoning" AI models like OpenAI's o1. Team Green has thus transitioned toward inferencing capabilities, a segment initially expected to be dominated by Intel given its early focus on inference, but Team Blue is nowhere to be seen for now.

Will Blackwell's demand translate into a similar trend for NVIDIA's stock price? Only time will tell, but AI is evolving into a much broader industry rather than merely an investment opportunity, and NVIDIA looks like the frontrunner here.
[6]
Nvidia's Blackwell GPUs are sold out for the next 12 months -- chipmaker to gain market share in 2025
Nvidia's Blackwell GPUs for AI and HPC faced a slight delay due to a yield-killing packaging issue that required a redesign, but it looks like this has not impacted demand for these processors. According to the company's management, who spoke with Morgan Stanley analysts (via Barron's), the supply of Nvidia Blackwell GPUs for the next 12 months has been sold out, which mirrors the situation with Hopper GPU supply several quarters ago. As a result, Nvidia is expected to gain market share next year (via Seeking Alpha).

Morgan Stanley analysts shared insights from recent meetings with Nvidia's leadership, including CEO Jensen Huang. During these meetings, it was revealed that orders for Blackwell GPUs are already sold out for the next 12 months, meaning new customers placing orders today must wait until late next year to receive them. Nvidia's traditional customers (AWS, CoreWeave, Google, Meta, Microsoft, and Oracle, to name some) have bought every Blackwell GPU that Nvidia and its partner TSMC will be able to produce in the coming quarters. Such overwhelming demand may indicate that Nvidia will gain market share next year despite intensified competition from AMD, Intel, cloud service providers (with proprietary offerings), and various smaller companies.

"Our view continues to be that Nvidia is likely to actually gain share of AI processors in 2025, as the biggest users of custom silicon are seeing very steep ramps with Nvidia solutions next year," Joseph Moore, an analyst with Morgan Stanley, wrote in a note to clients. "Everything that we heard this week reinforced that."

Now that the packaging issues with Nvidia's B100 and B200 GPUs have been resolved, Nvidia can produce as many Blackwell GPUs as TSMC can package. Both B100 and B200 use TSMC's CoWoS-L packaging, and whether the world's largest contract chipmaker has enough CoWoS-L capacity remains to be seen. Also, as demand for AI GPUs is skyrocketing, it remains to be seen whether memory makers can supply enough HBM3E memory for leading-edge GPUs like Blackwell. In particular, Nvidia has yet to qualify Samsung's HBM3E memory for its Blackwell GPUs, another factor influencing supply.
NVIDIA's next-generation Blackwell AI GPUs are experiencing unprecedented demand, with the entire supply sold out for the next 12 months. Major tech companies are aggressively acquiring these GPUs, highlighting the intense competition in the AI hardware market.
NVIDIA's upcoming Blackwell GPU architecture is set to make history in the AI industry, with unprecedented demand from major tech companies. The company's entire supply of Blackwell AI GPUs is reportedly sold out for the next 12 months, underscoring how fiercely the largest buyers are competing for AI compute [1][2][3].
NVIDIA is expected to ship between 150,000 and 200,000 units of its GB200 AI servers in Q4 2024 alone, with analysts projecting quarter-over-quarter growth of 200-250% to an estimated 500,000 to 550,000 units in Q1 2025 [1]. This rapid increase in production and demand underscores the growing importance of AI infrastructure in the tech industry.
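As a quick sanity check on those figures, the short Python sketch below recomputes the implied quarter-over-quarter growth from the reported shipment ranges. The unit ranges come from the analyst report cited above; the low/high pairings are illustrative assumptions, not figures from the report.

```python
# Back-of-the-envelope check of the implied QoQ growth for GB200 server shipments.
# Unit ranges are from the analyst report cited above; the math is illustrative only.

q4_2024 = (150_000, 200_000)   # estimated shipments, Q4 2024 (low, high)
q1_2025 = (500_000, 550_000)   # projected shipments, Q1 2025 (low, high)

# Midpoint-to-midpoint growth: 525K vs 175K
mid_growth = (sum(q1_2025) / sum(q4_2024) - 1) * 100
print(f"Implied QoQ growth (midpoints): ~{mid_growth:.0f}%")

# Spread across the most conservative and most aggressive pairings
low_growth = (q1_2025[0] / q4_2024[1] - 1) * 100   # 500K vs 200K
high_growth = (q1_2025[1] / q4_2024[0] - 1) * 100  # 550K vs 150K
print(f"Range across pairings: {low_growth:.0f}% to {high_growth:.0f}%")
```

The midpoint-to-midpoint figure lands at roughly 200%, consistent with the lower end of the reported 200-250% range.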
Microsoft has emerged as the most aggressive buyer of NVIDIA's Blackwell GPUs. The company has reportedly increased its Q4 2024 orders by 3-4 times, from 300-500 racks to about 1,400-1,500 racks, with a focus on the more powerful NVL72 variant [1]. Other major tech firms, including Amazon, Google, Meta, and Oracle, have also placed significant orders, although at lower volumes compared to Microsoft [1][2][4].
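For a sense of scale, the sketch below converts those rack counts into approximate GPU counts. It assumes 72 Blackwell GPUs per GB200 NVL72 rack and 36 per NVL36 rack (the commonly cited configurations); treat the output as an order-of-magnitude illustration rather than a confirmed figure.

```python
# Rough conversion of the reported Microsoft Q4 2024 order (~1,400-1,500 racks,
# ~70% NVL72) into Blackwell GPU counts. Assumes 72 GPUs per NVL72 rack and
# 36 per NVL36 rack; the output is an order-of-magnitude estimate only.

GPUS_PER_NVL72 = 72
GPUS_PER_NVL36 = 36

def estimate_gpus(total_racks: int, nvl72_share: float) -> int:
    """Estimate total GPUs for a mixed NVL72/NVL36 rack order."""
    nvl72_racks = total_racks * nvl72_share
    nvl36_racks = total_racks * (1 - nvl72_share)
    return round(nvl72_racks * GPUS_PER_NVL72 + nvl36_racks * GPUS_PER_NVL36)

for racks in (1_400, 1_500):
    print(f"{racks:,} racks at 70% NVL72 -> ~{estimate_gpus(racks, 0.70):,} GPUs")
# 1,400 racks -> ~85,680 GPUs; 1,500 racks -> ~91,800 GPUs
```

On those assumptions, the reported order works out to roughly 85,000-92,000 GPUs, in the same ballpark as the ~100K-unit single orders highlighted in source [5]'s headline.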
Despite initial reports of production issues, including component shortages and design problems, NVIDIA has assured that there will be no delays in delivering its B100 and B200 chip models [2]. The company has overcome packaging issues with its GPUs, allowing for increased production. However, potential bottlenecks remain, such as the supply of HBM3E memory and TSMC's CoWoS-L packaging capacity [4][6].
The extraordinary demand for Blackwell GPUs is expected to strengthen NVIDIA's leadership in the AI accelerator market. Morgan Stanley analysts predict that NVIDIA is likely to gain market share in AI processors in 2025, despite increasing competition from AMD, Intel, and other companies [3][5].
The overwhelming demand for NVIDIA's Blackwell GPUs reflects the rapid growth and increasing sophistication of AI applications across various industries. As companies race to implement more advanced AI models, the need for robust, energy-efficient processors has become critical [4]. This trend is likely to drive further innovation and competition in the AI hardware market, potentially accelerating the development of AI technologies across the board.
The extended lead time for Blackwell GPUs is also driving strong short-term demand for NVIDIA's current Hopper architecture. This suggests that the AI boom is not only focused on cutting-edge technology but also on maximizing the use of available resources to meet immediate AI computing needs [3][5].
As NVIDIA works to ramp up production and meet the overwhelming demand for its latest products, the coming months will be crucial for the company and the AI industry as a whole. The success of Blackwell could set new standards for AI computing capabilities and further solidify NVIDIA's position as a leader in the AI hardware market.