2 Sources
[1]
Vertiv Partners with Netweb Technologies to Deliver Liquid-Cooled AI Data Center Racks in India
Netweb's rack-scale solutions will leverage Vertiv's liquid cooling infrastructure, including coolant distribution units and free cooling chillers, and advanced power infrastructure that includes busways and uninterruptible power supply (UPS) systems with power conversion and dynamic load management. The collaboration will allow customers to address the rapidly rising power demands of AI workloads and the extreme thermal densities driven by high-performance accelerators. The validated rack-scale solutions will enable higher rack densities, faster deployment, and reliable performance for the most demanding AI training and inference environments.
[2]
Vertiv and Netweb Technologies to Deliver Advanced Liquid-Cooled Rack Solutions for AI Data Centers in India
Vertiv today announced a collaboration with Netweb Technologies India Ltd. to jointly engineer and validate Netweb's cutting-edge, in-house designed GPU compute platforms with Vertiv's integrated, end-to-end AI data center solutions. Netweb's rack-scale solutions will leverage Vertiv's liquid cooling infrastructure, including coolant distribution units and free cooling chillers, and advanced power infrastructure that includes busways and uninterruptible power supply (UPS) systems with power conversion and dynamic load management. The collaboration will allow customers to address the rapidly rising power demands of AI workloads and the extreme thermal densities driven by high-performance accelerators. The validated rack-scale solutions will enable higher rack densities, faster deployment, and reliable performance for the most demanding AI training and inference environments. "AI workloads are fundamentally reshaping data center design, pushing power densities and thermal limits beyond what traditional infrastructure can support," said Shrirang Deshpande, general manager for strategic planning and new business development at Vertiv India. "By combining Vertiv's global expertise in liquid cooling, power, and critical infrastructure with Netweb's indigenous high-performance compute platforms, such as their latest and most powerful Netweb Tyrone platforms designed using next-generation GPUs, we are enabling scalable, future-ready AI infrastructure that is engineered and manufactured in India, for India and global markets." Both Vertiv and Netweb operate design, engineering, and manufacturing facilities in India, reinforcing the Make in India initiative and strengthening the country's position as a global hub for AI infrastructure innovation and export. This "India for India, India for the world" approach enhances domestic self-reliance while supporting global supply chains.
The engagement between Vertiv and Netweb aligns with the preferential tax regulations for foreign cloud service providers and data center operators in India, further supporting the country's position as a global AI and digital infrastructure hub. "This initiative allows us to collaboratively work on designing complex cooling technologies, including Direct-to-Chip cooling for our GPU-accelerated AI and HPC systems, right at the stage of conceptualization of product designs, and to deliver fully validated, rack-scale AI systems where compute, cooling, and power are designed to work together," said Hirdey Vikram, senior vice president at Netweb Technologies. "OEM-level collaboration with Vertiv strengthens our ability to design and manufacture the world's latest-architecture GPU systems and populate them with DLC technologies to cross 200KW IT load limits per rack unit. This engagement will support India's and the world's growing AI ecosystem while delivering high-density and future-ready AI infrastructure for enterprises, cloud providers, and research institutions worldwide." Environmental responsibility is a core focus of the collaboration. Vertiv's liquid cooling solutions are designed to significantly improve energy efficiency compared to conventional air-cooled architectures and to reduce water usage compared to traditional water-cooled technologies, helping lower the environmental impact of power-intensive AI workloads while supporting India's energy efficiency and environmental responsibility goals. The integrated liquid-cooled rack stack solutions are designed to enhance overall data center operational and energy efficiency, offering superior thermal performance, optimized power utilization, and scalable architectures to support large-scale AI deployments. The collaboration underscores both companies' commitment to advancing resilient, efficient, and reliable AI infrastructure for the next phase of digital growth.
Vertiv has announced a collaboration with Netweb Technologies to engineer and validate liquid-cooled AI data center racks in India. The partnership combines Vertiv's liquid cooling infrastructure with Netweb's GPU compute platforms to address the high power demands of AI workloads and extreme thermal densities, enabling rack densities beyond 200 kW per rack while supporting the Make in India program.
Vertiv has announced a collaboration with Netweb Technologies India Ltd. to jointly engineer and validate advanced liquid-cooled rack solutions designed specifically for AI data center deployments in India and global markets [2]. The partnership brings together Vertiv liquid cooling infrastructure, including coolant distribution units and free cooling chillers, with Netweb GPU compute platforms to create fully integrated rack-scale systems [1]. This collaboration addresses the rapidly rising power demands of AI workloads and the extreme thermal densities driven by high-performance accelerators, which are pushing traditional data center infrastructure beyond its limits.
Source: DT
AI workloads are fundamentally reshaping data center design, creating unprecedented challenges for power infrastructure and thermal management. According to Shrirang Deshpande, general manager for strategic planning and new business development at Vertiv India, "AI workloads are fundamentally reshaping data center design, pushing power densities and thermal limits beyond what traditional infrastructure can support" [2]. The validated rack-scale solutions will enable higher rack densities, faster deployment, and reliable performance for demanding AI training and inference environments [1]. Netweb's systems, including their latest Netweb Tyrone platforms designed using next-generation GPUs, will integrate with Vertiv's power infrastructure, which includes busways and uninterruptible power supply (UPS) systems with power conversion and dynamic load management capabilities. The collaboration enables both companies to work on designing complex cooling technologies, including Direct-to-Chip cooling for GPU-accelerated AI and HPC systems, right from the conceptualization stage. Hirdey Vikram, senior vice president at Netweb Technologies, emphasized that "OEM-level collaboration with Vertiv strengthens our ability to design and manufacture the world's latest-architecture GPU systems and populate them with DLC technologies to cross 200KW IT load limits per rack unit" [2]. This capability positions the partnership to serve enterprises, cloud service providers, data center operators, and research institutions worldwide seeking scalable AI infrastructure that can handle increasingly dense compute requirements.
Both Vertiv and Netweb operate design, engineering, and manufacturing facilities in India, reinforcing the Make in India program and strengthening the country's position as a global hub for AI infrastructure innovation and export [2]. This "India for India, India for the world" approach enhances domestic self-reliance while supporting global supply chains. The engagement aligns with preferential tax regulations for foreign cloud service providers and data center operators in India, further supporting the country's position as a global AI and digital infrastructure hub. Environmental responsibility remains a core focus, as Vertiv's liquid cooling solutions are designed to significantly improve energy efficiency compared to conventional air-cooled architectures and reduce water usage compared to traditional water-cooled technologies, helping lower the environmental impact of power-intensive AI workloads.
Source: CXOToday
Summarized by
Navi