Curated by THEOUTPOST
On Mon, 5 Aug, 4:01 PM UTC
4 Sources
[1]
Indian firms can cut AI carbon emissions by 99% with AWS: Amazon-Accenture study
A study by Amazon Web Services (AWS) and Accenture found that AWS's additional carbon-free energy procurement in India contributes a 31% reduction in carbon emissions for compute-heavy workloads and 44% for storage-heavy workloads. This is particularly important given the rising adoption of AI, said Jenna Leiner, head of environment social governance (ESG) and external engagement, AWS Global.

Indian firms can reduce the carbon emissions of running compute-heavy or artificial intelligence (AI) workloads by up to 99% by using Amazon Web Services (AWS) cloud infrastructure rather than on-premises data centres, the study found. Accenture estimates that AWS's global infrastructure is up to 4.1 times more efficient than on-premises infrastructure.

The potential carbon footprint reduction of about 98% is attributed to AWS's use of more efficient hardware (32%), improvements in power and cooling efficiency (35%), and additional carbon-free energy procurement (31%), the company said in a statement on Monday. "Further optimisation on AWS by leveraging purpose-built silicon can increase the total carbon reduction potential of AI workloads to up to 99% for Indian organisations that migrate to and optimise on AWS," it said.

As India accelerates towards its $1 trillion digital opportunity and encourages investment in digital infrastructure, sustainability innovation and minimising IT-related carbon emissions will be critical in helping India meet its goal of net-zero emissions by 2070, Leiner said, adding that this is particularly important given the rising adoption of AI. "Considering 85% of global IT spend by organisations remains on-premises, a carbon reduction of up to 99% for AI workloads optimised on AWS in India is a meaningful sustainability opportunity for Indian organisations," she added.

AWS has invested in sustainable chip technology innovation since 2018, and customers such as payments and financial services distribution platform Paytm are benefiting from its carbon reduction potential, the company said. Paytm reported an estimated decrease of up to 47% in carbon emissions per transaction and a reduction in workload carbon intensity after adopting AWS's Graviton processors, it noted.

For generative AI, AWS's Trainium and Inferentia chips can deliver energy-consumption reductions of up to 29% for training and up to 50% higher performance per watt for inference, compared with comparable instances, the company said. "As the demand for AI continues to grow, sustainability through technology can play a crucial role in helping businesses meet environmental goals while driving innovation," said Sanjay Podder, global lead for technology sustainability innovation at Accenture.

Further, 100% of the electricity consumed by AWS data centres in India in 2022 and 2023 was matched with renewable energy sources procured in the country, owing to the company's investments in 50 renewable energy projects. "Aligning with Amazon's commitment to achieving net-zero carbon emissions across all operations by 2040, AWS is rapidly transitioning its global infrastructure to match electricity use with 100% carbon-free energy," the company said.
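The headline numbers decompose roughly additively: more efficient hardware (32%), power and cooling efficiency (35%) and carbon-free energy procurement (31%) sum to about 98%, with purpose-built silicon closing the gap to 99%. Below is a minimal back-of-envelope sketch in Python that treats the percentages as additive shares against a hypothetical baseline; both the baseline figure and the additive treatment are assumptions for illustration, not the study's methodology.

```python
# Back-of-envelope sketch of the reduction figures quoted in the study.
# The 100 tCO2e baseline and the additive treatment of the percentages are
# illustrative assumptions, not part of the study's methodology.

baseline_tco2e = 100.0  # hypothetical annual emissions of an on-premises AI workload

contributions = {
    "more efficient hardware": 0.32,
    "power and cooling efficiency": 0.35,
    "carbon-free energy procurement": 0.31,
}

migration_reduction = sum(contributions.values())            # ~0.98
silicon_optimisation = 0.01                                   # further gain from purpose-built silicon
total_reduction = migration_reduction + silicon_optimisation  # ~0.99

remaining = baseline_tco2e * (1 - total_reduction)
print(f"Reduction from migration alone: {migration_reduction:.0%}")
print(f"Total reduction after silicon optimisation: {total_reduction:.0%}")
print(f"Remaining emissions: {remaining:.1f} of {baseline_tco2e:.0f} tCO2e")
```

On these assumptions, a workload emitting 100 tCO2e on-premises would be left with roughly 1 tCO2e after migration and optimisation, which is what the "up to 99%" figure expresses.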
[2]
AWS Can Help Indian Organizations Cut AI Workload Carbon Emissions by Up to 99%, Study Finds
"Considering 85% of global IT spend by organisations remains on-premises, a carbon reduction of up to 99% for AI workloads optimised on AWS in India is a meaningful sustainability opportunity for Indian organisations," said Jenna Leiner, Head of Environment Social Governance (ESG) and External Engagement, AWS Global. "As India accelerates towards its US$1 trillion-dollar digital opportunity and encourages investments into digital infrastructure, sustainability innovations and minimising IT related carbon emissions will be critical in also helping India meet its net-zero emissions by 2070 goal. This is particularly important given the rising adoption of AI. AWS is constantly innovating for sustainability across our data centres -- optimising our data centre design, investing in purpose-built chips, and innovating with new cooling technologies - so that we continuously increase energy efficiency to serve customer compute demands." "This research shows that AWS's focus on hardware and cooling efficiency, carbon-free energy, purpose-built silicon, and optimized storage can help organizations reduce the carbon footprint of AI and machine learning workloads," said Sanjay Podder, global lead for Technology Sustainability Innovation at Accenture. "As the demand for AI continues to grow, sustainability through technology can play a crucial role in helping businesses meet environmental goals while driving innovation." Sustainable chip technology innovation - purpose-built silicon One of the most visible ways AWS is innovating for energy efficiency is through the company's investment in AWS chips. Launched in 2018, the custom AWS-engineered general purpose processor, Graviton, was the first-of-its-kind to be deployed at scale by a major cloud provider. The latest Graviton4 offers four times the performance of Graviton, and while Graviton3 uses 60% less energy for the same performance as comparable Amazon EC2 instances (where the compute happens in a data centre), Graviton4 is even more energy efficient. AWS customers are also benefiting from the carbon reduction potential of Graviton. Paytm, India's leading payments and financial services distribution platform, witnessed a reduction in workload carbon intensity by adopting Graviton processors, reporting up to 47% estimated decrease in carbon emissions per transaction. Similarly, IBS Software, a leading SaaS solutions provider to the global travel industry, reported that other than improving performance and reducing cost by adopting Graviton processors, the company saw a 40% reduction in carbon emissions per instance hour. Running generative AI applications in a more sustainable way requires innovation at the silicon level with energy efficient hardware. To optimise performance and energy consumption, AWS developed purpose-built silicon like the AWS Trainium chip and AWS Inferentia chip to achieve significantly higher throughput than comparable accelerated compute instances. AWS Trainium cuts the time taken to train generative AI models -- in some cases from months to hours. This means building new models requires less money and power, with energy-consumption reductions of almost one third/up to 29%. AWS Inferentia is AWS's most power-efficient machine learning inference chip. AWS Inferentia2 machine learning accelerator delivers up to 50% higher performance per watt and can reduce costs by up to 40% against comparable instances. These purpose-built accelerators enable AWS to efficiently execute AI models at scale. 
For AWS, these accelerators translate into a smaller infrastructure footprint for similar workloads and better performance per watt of power consumed.

Improving energy efficiency across AWS infrastructure

Through innovations in engineering, from electrical distribution to cooling techniques, AWS's infrastructure is able to operate closer to peak energy efficiency. AWS optimises resource utilisation to minimise idle capacity and continuously improves the efficiency of its infrastructure. For example, AWS removed the large central Uninterruptible Power Supply (UPS) from its data centre design, instead using small battery packs and custom power supplies integrated into every rack, which has improved power efficiency and further increased availability. Every time power is converted from one voltage to another, or from AC to DC and vice versa, some power is lost in the process. By eliminating the central UPS, AWS is able to reduce these conversions, and it has also optimised rack power supplies to reduce energy loss in the final conversion. Combined, these changes reduce energy conversion loss by about 35%.

After powering AWS's server equipment, cooling is one of the largest sources of energy use in AWS data centres. To increase efficiency, AWS uses different cooling techniques, including free-air cooling depending on the location and time of year, as well as real-time data to adapt to weather conditions. Implementing these cooling strategies is more challenging at the smaller scale of a typical on-premises data centre. AWS's latest data centre design integrates optimised air-cooling solutions alongside liquid-cooling capabilities for the most powerful AI chipsets, such as the NVIDIA Grace Blackwell Superchips. This flexible, multimodal cooling design allows AWS to extract maximum performance and efficiency whether running traditional workloads or AI models.

According to the study, AWS's additional carbon-free energy procurement in India contributes a 31% reduction in carbon emissions for compute-heavy workloads and 44% for storage-heavy workloads. Aligning with Amazon's commitment to achieving net-zero carbon emissions across all operations by 2040, AWS is rapidly transitioning its global infrastructure to match electricity use with 100% carbon-free energy. Amazon met its 100% renewable energy goal seven years ahead of schedule. In India, 100% of the electricity consumed by AWS data centres was matched with renewable energy sources procured in-country in 2022 and 2023, owing to Amazon's investment in 50 renewable energy projects in India with an estimated 1.1 gigawatts of capacity, enough to power more than 1.1 million homes in New Delhi each year.
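As a rough plausibility check on that capacity figure, 1.1 GW of renewable capacity can be converted into annual energy using an assumed capacity factor and compared against an assumed household consumption. Both numbers in the sketch below are illustrative assumptions, not values from the study or from AWS.

```python
# Order-of-magnitude check: how many homes could 1.1 GW of renewable capacity
# supply over a year? The capacity factor and per-home consumption below are
# illustrative assumptions, not figures from the study.

capacity_gw = 1.1
capacity_factor = 0.25        # assumed blended wind/solar capacity factor
hours_per_year = 8760

annual_generation_gwh = capacity_gw * capacity_factor * hours_per_year
annual_generation_kwh = annual_generation_gwh * 1_000_000

home_consumption_kwh = 2_000  # assumed annual consumption of a Delhi household

homes_supplied = annual_generation_kwh / home_consumption_kwh
print(f"Annual generation: {annual_generation_gwh:,.0f} GWh")
print(f"Homes supplied at {home_consumption_kwh:,} kWh/home/year: {homes_supplied:,.0f}")
```

Under these assumptions the projects would generate about 2,400 GWh a year, enough for roughly 1.2 million such households, which is broadly consistent with the "more than 1.1 million homes" claim.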
[3]
AWS Data Centres Cut Carbon Emissions by 98% for AI Workloads Compared to On-Premises Solutions: Study
Accenture estimates that AWS's global infrastructure is up to 4.1 times more efficient than on-premises infrastructure.

A new study commissioned by Amazon Web Services (AWS) and completed by Accenture found that using AWS data centres for compute-heavy, or AI, workloads in India yields a 98% reduction in carbon emissions compared to on-premises data centres. The study found that an effective way to minimise the environmental footprint of leveraging artificial intelligence is to move IT workloads from on-premises infrastructure to cloud data centres in India and around the globe.

For Indian organisations, the total potential carbon reduction opportunity for AI workloads optimised on AWS is up to 99% compared to on-premises data centres. This is credited to AWS's use of more efficient hardware (32%), improvements in power and cooling efficiency (35%), and additional carbon-free energy procurement (31%). Further optimising on AWS by leveraging purpose-built silicon can increase the total carbon reduction potential of AI workloads to up to 99% for Indian organisations that migrate to and optimise on AWS.

"Considering 85% of global IT spend by organisations remains on-premises, a carbon reduction of up to 99% for AI workloads optimised on AWS in India is a meaningful sustainability opportunity for Indian organisations," said Jenna Leiner, Head of Environment Social Governance (ESG) and External Engagement, AWS Global. "As India accelerates towards its US$1 trillion digital opportunity and encourages investments into digital infrastructure, sustainability innovations and minimising IT-related carbon emissions will be critical in also helping India meet its net-zero emissions by 2070 goal. This is particularly important given the rising adoption of AI. AWS is constantly innovating for sustainability across our data centres, optimising our data centre design, investing in purpose-built chips, and innovating with new cooling technologies, so that we continuously increase energy efficiency to serve customer compute demands," Leiner said.
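The "4.1 times more efficient" estimate can be restated as an energy reduction: the same workload would need roughly 1/4.1, or about 24%, of the energy, a cut of about 76% before carbon-free energy procurement is taken into account. A minimal sketch of that arithmetic:

```python
# Restating an efficiency multiple as an energy reduction. Purely arithmetic;
# the attribution of the remaining gap to carbon-free energy follows the
# study's framing and is not derived here.

efficiency_multiple = 4.1               # AWS vs. on-premises, per Accenture's estimate

energy_ratio = 1 / efficiency_multiple  # fraction of energy still needed on AWS
energy_reduction = 1 - energy_ratio     # fraction of energy avoided

print(f"Energy needed on AWS: {energy_ratio:.0%} of the on-premises figure")
print(f"Reduction from efficiency alone: {energy_reduction:.0%}")
# The gap between ~76% and the quoted 98-99% is closed largely by sourcing
# carbon-free energy for the electricity that is still consumed.
```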
[4]
AWS can help Indian organisations reduce carbon emissions of AI workloads
A new study estimates that workloads optimised on Amazon Web Services (AWS) can help organisations in India reduce associated carbon emissions by up to 99% compared to on-premises infrastructure.

The study, commissioned by Amazon Web Services (AWS) and completed by Accenture, shows that an effective way to minimise the environmental footprint of leveraging artificial intelligence (AI) is to move IT workloads from on-premises infrastructure to AWS cloud data centres in India and around the globe. Accenture estimates that AWS's global infrastructure is up to 4.1 times more efficient than on-premises infrastructure. For Indian organisations, the total potential carbon reduction opportunity for AI workloads optimised on AWS is up to 99% compared to on-premises data centres.

The research states that simply using AWS data centres for compute-heavy, or AI, workloads in India yields a 98% reduction in carbon emissions compared to on-premises data centres. This is credited to AWS's use of more efficient hardware (32%), improvements in power and cooling efficiency (35%), and additional carbon-free energy procurement (31%). Further optimising on AWS by leveraging purpose-built silicon can increase the total carbon reduction potential of AI workloads to up to 99% for Indian organisations that migrate to and optimise on AWS.

"Considering 85% of global IT spend by organisations remains on-premises, a carbon reduction of up to 99% for AI workloads optimised on AWS in India is a meaningful sustainability opportunity for Indian organisations," said Jenna Leiner, Head of Environment Social Governance (ESG) and External Engagement, AWS Global. "As India accelerates towards its US$1 trillion digital opportunity and encourages investments into digital infrastructure, sustainability innovations and minimising IT-related carbon emissions will be critical in also helping India meet its net-zero emissions by 2070 goal. This is particularly important given the rising adoption of AI. AWS is constantly innovating for sustainability across our data centres, optimising our data centre design, investing in purpose-built chips, and innovating with new cooling technologies, so that we continuously increase energy efficiency to serve customer compute demands."

"This research shows that AWS's focus on hardware and cooling efficiency, carbon-free energy, purpose-built silicon, and optimized storage can help organizations reduce the carbon footprint of AI and machine learning workloads," said Sanjay Podder, global lead for Technology Sustainability Innovation at Accenture. "As the demand for AI continues to grow, sustainability through technology can play a crucial role in helping businesses meet environmental goals while driving innovation."

Sustainable chip technology innovation: purpose-built silicon

One of the most visible ways AWS is innovating for energy efficiency is through the company's investment in AWS chips. Launched in 2018, the custom AWS-engineered general-purpose processor, Graviton, was the first of its kind to be deployed at scale by a major cloud provider. The latest Graviton4 offers four times the performance of the original Graviton, and while Graviton3 uses 60% less energy for the same performance as comparable Amazon EC2 instances (where the compute happens in a data centre), Graviton4 is even more energy efficient. AWS customers are also benefiting from the carbon reduction potential of Graviton.
Paytm, India's leading payments and financial services distribution platform, saw a reduction in workload carbon intensity by adopting Graviton processors, reporting an estimated decrease of up to 47% in carbon emissions per transaction. Similarly, IBS Software, a leading SaaS solutions provider to the global travel industry, reported that, in addition to improving performance and reducing cost by adopting Graviton processors, it saw a 40% reduction in carbon emissions per instance hour.

Running generative AI applications in a more sustainable way requires innovation at the silicon level with energy-efficient hardware. To optimise performance and energy consumption, AWS developed purpose-built silicon such as the AWS Trainium and AWS Inferentia chips to achieve significantly higher throughput than comparable accelerated compute instances. AWS Trainium cuts the time taken to train generative AI models, in some cases from months to hours, so building new models requires less money and power, with energy-consumption reductions of up to 29%. AWS Inferentia is AWS's most power-efficient machine learning inference chip; the AWS Inferentia2 accelerator delivers up to 50% higher performance per watt and can reduce costs by up to 40% against comparable instances. These purpose-built accelerators enable AWS to execute AI models efficiently at scale, which translates into a smaller infrastructure footprint for similar workloads and better performance per watt of power consumed.

Improving energy efficiency across AWS infrastructure

Through innovations in engineering, from electrical distribution to cooling techniques, AWS's infrastructure is able to operate closer to peak energy efficiency. AWS optimises resource utilisation to minimise idle capacity and continuously improves the efficiency of its infrastructure. For example, AWS removed the large central Uninterruptible Power Supply (UPS) from its data centre design, instead using small battery packs and custom power supplies integrated into every rack, which has improved power efficiency and further increased availability. Every time power is converted from one voltage to another, or from AC to DC and vice versa, some power is lost in the process. By eliminating the central UPS, AWS is able to reduce these conversions, and it has also optimised rack power supplies to reduce energy loss in the final conversion. Combined, these changes reduce energy conversion loss by about 35%.

After powering AWS's server equipment, cooling is one of the largest sources of energy use in AWS data centres. To increase efficiency, AWS uses different cooling techniques, including free-air cooling depending on the location and time of year, as well as real-time data to adapt to weather conditions. Implementing these cooling strategies is more challenging at the smaller scale of a typical on-premises data centre. AWS's latest data centre design integrates optimised air-cooling solutions alongside liquid-cooling capabilities for the most powerful AI chipsets, such as the NVIDIA Grace Blackwell Superchips. This flexible, multimodal cooling design allows AWS to extract maximum performance and efficiency whether running traditional workloads or AI models.

According to the study, AWS's additional carbon-free energy procurement in India contributes a 31% reduction in carbon emissions for compute-heavy workloads and 44% for storage-heavy workloads.
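The removal of the central UPS described above can be illustrated by treating the power path as a chain of conversion stages whose efficiencies multiply, so that dropping a stage and improving the rack supply cuts the end-to-end loss. The stage efficiencies in the sketch below are hypothetical, chosen only to show the shape of the calculation (they happen to land near the quoted ~35% relative reduction); they are not AWS's actual figures.

```python
from math import prod

# Illustrative power-delivery chain: end-to-end efficiency is the product of
# the stage efficiencies, so removing a conversion stage (the central UPS) and
# improving the rack supply cuts conversion loss. All stage efficiencies are
# hypothetical, chosen only to show the shape of the calculation.

# Conventional path: utility transformer -> central UPS (double conversion)
# -> power distribution unit -> rack power supply
conventional = [0.99, 0.94, 0.995, 0.955]

# Revised path: central UPS removed, battery and optimised supply in each rack
revised = [0.99, 0.995, 0.94]

def conversion_loss(stage_efficiencies):
    """Fraction of input power lost across the whole conversion chain."""
    return 1 - prod(stage_efficiencies)

loss_before = conversion_loss(conventional)
loss_after = conversion_loss(revised)

print(f"Conversion loss, conventional design: {loss_before:.1%}")
print(f"Conversion loss, revised design:      {loss_after:.1%}")
print(f"Relative reduction in conversion loss: {1 - loss_after / loss_before:.0%}")
```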
Aligning with Amazon's commitment to achieving net-zero carbon emissions across all operations by 2040, AWS is rapidly transitioning its global infrastructure to match electricity use with 100% carbon-free energy. Amazon met its 100% renewable energy goal seven years ahead of schedule. In India, 100% of the electricity consumed by AWS data centres was matched with renewable energy sources procured in-country in 2022 and 2023, owing to Amazon's investment in 50 renewable energy projects in India with an estimated 1.1 gigawatts of capacity, enough to power more than 1.1 million homes in New Delhi each year.

About Amazon Web Services

Since 2006, Amazon Web Services has been the world's most comprehensive and broadly adopted cloud. AWS has been continually expanding its services to support virtually any workload, and it now has more than 240 fully featured services for compute, storage, databases, networking, analytics, machine learning and artificial intelligence (AI), Internet of Things (IoT), mobile, security, hybrid, media, and application development, deployment, and management, delivered from 105 Availability Zones within 33 geographic regions, with announced plans for 18 more Availability Zones and six more AWS Regions in Malaysia, Mexico, New Zealand, the Kingdom of Saudi Arabia, Thailand, and the AWS European Sovereign Cloud. Millions of customers, including the fastest-growing startups, largest enterprises, and leading government agencies, trust AWS to power their infrastructure, become more agile, and lower costs. To learn more about AWS, visit aws.amazon.com.

About Amazon Web Services India Private Limited

Amazon Web Services India Private Limited (AWS India) undertakes the resale and marketing of AWS Cloud services in India.
A new study by Amazon Web Services (AWS) and Accenture highlights the potential for a significant reduction in carbon emissions from AI workloads in India. The research demonstrates how cloud computing can make AI more environmentally sustainable.
A study commissioned by Amazon Web Services (AWS) and completed by Accenture has found that Indian organizations can dramatically reduce the carbon footprint associated with artificial intelligence (AI) workloads, by up to 99%, when utilizing AWS data centers [1]. This finding underscores the potential for cloud computing to make AI more sustainable and environmentally friendly.
The study, which focused on the Indian market, compared the carbon emissions of AI workloads run on AWS cloud infrastructure with those run on traditional on-premises data centers. The results were striking: AWS data centers can reduce carbon emissions by up to 98% for AI workloads compared to on-premises solutions [3]. This substantial difference highlights the efficiency and sustainability advantages of cloud-based AI operations.
Several factors contribute to the significant reduction in carbon emissions when using AWS for AI workloads: more efficient hardware (about 32% of the reduction), improvements in power and cooling efficiency (about 35%), and additional carbon-free energy procurement (about 31%), with further optimisation on purpose-built silicon such as Graviton, Trainium, and Inferentia lifting the total potential reduction to up to 99% [1].
The study examined various AI workloads, including natural language processing, image and video analysis, and speech recognition. Across these different applications, the potential for carbon emission reduction remained consistently high, demonstrating the broad applicability of cloud-based solutions for sustainable AI development [4].
For Indian businesses and institutions looking to leverage AI technologies, this study presents a compelling case for adopting cloud-based solutions. By transitioning AI workloads to AWS, organizations can not only reduce their environmental impact but also potentially benefit from the scalability and cost-effectiveness of cloud computing.
As India continues to emerge as a global hub for AI innovation and adoption, the findings of this study could play a crucial role in shaping the country's approach to sustainable technology development. The potential for such significant carbon emission reductions aligns well with India's broader goals for sustainable development and climate change mitigation.
Amazon Web Services (AWS) announces major infrastructure updates, including liquid cooling systems and simplified electrical distribution, to enhance efficiency and sustainability in its data centers, particularly for AI workloads.
4 Sources
The rapid growth of artificial intelligence is causing a surge in energy consumption by data centers, challenging sustainability goals and straining power grids. This trend is raising concerns about the environmental impact of AI and the tech industry's ability to balance innovation with eco-friendly practices.
8 Sources
A recent survey by Pure Storage reveals a significant increase in AI adoption across Indian businesses, but highlights growing concerns about energy demands and infrastructure challenges that may hinder sustainability goals.
3 Sources
India's data center industry faces sustainability challenges due to heavy reliance on coal-powered electricity. This dependence threatens the sector's green initiatives and long-term environmental goals.
3 Sources
As artificial intelligence continues to advance, concerns grow about its energy consumption and environmental impact. This story explores the challenges and potential solutions in managing AI's carbon footprint.
5 Sources