Curated by THEOUTPOST
On Tue, 14 Jan, 4:01 PM UTC
3 Sources
[1]
Explained: Generative AI's environmental impact
In a two-part series, MIT News explores the environmental implications of generative AI. This article examines why the technology is so resource-intensive; a second piece will investigate what experts are doing to reduce genAI's carbon footprint and other impacts.

The excitement surrounding the potential benefits of generative AI, from improving worker productivity to advancing scientific research, is hard to ignore. While the explosive growth of this new technology has enabled rapid deployment of powerful models in many industries, the environmental consequences of this generative AI "gold rush" remain difficult to pin down, let alone mitigate.

The computational power required to train generative AI models with billions of parameters, such as OpenAI's GPT-4, can demand a staggering amount of electricity, which leads to increased carbon dioxide emissions and pressure on the electric grid. Furthermore, deploying these models in real-world applications, enabling millions of people to use generative AI in their daily lives, and then fine-tuning the models to improve their performance draws large amounts of energy long after a model has been developed.

Beyond electricity demands, a great deal of water is needed to cool the hardware used for training, deploying, and fine-tuning generative AI models, which can strain municipal water supplies and disrupt local ecosystems. The increasing number of generative AI applications has also spurred demand for high-performance computing hardware, adding indirect environmental impacts from its manufacture and transport.

"When we think about the environmental impact of generative AI, it is not just the electricity you consume when you plug the computer in. There are much broader consequences that go out to a system level and persist based on actions that we take," says Elsa A. Olivetti, professor in the Department of Materials Science and Engineering and the lead of the Decarbonization Mission of MIT's new Climate Project.

Olivetti is senior author of a 2024 paper, "The Climate and Sustainability Implications of Generative AI," co-authored by MIT colleagues in response to an Institute-wide call for papers exploring the transformative potential of generative AI, in both positive and negative directions for society.

Demanding data centers

The electricity demands of data centers are one major factor contributing to the environmental impacts of generative AI, since data centers are used to train and run the deep-learning models behind popular tools like ChatGPT and DALL-E.

A data center is a temperature-controlled building that houses computing infrastructure such as servers, data storage drives, and network equipment. Amazon, for instance, has more than 100 data centers worldwide, each with about 50,000 servers that the company uses to support its cloud computing services.

While data centers have been around since the 1940s (the first was built at the University of Pennsylvania in 1945 to support ENIAC, the first general-purpose digital computer), the rise of generative AI has dramatically increased the pace of data center construction.

"What is different about generative AI is the power density it requires. Fundamentally, it is just computing, but a generative AI training cluster might consume seven or eight times more energy than a typical computing workload," says Noman Bashir, lead author of the impact paper, a Computing and Climate Impact Fellow at the MIT Climate and Sustainability Consortium (MCSC) and a postdoc in the Computer Science and Artificial Intelligence Laboratory (CSAIL).

Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, partly driven by the demands of generative AI.
Globally, the electricity consumption of data centers rose to 460 terawatt-hours in 2022. That would have made data centers the 11th-largest electricity consumer in the world, between Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours), according to the Organization for Economic Co-operation and Development. By 2026, the electricity consumption of data centers is expected to approach 1,050 terawatt-hours, which would bump data centers up to fifth place on the global list, between Japan and Russia.

While not all data center computation involves generative AI, the technology has been a major driver of increasing energy demands.

"The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants," says Bashir.

The power needed to train and deploy a model like OpenAI's GPT-3 is difficult to ascertain. In a 2021 research paper, scientists from Google and the University of California at Berkeley estimated that the training process alone consumed 1,287 megawatt-hours of electricity (enough to power about 120 average U.S. homes for a year), generating about 552 tons of carbon dioxide.

While all machine-learning models must be trained, one issue unique to generative AI is the rapid fluctuation in energy use that occurs over different phases of the training process, Bashir explains. Power grid operators must have a way to absorb those fluctuations to protect the grid, and they usually employ diesel-based generators for that task.

Increasing impacts from inference

Once a generative AI model is trained, the energy demands don't disappear. Each time the model is used, perhaps by an individual asking ChatGPT to summarize an email, the computing hardware that performs those operations consumes energy. Researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search.
"But an everyday user doesn't think too much about that," says Bashir. "The ease of use of generative AI interfaces and the lack of information about the environmental impacts of my actions mean that, as a user, I don't have much incentive to cut back on my use of generative AI."

With traditional AI, energy usage is split fairly evenly between data processing, model training, and inference, which is the process of using a trained model to make predictions on new data. However, Bashir expects the electricity demands of generative AI inference to eventually dominate, since these models are becoming ubiquitous in so many applications, and the electricity needed for inference will increase as future versions of the models become larger and more complex.

Plus, generative AI models have an especially short shelf life, driven by rising demand for new AI applications. Companies release new models every few weeks, so the energy used to train prior versions goes to waste, Bashir adds. New models often consume more energy for training, since they usually have more parameters than their predecessors.

While the electricity demands of data centers may be getting the most attention in the research literature, the amount of water consumed by these facilities has environmental impacts as well. Chilled water is used to cool a data center by absorbing heat from computing equipment. It has been estimated that for each kilowatt-hour of energy a data center consumes, it needs about two liters of water for cooling, says Bashir.

"Just because this is called 'cloud computing' doesn't mean the hardware lives in the cloud. Data centers are present in our physical world, and because of their water usage they have direct and indirect implications for biodiversity," he says.

The computing hardware inside data centers brings its own, less direct environmental impacts.
While it is difficult to estimate how much power is needed to manufacture a GPU, a type of powerful processor that can handle intensive generative AI workloads, it would be more than what is needed to produce a simpler CPU, because the fabrication process is more complex. A GPU's carbon footprint is compounded by the emissions related to material and product transport. There are also environmental implications of obtaining the raw materials used to fabricate GPUs, which can involve dirty mining procedures and the use of toxic chemicals for processing.

Market research firm TechInsights estimates that the three major producers (NVIDIA, AMD, and Intel) shipped 3.85 million GPUs to data centers in 2023, up from about 2.67 million in 2022. That number is expected to have increased by an even greater percentage in 2024.

The industry is on an unsustainable path, but there are ways to encourage responsible development of generative AI that supports environmental objectives, Bashir says. He, Olivetti, and their MIT colleagues argue that this will require a comprehensive consideration of all the environmental and societal costs of generative AI, as well as a detailed assessment of the value in its perceived benefits.

"We need a more contextual way of systematically and comprehensively understanding the implications of new developments in this space. Due to the speed at which there have been improvements, we haven't had a chance to catch up with our abilities to measure and understand the tradeoffs," Olivetti says.
[2]
Generative AI: Uncovering its environmental and social costs
A recent commentary article by researchers from Northwestern University, Harvard University, and The University of Texas at San Antonio highlights the significant but overlooked environmental and social impacts of Generative Artificial Intelligence (GenAI). Published in Environmental Science and Ecotechnology, the research underscores the urgent need for sustainable practices and ethical governance as GenAI technologies proliferate.

The study reveals the environmental toll of GenAI development, with hardware production such as GPUs and data centers consuming vast resources. Mining rare metals like cobalt and tantalum for these systems contributes to deforestation, water pollution, and soil degradation. Data centers, essential for GenAI operations, are projected to consume over 8% of U.S. electricity by 2030, further straining energy grids. Additionally, GenAI systems generate substantial e-waste, exacerbating global pollution challenges.

On the social front, the study highlights inequities in GenAI's production and use. Labor concerns range from child exploitation in cobalt mining to underpaid workers training AI systems under precarious conditions. Unequal access to GenAI deepens the global digital divide, privileging industrialized nations and English speakers over marginalized communities.

The researchers advocate for immediate action to mitigate these impacts. Proposed measures include energy-efficient AI training, sustainable hardware designs, improved labor conditions, and inclusive governance frameworks. Transparency from developers and policymakers is essential, with recommendations for mandatory reporting of GenAI's environmental and social footprint.

"This study sheds light on the hidden costs of GenAI and calls for collective action to address them," said lead author Mohammad Hosseini. The findings provide a roadmap for fostering responsible and equitable AI development globally.
[3]
Businesses are slowly waking up to the environmental effects of Gen AI
Businesses are becoming increasingly aware of artificial intelligence's environmental impacts when it comes to the power and natural resources required to operate data centers, new research has found.

A surge in AI interest and usage has put AI data centers in the firing line in recent years: their hunger for large amounts of electricity, as well as water for cooling, has left critics questioning the effectiveness of the developing technology compared with its environmental cost.

Research from Capgemini revealed that nearly half (48%) of surveyed executives attributed rising greenhouse gas emissions to their generative AI projects. Although companies are increasingly aware of AI's environmental impacts, quantifying them is more of a challenge. Only 12% of organizations measure their GenAI carbon footprint, and sustainability ranks low when it comes to influencing a company's choice of model.

There is also a notable reliance on third-party vendors, which reduces the amount of control companies have over their emissions. More than three in four use pre-trained models, compared with just 4% who build their own.

Some organizations are seeking to use smaller models to reduce their environmental impacts, and others are exploring renewable energy options to further reduce emissions. The key challenge, however, is that sustainability is currently a low priority for executives: only one in five see it as a key factor. Model providers also scarcely disclose data regarding their environmental impact, and bundled data sets make it hard to identify specific impacts.

Besides raising sustainability's ranking as a decision-making factor, Capgemini says businesses should consider using smaller, task-specific models to lower energy consumption. Businesses can also improve the sustainability of their infrastructure by choosing efficient hardware and green data centers. The report also calls for further governance on the ethical and sustainable use of generative AI.
"If we want Gen AI to be a force for sustainable business value, there needs to be a market discussion around data collaboration, drawing up industry-wide standards around how we account for the environmental footprint of AI, so business leaders are equipped to make more informed, responsible business decisions, and mitigate these impacts," noted Capgemini Head of Global Sustainability Services and Corporate Responsibility, Cyril Garcia.
As generative AI technologies rapidly advance, concerns grow about their significant environmental impact, from energy consumption to e-waste generation. This story explores the challenges and potential solutions for sustainable AI development.
As generative AI technologies continue to advance at a rapid pace, researchers and industry experts are sounding the alarm about the significant environmental impacts associated with their development and deployment. The computational power required to train and run these sophisticated models is placing unprecedented demands on energy resources and infrastructure.
The electricity consumption of data centers, which are crucial for training and running generative AI models, has seen a dramatic increase. In 2022, global data center electricity consumption reached 460 terawatt-hours, making data centers collectively equivalent to the 11th-largest electricity consumer in the world [1]. This figure is projected to more than double to 1,050 terawatt-hours by 2026, potentially making data centers the fifth-largest global electricity consumer.
The training process for large language models like GPT-3 is particularly energy-intensive. A 2021 study estimated that training GPT-3 alone consumed 1,287 megawatt-hours of electricity, generating approximately 552 tons of carbon dioxide [1]. This level of energy consumption is comparable to powering 120 average U.S. homes for a year.
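The homes-per-year comparison follows from simple unit conversion. As a rough sanity check (assuming an average U.S. home uses about 10,700 kWh per year, an EIA-style ballpark that is not stated in the article itself):

```python
# Sanity check: GPT-3 training energy expressed as U.S. homes powered for a year.
# Assumption: ~10,700 kWh/year for an average U.S. household (not from the article).
TRAINING_MWH = 1287          # estimated GPT-3 training consumption (figure from the article)
HOME_KWH_PER_YEAR = 10_700   # assumed average annual U.S. household usage

training_kwh = TRAINING_MWH * 1000          # convert MWh to kWh
homes_for_a_year = training_kwh / HOME_KWH_PER_YEAR
print(round(homes_for_a_year))              # prints 120
```

The result matches the article's "about 120 average U.S. homes" figure, which suggests the study used a similar household-consumption baseline.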
Beyond electricity, the environmental impact of generative AI extends to water usage and hardware production. Significant amounts of water are required to cool the hardware used in AI model training and deployment, potentially straining local water supplies and ecosystems [1]. The increased demand for high-performance computing hardware also contributes to indirect environmental impacts through manufacturing and transportation.
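Combining the roughly two liters of cooling water per kilowatt-hour cited in the MIT article with the GPT-3 training estimate gives an illustrative back-of-envelope figure for the water draw of a single training run (an order-of-magnitude sketch, not a measured value):

```python
# Back-of-envelope: cooling water implied by a GPT-3-scale training run.
# Assumes ~2 liters of cooling water per kWh, the estimate Bashir cites.
WATER_L_PER_KWH = 2
TRAINING_MWH = 1287  # estimated GPT-3 training energy (figure from the article)

water_liters = TRAINING_MWH * 1000 * WATER_L_PER_KWH  # MWh -> kWh -> liters
print(f"{water_liters / 1e6:.1f} million liters")     # prints 2.6 million liters
```

Actual water use varies widely with cooling technology, climate, and facility design, so this figure only indicates scale.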
Researchers from Northwestern University, Harvard University, and The University of Texas at San Antonio have highlighted the social costs associated with generative AI development. These include labor concerns such as child exploitation in cobalt mining and underpaid workers involved in AI system training [2]. The study also points out that unequal access to generative AI technologies may exacerbate the global digital divide.
Recent research by Capgemini indicates that businesses are becoming more aware of the environmental impacts of AI. Nearly half (48%) of surveyed executives attributed rising greenhouse gas emissions to their generative AI projects [3]. However, only 12% of organizations currently measure their generative AI carbon footprint, and sustainability ranks low in influencing model selection decisions.
To address these challenges, experts are calling for several measures: energy-efficient training techniques and smaller, task-specific models; sustainable hardware design and a shift toward efficient, green data centers; improved labor conditions across the AI supply chain; inclusive governance frameworks; and mandatory, transparent reporting of generative AI's environmental and social footprint.
As generative AI continues to evolve, balancing innovation with environmental sustainability remains a critical challenge. Industry leaders, policymakers, and researchers must collaborate to develop and implement sustainable practices that mitigate the environmental impact of this transformative technology while harnessing its potential benefits.
© 2025 TheOutpost.AI All rights reserved