6 Sources
[1]
As AI becomes part of everyday life, it brings a hidden climate cost
Marissa Loewen first started using artificial intelligence in 2014 as a project management tool. She has autism and ADHD and said it helped immensely with organizing her thoughts. "We try to use it conscientiously though because we do realize that there is an impact on the environment," she said.

Her personal AI use isn't unique anymore. Now it's a feature in smartphones, search engines, word processors and email services. Every time someone uses AI, it uses energy that is often generated by fossil fuels. That releases greenhouse gases into the atmosphere and contributes to climate change. And it's getting harder to live without it.

The climate cost

AI is largely powered by data centers that field queries, store data and deploy information. As AI becomes ubiquitous, the power demand for data centers increases, leading to grid reliability problems for people living nearby. "Since we are trying to build data centers at a pace where we cannot integrate more renewable energy resources into the grid, most of the new data centers are being powered by fossil fuels," said Noman Bashir, computing and climate impact fellow with MIT's Climate and Sustainability Consortium.

The data centers also generate heat, so they rely on fresh water to stay cool. Larger centers can consume up to 5 million gallons (18.9 million liters) a day, according to an article from the Environmental and Energy Study Institute. That's roughly the daily water demand of a town of up to 50,000 people.

It's difficult to imagine, because for most users the impact isn't visible, said Sasha Luccioni, AI and climate lead at the AI company Hugging Face. "In one of my studies, we found that generating a high-definition image uses as much energy as charging half of your phone. And people were like, 'That can't be right, because when I use Midjourney (a generative AI program), my phone battery doesn't go down,'" she said.
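The article's water comparison can be sanity-checked with simple arithmetic. The liters-per-gallon conversion factor below is a standard constant, not a figure from the article:

```python
# Rough sanity check on the article's data-center water figures.
# Assumption: ~3.785 liters per US gallon (standard conversion).

DAILY_USE_GALLONS = 5_000_000   # upper-end daily use for a large data center
TOWN_POPULATION = 50_000        # town size cited for comparison

liters = DAILY_USE_GALLONS * 3.785
per_person_gallons = DAILY_USE_GALLONS / TOWN_POPULATION

print(f"{liters / 1e6:.1f} million liters/day")  # matches the article's 18.9 million
print(f"{per_person_gallons:.0f} gallons per person per day implied for the town")
```

The implied 100 gallons per person per day is in line with typical US residential water use, so the town comparison is internally consistent.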
Jon Ippolito, professor of new media at the University of Maine, said tech companies are constantly working to make chips and data centers more efficient, but that does not mean AI's environmental impact will shrink. That's because of a problem called the Jevons paradox. "The cheaper resources get, the more we tend to use them anyway," he said. When cars replaced horses, he said, commute times didn't shrink. We just traveled farther.

Quantifying AI's footprint

How much these programs contribute to global warming depends on many factors, including how warm it is outside the data center processing the query, how clean the grid is and how complex the AI task is. Information on AI's contributions to climate change is incomplete and contradictory, so exact numbers are hard to come by. But Ippolito tried anyway. He built an app that compares the environmental footprint of different digital tasks based on the limited data he could find.

It estimates that a simple AI prompt, such as "Tell me the capital of France," uses 23 times more energy than the same question typed into Google without its AI Overview feature. "Instead of working with existing materials, it's writing them from scratch. And that takes a lot more compute," Luccioni said.

And that's just for a simple prompt. A complex prompt, such as "Tell me the number of gummy bears that could fit in the Pacific Ocean," uses 210 times more energy than the AI-free Google search. A 3-second video, according to Ippolito's app, uses 15,000 times as much energy, equivalent to leaving an incandescent lightbulb on for more than a year.

That's a big impact, but it doesn't mean our tech footprints were carbon-free before AI entered the scene. Watching an hour of Netflix, for example, uses more energy than a complex AI text prompt. An hour on Zoom with 10 people uses 10 times that much.
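The multipliers above can be turned into rough absolute numbers given a baseline. The 0.3 Wh figure for a plain Google search below is an illustrative assumption, not from the article; only the 23x, 210x and 15,000x multipliers come from Ippolito's app:

```python
# Convert Ippolito's relative multipliers into rough energy estimates.
# ASSUMPTION: the baseline watt-hours per plain Google search; the article
# reports only the multipliers, not an absolute baseline.
BASELINE_SEARCH_WH = 0.3

MULTIPLIERS = {
    "plain Google search": 1,
    "simple AI prompt": 23,
    "complex AI prompt": 210,
    "3-second AI video": 15_000,
}

def estimate_wh(task: str, baseline_wh: float = BASELINE_SEARCH_WH) -> float:
    """Rough watt-hour estimate for a task, scaled from the search baseline."""
    return MULTIPLIERS[task] * baseline_wh

for task in MULTIPLIERS:
    print(f"{task}: ~{estimate_wh(task):g} Wh")
```

Under this assumed baseline, a simple AI prompt lands below 10 Wh while a 3-second video reaches kilowatt-hour scale, which is why the per-task differences matter far more than any single query.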
"It's not just about making people conscious of AI's impact, but also all of these digital activities we take for granted," he said. Limit tech, limit tech's climate impact Ippolito said he limits his use of AI when he can. He suggests using human-captured images instead of AI-generated ones. He tells the AI to stop generating as soon as he has the answer to avoid wasting extra energy. He requests concise answers and he begins Google searches by typing "-ai" so it doesn't provide an AI overview for queries where he doesn't need it. Loewen has adopted the same approach. She said she tries to organize her thoughts into one AI query instead of asking it a series of iterative questions. She also built her own AI that doesn't rely on large data centers, which saves energy in the same way watching a movie you own on a DVD is far less taxing than streaming one. "Having something local on your computer in your home allows you to also control your use of the electricity and consumption. It allows you to control your data a little bit more," she said. Luccioni uses Ecosia, which is a search engine that uses efficient algorithms and uses profits to plant trees to minimize the impact of each search. Its AI function can also be turned off. ChatGPT also has a temporary chat function so the queries you send to the data center get deleted after a few weeks instead of taking up data center storage space. But AI is only taking up a fraction of the data center's energy use. Ippolito estimates roughly 85% is data collection from sites like TikTok and Instagram, and cryptocurrency. His answer there: make use of screen time restrictions on your phone to limit time scrolling on social media. Less time means less personal data collected, less energy and water used, and fewer carbon emissions entering the atmosphere. "If you can do anything that cuts a data center out of the equation, I think that's a win," Ippolito said. 
___ The Associated Press' climate and environmental coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
[2]
As AI becomes part of everyday life, it brings a hidden climate cost
[3]
As AI becomes part of everyday life, it brings a hidden climate cost
As AI becomes a part of everyday life, its massive energy and water needs are coming more into focus.
[4]
As AI Becomes a Part of Everyday Life, It Brings a Hidden Climate Cost
[5]
As AI Becomes Part of Everyday Life, It Brings a Hidden Climate Cost
[6]
As AI becomes part of everyday life, it brings a hidden climate cost
Marissa Loewen first started using artificial intelligence in 2014 as a project management tool. She has autism and ADHD and said it helped immensely with organizing her thoughts. "We try to use it conscientiously though because we do realize that there is an impact on the environment," she said. Her personal AI use isn't unique anymore. Now it's a feature in smartphones, search engines, word processors and email services. Every time someone uses AI, it uses energy that is often generated by fossil fuels. That releases greenhouse gases into the atmosphere and contributes to climate change. And it's getting harder to live without it. AI is largely powered by data centres that field queries, store data and deploy information. As AI becomes ubiquitous, the power demand for data centres increases, leading to grid reliability problems for people living nearby. "Since we are trying to build data centres at a pace where we cannot integrate more renewable energy resources into the grid, most of the new data centres are being powered by fossil fuels," said Noman Bashir, computing and climate impact fellow with MIT's Climate and Sustainability Consortium. The data centres also generate heat, so they rely on fresh water to stay cool. Larger centres can consume up to 5 million gallons (18.9 million liters) a day, according to an article from the Environmental and Energy Study Institute. That's roughly the same as the daily water demand for a town of up to 50,000 people. It's difficult to imagine, because for most users the impact isn't visible, said AI and Climate Lead Sasha Luccioni with the AI company, Hugging Face. "In one of my studies, we found that generating a high-definition image uses as much energy as charging half of your phone. And people were like, 'That can't be right, because when I use Midjourney (a generative AI program), my phone battery doesn't go down,'" she said. 
Jon Ippolito, professor of new media at the University of Maine, said tech companies are constantly working to make chips and data centers more efficient, but that does not mean AI's environmental impact will shrink. That's because of a phenomenon called the Jevons Paradox. "The cheaper resources get, the more we tend to use them anyway," he said. When cars replaced horses, he said, commute times didn't shrink. We just traveled farther.

How much these programs contribute to global warming depends on many factors, including how warm it is outside the data center that's processing the query, how clean the grid is and how complex the AI task is. Information sources on AI's contributions to climate change are incomplete and contradictory, so exact numbers are difficult to pin down. But Ippolito tried anyway. He built an app that compares the environmental footprint of different digital tasks based on the limited data he could find.

It estimates that a simple AI prompt, such as, "Tell me the capital of France," uses 23 times more energy than the same question typed into Google without its AI Overview feature. "Instead of working with existing materials, it's writing them from scratch. And that takes a lot more compute," Luccioni said. And that's just for a simple prompt. A complex prompt, such as, "Tell me the number of gummy bears that could fit in the Pacific Ocean," uses 210 times more energy than the AI-free Google search. A 3-second video, according to Ippolito's app, uses 15,000 times as much energy. It's equivalent to turning on an incandescent lightbulb and leaving it on for more than a year.

The impact is big, but that doesn't mean our tech footprints were carbon-free before AI entered the scene. Watching an hour of Netflix, for example, uses more energy than a complex AI text prompt. An hour on Zoom with 10 people uses 10 times that much.
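The comparison logic behind an app like Ippolito's can be sketched in a few lines. This is an illustrative sketch, not his actual code: the per-task multipliers are the ones reported in the article, while the baseline energy figure for a plain Google search is an assumption for illustration only.

```python
# Illustrative sketch, not Ippolito's actual app: rank digital tasks by
# energy use relative to a plain, AI-free Google search.
baseline_wh = 0.3  # ASSUMED watt-hours for one AI-free Google search

# Multipliers relative to the baseline, as reported in the article
multipliers = {
    "google_search": 1,
    "simple_ai_prompt": 23,    # "Tell me the capital of France"
    "complex_ai_prompt": 210,  # the gummy-bears-in-the-Pacific question
    "ai_video_3s": 15_000,     # a 3-second generated video
}

def estimated_wh(task: str) -> float:
    """Rough energy estimate for one task, in watt-hours."""
    return baseline_wh * multipliers[task]

for task, mult in multipliers.items():
    print(f"{task}: {mult}x baseline, ~{estimated_wh(task):g} Wh")
```

Because every estimate is just a multiple of one assumed baseline, the relative rankings hold even if the baseline figure is off.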
"It's not just about making people conscious of AI's impact, but also all of these digital activities we take for granted," he said.

Ippolito said he limits his use of AI when he can. He suggests using human-captured images instead of AI-generated ones. He tells the AI to stop generating as soon as he has the answer, to avoid wasting extra energy. He requests concise answers, and he begins Google searches by typing "-ai" so it doesn't provide an AI Overview for queries where he doesn't need it.

Loewen has adopted the same approach. She said she tries to organize her thoughts into one AI query instead of asking a series of iterative questions. She also built her own AI that doesn't rely on large data centers, which saves energy in the same way watching a movie on a DVD you own is far less taxing than streaming one. "Having something local on your computer in your home allows you to also control your use of the electricity and consumption. It allows you to control your data a little bit more," she said.

Luccioni uses Ecosia, a search engine that uses efficient algorithms and puts its profits toward planting trees to minimize the impact of each search. Its AI function can also be turned off. ChatGPT also has a temporary chat function, so the queries you send to the data center get deleted after a few weeks instead of taking up storage space.

But AI accounts for only a fraction of data centers' energy use. Ippolito estimates roughly 85% is data collection from sites like TikTok and Instagram, and cryptocurrency. His answer there: use your phone's screen time restrictions to limit scrolling on social media. Less time means less personal data collected, less energy and water used, and fewer carbon emissions entering the atmosphere. "If you can do anything that cuts a data center out of the equation, I think that's a win," Ippolito said.
As artificial intelligence becomes an integral part of daily life, its significant energy consumption and environmental impact are coming under scrutiny. This article explores the hidden climate costs associated with AI usage and data centers, and suggests ways to mitigate these effects.
Artificial Intelligence (AI) has become an integral part of our daily lives, from smartphones to search engines. However, this technological advancement comes with a hidden environmental cost. As Marissa Loewen, an early AI adopter, notes, "We try to use it conscientiously though because we do realize that there is an impact on the environment" 1.
AI's environmental impact primarily stems from the data centers that power it. These centers consume vast amounts of energy, often from fossil fuels, contributing to greenhouse gas emissions. Noman Bashir, a computing and climate impact fellow at MIT's Climate and Sustainability Consortium, explains, "Since we are trying to build data centers at a pace where we cannot integrate more renewable energy resources into the grid, most of the new data centers are being powered by fossil fuels" 1.
Moreover, these data centers require significant water resources for cooling. Larger centers can consume up to 5 million gallons (18.9 million liters) of water daily, equivalent to the water demand of a town of up to 50,000 residents 2.
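The town comparison is easy to sanity-check: dividing the reported daily consumption by the town's population gives a per-person figure. The gallon and population figures below are the article's; the per-capita interpretation is a back-of-envelope calculation, not a reported statistic.

```python
# Back-of-envelope check of the article's water comparison: a large data
# center can use up to 5 million gallons of cooling water a day, roughly
# the daily demand of a town of up to 50,000 people.
daily_center_gallons = 5_000_000  # figure from the article
town_population = 50_000          # figure from the article

gallons_per_person_per_day = daily_center_gallons / town_population
print(gallons_per_person_per_day)  # 100.0
```

An implied 100 gallons per person per day is in the same range as commonly cited U.S. per-person daily water use, so the comparison is plausible on its face.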
Jon Ippolito, a professor of new media at the University of Maine, has developed an app to compare the environmental footprint of various digital tasks. His findings are startling:
A simple AI prompt, such as "Tell me the capital of France," uses an estimated 23 times more energy than the same question typed into Google without its AI Overview feature.
A complex prompt, such as asking how many gummy bears could fit in the Pacific Ocean, uses 210 times more energy than the AI-free search.
A 3-second AI-generated video uses 15,000 times as much energy, equivalent to leaving an incandescent lightbulb on for more than a year.
Sasha Luccioni, AI and Climate Lead at Hugging Face, adds perspective: "In one of my studies, we found that generating a high-definition image uses as much energy as charging half of your phone" 1.
Despite ongoing efforts to improve the efficiency of chips and data centers, AI's environmental impact may not decrease. This is due to the Jevons Paradox, as explained by Ippolito: "The cheaper resources get, the more we tend to use them anyway" 4. This phenomenon suggests that as AI becomes more efficient and accessible, its usage and consequent energy consumption may actually increase.
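The rebound effect behind the Jevons Paradox can be made concrete with toy numbers. These figures are purely illustrative assumptions, not measurements: suppose an efficiency gain halves the energy each AI query needs, but cheaper, faster queries triple total usage.

```python
# Jevons Paradox in miniature, with purely illustrative numbers:
# hardware improvements halve the energy per AI query, but cheaper
# queries triple overall usage.
energy_per_query_wh = 7.0    # ASSUMED energy per query today
queries_per_day = 1_000_000  # ASSUMED query volume today

total_before = energy_per_query_wh * queries_per_day
total_after = (energy_per_query_wh / 2) * (queries_per_day * 3)

print(total_after / total_before)  # 1.5: up 50% despite 2x efficiency
```

The point generalizes: whenever usage grows faster than efficiency improves, total consumption rises even as each individual task gets cheaper.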
Experts suggest several ways to reduce AI's carbon footprint:
Limit AI use: Ippolito recommends using human-captured images instead of AI-generated ones and requesting concise answers to minimize energy consumption 5.
Local computing: Loewen built her own AI that doesn't rely on large data centers, similar to watching a DVD instead of streaming 5.
Efficient search engines: Luccioni uses Ecosia, which employs efficient algorithms and uses profits to plant trees 5.
Temporary storage: ChatGPT's temporary chat function deletes queries after a few weeks, reducing data center storage needs 5.
Reduce overall tech usage: Ippolito estimates that roughly 85% of data center energy use comes from data collection on social media and cryptocurrency. He suggests using screen time restrictions to limit social media use 5.
As AI continues to evolve and integrate into our daily lives, understanding and mitigating its environmental impact becomes increasingly crucial. By adopting conscious usage habits and supporting efficient technologies, we can help reduce the hidden climate cost of our growing reliance on artificial intelligence.