Curated by THEOUTPOST
On Thu, 22 Aug, 12:02 AM UTC
2 Sources
[1]
The sustainability crisis in the AI industry: Here's how change can happen
Working together to ensure sustainability in the AI industry

The rapid growth of AI has transformed many industries and produced remarkable new technology, but it comes with a serious problem: a huge increase in energy use. This rise in energy consumption isn't just a tech issue; it's a major environmental concern that everyone in the AI field needs to tackle. As AI continues to develop, the goal isn't just smarter models that solve more users' problems; it's making sure these advancements don't harm the planet. The question isn't just what AI can do for us, but how we can ensure its advancements are sustainable for the planet.

Gartner predicts that without sustainable AI practices, AI will consume more energy than the human workforce by 2025, significantly offsetting carbon-zero gains. According to a recent report from the Federal Energy Regulatory Commission, data center demand in the US is expected to reach 35 gigawatts by 2030 -- the equivalent of powering about 26 million homes. (For context, 1 GW is enough energy to power about 750,000 homes.)

In regions like Salt Lake City, where gigantic energy users including Meta and Google are building data centers, there has been a noticeable shift back to coal as more data centers are needed to support AI workloads. Plans to retire coal-fired power plants early are being abandoned, with closure dates pushed as far back as 2042 and clean-energy resources dialed back. This concerning shift underscores the complex trade-offs between technological advancement and sustainability, especially as AI is on track to drive a 160% increase in data center power demand by 2030. While some tech giants like Google, Amazon, and Microsoft have committed to powering their data centers with 100% renewable energy by 2030, the current landscape still sees significant carbon footprints from AI operations.
Based on public data from Meta, one of its data centers in Iowa uses as much power annually as 7 million laptops running eight hours every day. According to a study from Hugging Face and Carnegie Mellon University, creating an image using generative AI takes as much energy as fully charging a smartphone. ChatGPT queries consume nearly 10 times as much electricity as a Google search. For a startup, training its AI models in the US consumes roughly 1,000 tons of CO2 in a year -- the equivalent of 1,000 Paris-to-New York flights.

AI needs an energy breakthrough. The industry is exploring solutions like nuclear fusion to speed up the energy transition away from fossil fuels, but until this breakthrough happens, people and businesses in AI need to take individual steps toward change. AI-led businesses face challenges in technology, financial investment, and stakeholder engagement when trying to adopt sustainable AI practices. Transitioning to sustainable AI solutions often requires substantial upfront investment in energy-efficient technologies and renewable energy sources. According to an IBM sustainability study, while the majority of executives (76%) agree that sustainability is central to their business, nearly half (47%) struggle to fund sustainability investments. Furthermore, only 31% of organizations report integrating sustainability data extensively into their operational improvements, indicating a gap between sustainability goals and actionable steps. The shift toward green data centers and sustainable hardware requires not only capital but also a strategic overhaul of existing infrastructure. Companies building AI face complex decisions about upgrading to more efficient systems while managing ongoing operational costs. This, combined with the rapid pace of technological change, can make it difficult for businesses to keep up.
Many AI companies in early stages of development may deprioritize sustainability due to the immediate pressures of competition, technological development, and finding product-market fit. But as demand for AI grows, it's becoming essential for businesses to integrate sustainability into their decision-making processes to achieve environmental goals and drive innovation. The entire industry plays a part in shaping a more sustainable future for AI: the readiness, adoption, and development of green AI practices depend on the maturity of the market and stakeholder involvement.

Venture capitalists can evaluate the environmental impact of their portfolios, request impact statements from companies, and share sustainability best practices to inspire more businesses to take action. Enterprise companies and SMBs using AI can request environmental impact statements from providers to evaluate their sustainability efforts and commitment. Companies developing AI products can be selective about the type of AI model they use: recent studies show that specialized AI models consume less energy than general-purpose ones, and the more frugal a model is, the faster it can execute, improving user experience while reducing energy consumption. Companies developing AI models can partner with green data centers like Genesis Cloud to leverage renewable energy sources and minimize environmental impact. They can ask cloud providers for the Power Usage Effectiveness (PUE) scores of their data centers and even use an open-source tool to measure their cloud carbon footprint. Internally, they can develop more frugal specialist AI models to lower carbon emissions; externally, they can publish their models' CO2 emissions, as Meta did for Llama 3.1. Cloud providers like Amazon Web Services (AWS), Google Cloud, Scaleway, and Genesis can help reduce the carbon footprint of AI by building infrastructure that maximizes energy efficiency and by being transparent.
This involves sharing their PUE scores -- including the energy consumed to cool the data center and the CO2 emissions from building it -- and potentially offering green pricing options. Data centers can also relay the demand for energy-efficient chips to hardware providers. Hardware providers can develop energy-efficient chips; NVIDIA, for example, claims its new "superchip" can boost performance for generative AI tasks by 30 times while consuming 25 times less energy, aided by new chip-cooling techniques. Public funding bodies can participate by integrating carbon footprint assessments into their decision-making processes, and regulators can evolve toward holding all players in the AI ecosystem accountable.

As an industry, we need to take a systemic approach of shared responsibility to reduce the environmental impact of AI at every level of the ecosystem. Keeping up with today's fast-paced AI innovation is crucial for businesses to stay competitive, but as the market matures, sustainability should play a bigger part in the decision-making process. Take the first step by choosing an AI provider that's already taking action to reduce its energy consumption -- ask for its environmental impact statements, or whether it measures its company's carbon footprint. It's up to all of us to lead the way and advocate for a more sustainable future for AI.
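The PUE metric mentioned above has a simple definition worth spelling out: Power Usage Effectiveness is total facility energy divided by the energy delivered to IT equipment, so a perfect score is 1.0 and everything above it is overhead for cooling, power conversion, and the like. The sketch below uses hypothetical figures chosen only to show the arithmetic.

```python
# Power Usage Effectiveness (PUE): total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every watt goes to computing; real data centers
# typically land somewhere above that. Figures below are hypothetical.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Compute PUE; lower is better, 1.0 is the theoretical minimum."""
    return total_facility_kwh / it_equipment_kwh

overhead = pue(1200.0, 1000.0)  # 1.2, i.e. 20% overhead beyond the IT load
```

A provider advertising a PUE of 1.2 is therefore claiming that only a fifth of its energy bill goes to anything other than the servers themselves, which is why the article suggests asking for this number directly.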
[2]
Light bulbs have energy ratings -- so why can't AI chatbots?
As millions of people increasingly use generative artificial intelligence (AI) models for tasks ranging from searching the Web to creating music videos, there is a growing urgency about minimizing the technology's energy footprint. The worrying environmental cost of AI is obvious even at this nascent stage of its evolution. A report published in January by the International Energy Agency estimated that the electricity consumption of data centres could double by 2026, and suggested that improvements in efficiency will be crucial to moderate this expected surge. Some tech-industry leaders have sought to downplay the impact on the energy grid. They suggest that AI could enable scientific advances that might result in a reduction in planetary carbon emissions. Others have thrown their weight behind yet-to-be-realized energy sources such as nuclear fusion. However, as things stand, the energy demands of AI are keeping ageing coal power plants in service and significantly increasing the emissions of companies that provide the computing power for this technology. Given that the clear consensus among climate scientists is that the world faces a 'now or never' moment to avoid irreversible planetary change, regulators, policymakers and AI firms must address the problem immediately. For a start, policy frameworks that encourage energy or fuel efficiency in other economic sectors can be modified and applied to AI-powered applications. Efforts to monitor and benchmark AI's energy requirements -- and the associated carbon emissions -- should be extended beyond the research community. Giving the public a simple way to make informed decisions would bridge the divide that now exists between the developers and the users of AI models, and could eventually prove to be a game changer. This is the aim of an initiative called the AI Energy Star project, which we describe here and recommend as a template that governments and the open-source community can adopt. 
The project is inspired by the US Environmental Protection Agency's Energy Star ratings. These provide consumers with a transparent, straightforward measure of the energy consumption associated with products ranging from washing machines to cars. The programme has helped to achieve more than 4 billion tonnes of greenhouse-gas reductions over the past 30 years, the equivalent of taking almost 30 million petrol-powered cars off the road per year. The goal of the AI Energy Star project is similar: to help developers and users of AI models to take energy consumption into account. By testing a sufficiently diverse array of AI models for a set of popular use cases, we can establish an expected range of energy consumption, and then rate models depending on where they lie on this range, with those that consume the least energy being given the highest rating. This simple system can help users to choose the most appropriate models for their use case quickly. Greater transparency will, hopefully, also encourage model developers to consider energy use as an important parameter, resulting in an industry-wide reduction in greenhouse-gas emissions. Our initial benchmarking focuses on a suite of open-source models hosted on Hugging Face, a leading repository for AI models. Although some of the widely used chatbots released by Google and OpenAI are not yet part of our test set, we hope that private firms will participate in benchmarking their proprietary models as consumer interest in the topic grows. A single AI model can be used for a variety of tasks -- ranging from summarization to speech recognition -- so we curated a data set to reflect those diverse use cases. For instance, for object detection, we turned to COCO 2017 and Visual Genome -- both established evaluation data sets used for research and development of AI models -- as well as the Plastic in River data set, composed of annotated examples of floating plastic objects in waterways. 
We settled on ten popular ways in which most consumers use AI models, for example, as a question-answering chatbot or for image generation. We then drew a representative sample from the task-specific evaluation data set. Our objective was to measure the amount of energy consumed in responding to 1,000 queries. The open-source CodeCarbon package was used to track the energy required to compute the responses. The experiments were carried out by running the code on state-of-the-art NVIDIA graphics processing units, reflecting cloud-based deployment settings using specialized hardware, as well as on the central processing units of commercially available computers. In our initial set of experiments, we evaluated more than 200 open-source models from the Hugging Face platform, choosing the 20 most popular (by number of downloads) for each task. Our initial findings show that tasks involving image classification and generation generally result in carbon emissions thousands of times larger than those involving only text (see 'AI's energy footprint'). Creative industries considering large-scale adoption of AI, such as film-making, should take note. Within our sample set, the most efficient question-answering model used approximately 0.1 watt-hours (roughly the energy needed to power a 25W incandescent light bulb for 5 minutes) to process 1,000 questions. The least efficient image-generation model, by contrast, required as much as 1,600 Wh to create 1,000 high-definition images -- that's the power necessary to fully charge a smartphone approximately 70 times, amounting to a 16,000-fold difference. As millions of people integrate AI models into their workflow, what tasks they deploy them on will increasingly matter. 
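The comparison above can be sanity-checked with back-of-the-envelope arithmetic. The smartphone battery capacity used below (about 23 Wh) is an assumption, picked to be consistent with the article's "approximately 70 times" claim; typical phone batteries sit in the 15-25 Wh range.

```python
# Back-of-the-envelope check of the figures reported above. The phone
# battery capacity is an assumed value, not from the source.
qa_wh = 0.1              # most efficient QA model, Wh per 1,000 questions
image_wh = 1600.0        # least efficient image model, Wh per 1,000 HD images
phone_battery_wh = 23.0  # assumed smartphone battery capacity

fold_difference = image_wh / qa_wh          # the 16,000-fold gap
full_charges = image_wh / phone_battery_wh  # roughly 70 full charges
```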
In general, supervised tasks such as question answering or text classification -- in which models are provided with a set of options to choose from or a document that contains the answer -- are much more energy efficient than are generative tasks that rely on the patterns learnt from the training data to produce a response from scratch. Moreover, summarization and text-classification tasks use relatively little power, although it must be noted that nearly all use cases involving large language models are more energy intensive than a Google search (querying an AI chatbot once uses up about ten times the energy required to process a web search request). Such rankings can be used by developers to choose more-efficient model architectures to optimize for energy use. This is already possible, as shown by our as-yet-unpublished tests on models of similar sizes (determined on the basis of the number of connections in the neural network). For a specific task such as text generation, a language model called OLMo-7B, created by the Allen Institute in Seattle, Washington, drew 43 Wh to generate 1,000 text responses, whereas Google's Gemma-7B and one called Yi-6B LLM, from the Beijing-based company 01.AI, used 53 Wh and 147 Wh, respectively. With a range of options already in existence, star ratings based on rankings such as ours could nudge model developers towards lowering their energy footprint. On our part, we will be launching an AI Energy Star leaderboard website, along with a centralized testing platform that can be used to compare and benchmark models as they come out. The energy thresholds for each star rating will shift if industry moves in the right direction. That is why we intend to update the ratings routinely and offer users and organizations a useful metric, other than performance, to evaluate which AI models are the most suitable. To achieve meaningful progress, it is essential that all stakeholders take proactive steps to ensure the sustainable growth of AI. 
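The rating scheme described above -- placing each model within an observed energy range and awarding the most stars to the most frugal -- can be sketched as follows. The five-star scale and the linear bucketing are assumptions for illustration; the project's actual thresholds may differ, and will shift over time as the article notes.

```python
# Hypothetical star-rating sketch: map a model's measured energy (Wh per
# 1,000 responses) onto a 1-5 star scale within the observed range.
def star_rating(energy_wh: float, low: float, high: float, stars: int = 5) -> int:
    if energy_wh <= low:
        return stars  # most efficient model in the range gets the top rating
    if energy_wh >= high:
        return 1      # least efficient gets the bottom rating
    frac = (energy_wh - low) / (high - low)  # 0.0 = best, 1.0 = worst
    return max(1, stars - int(frac * stars))

# Text-generation figures reported in the article:
measured = {"OLMo-7B": 43.0, "Gemma-7B": 53.0, "Yi-6B": 147.0}
low, high = min(measured.values()), max(measured.values())
ratings = {name: star_rating(wh, low, high) for name, wh in measured.items()}
```

Under this toy scheme OLMo-7B and Gemma-7B, which sit near the bottom of the range, earn top ratings, while Yi-6B, at more than three times their energy draw, lands at the bottom.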
The following recommendations provide some specific guidance to the variety of players involved.

Get developers involved. AI researchers and developers are at the core of innovation in this field. By considering sustainability throughout the development and deployment cycle, they can significantly reduce AI's environmental impact from the outset. To make it standard practice to measure and publicly share the energy use of models (for example, in a 'model card' setting out information such as training data, evaluations of performance and metadata), it's essential to get developers on board.

Drive the market towards sustainability. Enterprises and product developers play a crucial part in the deployment and commercial use of AI technologies. Whether creating a standalone product, enhancing existing software or adopting AI for internal business processes, these groups are often key decision makers in the AI value chain. By demanding energy-efficient models and setting procurement standards, they can drive the market towards sustainable solutions. For instance, they could set baseline expectations (such as requiring that models achieve at least two stars according to the AI Energy Star scheme) or support sustainable-AI legislation.

Disclose energy consumption. AI users are on the front lines, interacting with AI products in various applications. A preference for energy-efficient solutions could send a powerful market signal, encouraging developers and enterprises to prioritize sustainability. Users can nudge the industry in the right direction by opting for models that publicly disclose energy consumption. They can also use AI products more conscientiously, avoiding wasteful and unnecessary use.

Strengthen regulation and governance. Policymakers have the authority to treat sustainability as a mandatory criterion in AI development and deployment.
With recent examples of legislation calling for AI impact transparency in the European Union and the United States, policymakers are already moving towards greater accountability. This can initially be voluntary, but eventually governments could regulate AI system deployment on the basis of the efficiency of the underlying models. Regulators can adopt a bird's-eye view, and their input will be crucial for creating global standards. It might also be important to establish independent authorities to track changes in AI energy consumption over time. Clearly, a lot more needs to be done to put a suitable regulatory regime in place before mass AI adoption becomes a reality (see go.nature.com/4dfp1wb). The AI Energy Star project is a small beginning and could be refined further. Currently, we do not account for energy overheads expended on model storage and networking, as well as data-centre cooling, which can be measured only with direct access to cloud facilities. This means that our results represent the lower bound of the AI models' overall energy consumption, which is likely to double if the associated overhead is taken into account. How energy use translates into carbon emissions will also depend on where the models are ultimately deployed, and the energy mix available in that city or town. The biggest challenge, however, will remain the impenetrability of what is happening in the proprietary-model ecosystem. Government regulators are starting to demand access to AI models, especially to ensure safety. Greater transparency is urgently needed because proprietary models are widely deployed in user-facing settings. The world is now at a key inflection point. The decisions being made today will reverberate for decades as AI technology evolves alongside an increasingly unstable planetary climate. We hope that the Energy Star project serves as a valuable starting point to send a strong sustainability demand throughout the AI value chain.
The rapid growth of AI technology has raised concerns about its environmental sustainability. This story explores the energy consumption of AI models, their carbon footprint, and potential solutions for a greener AI industry.
As artificial intelligence (AI) continues to advance at an unprecedented pace, concerns about its environmental impact are coming to the forefront. The AI industry is facing a sustainability crisis, with the energy consumption of large language models (LLMs) and other AI systems becoming a significant contributor to carbon emissions [1].
Recent studies have shed light on the enormous energy requirements of training and running AI models. For instance, training a single large language model can consume as much electricity as 100 US homes use in an entire year [2]. This energy consumption translates into a substantial carbon footprint, with estimates suggesting that the information and communications technology sector, which includes AI, could account for up to 20% of global electricity demand by 2030 [1].
The carbon emissions associated with AI are not just limited to the energy used in training and running models. The entire lifecycle of AI systems, including the manufacturing of hardware and the cooling of data centers, contributes to their environmental impact. Researchers have found that the carbon footprint of training a single AI model can be equivalent to the lifetime emissions of five cars [2].
One of the major obstacles in addressing the sustainability crisis in AI is the lack of standardized methods for measuring its environmental impact. Different studies have produced varying estimates of energy consumption and carbon emissions, making it difficult to assess the true scale of the problem [2]. This inconsistency highlights the need for more transparent and uniform reporting practices within the AI industry.
Despite these challenges, there are several promising approaches to reducing the environmental impact of AI:
Efficient AI design: Developing more energy-efficient algorithms and model architectures can significantly reduce power consumption [1].
Green data centers: Utilizing renewable energy sources and improving cooling systems in data centers can lower the carbon footprint of AI operations [2].
Carbon-aware computing: Implementing practices that consider the carbon intensity of the electricity grid when scheduling computationally intensive tasks [1].
Transparency and reporting: Encouraging companies to disclose the environmental impact of their AI systems can drive accountability and innovation in sustainable practices [2].
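The carbon-aware computing idea in the list above can be sketched as a simple scheduler that defers a batch job to the hour with the lowest forecast grid carbon intensity. The forecast values below are invented for illustration; a real system would pull live data from a grid-intensity service.

```python
# Hypothetical carbon-aware scheduling sketch: run a heavy batch job at
# the hour with the lowest forecast grid carbon intensity. All intensity
# values (gCO2/kWh) below are made up for illustration.
def greenest_hour(forecast: dict) -> int:
    """Return the hour whose forecast carbon intensity is lowest."""
    return min(forecast, key=forecast.get)

forecast = {0: 420.0, 6: 380.0, 12: 190.0, 18: 310.0}  # hour -> gCO2/kWh
run_at = greenest_hour(forecast)  # 12: the midday solar dip in this example
```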
Some tech giants are already taking steps towards more sustainable AI practices. For example, Google has committed to using carbon-free energy for all its operations by 2030 [1]. Additionally, initiatives like the Green Software Foundation are working to establish standards for measuring and reducing the environmental impact of software, including AI systems [2].
As the AI industry continues to grow, balancing technological advancement with environmental responsibility will be crucial. By addressing the sustainability crisis head-on, the AI sector has the potential to not only reduce its own environmental impact but also contribute to solving global climate challenges through innovative applications of machine learning and data analysis.
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved