2 Sources
[1]
A breakthrough system to reduce AI electricity consumption - Softonic
Artificial intelligence models require massive computing power, leading to a significant rise in global energy consumption. With AI applications expanding rapidly, researchers have been searching for ways to make their training processes more efficient. Scientists at the Technical University of Munich (TUM) have now developed a new system that reduces AI training time by a factor of 100, making the process significantly more energy-efficient.

AI models, particularly large language models and neural networks, rely on extensive computational resources to adjust their parameters through numerous iterations. Traditionally, these adjustments follow a randomized approach, requiring high levels of processing power and prolonged training periods. This process is both costly and environmentally unsustainable.

The team at TUM, led by Felix Dietrich, introduced a novel probabilistic training method that optimizes the selection of key parameters. Instead of iteratively refining all parameters across multiple training cycles, the system identifies and prioritizes critical points where large and rapid value changes occur. This allows the AI to converge on an optimal solution much faster, consuming a fraction of the energy typically required.

This breakthrough could have a profound impact on data centers, which currently account for approximately 1% of Germany's total energy consumption; by 2025, German data centers are projected to consume 22 billion kWh. If widely adopted, this new training method could drastically cut energy costs and reduce the environmental footprint of AI-driven technologies.

The study, presented at the NeurIPS 2024 conference, shows that accuracy levels remain comparable to those of traditional training methods, proving that efficiency does not come at the cost of performance. As AI continues to grow, such advancements will be essential in ensuring a sustainable future.
[2]
How on-device AI could help us to cut AI's energy demand
An energy credit trading system could incentivize businesses to adopt energy-efficient AI.

The age of IT devices driven by artificial intelligence (AI) has arrived, much like the revolutions brought by personal computers and mobile devices. Each has brought undeniable and lasting impacts on society. The evolution of computing has unfolded in key phases, from personal computers to mobile devices, and AI is now following the same trajectory, with rapid advancements in AI chips driving a new technological wave.

AI chip power consumption has surged, particularly in data centres, which require vast energy resources to process large-scale data in real time. AI-driven data centres, primarily powered by graphics processing units (GPUs), now consume more electricity than entire nations, including South Africa and Indonesia. Projections suggest this energy usage will nearly double, from 260 terawatt-hours in 2024 to 500 terawatt-hours in 2027. A single query in OpenAI's ChatGPT, for example, consumes 2.9 Wh of electricity, roughly 10 times that of a Google search. This massive power demand has raised environmental concerns, emphasizing the need for sustainable industry practices. The manufacturing sector alone accounts for over 40% of global power consumption, making it a focal point for energy efficiency improvements.

Governments are taking action. For example, Singapore has introduced regulations limiting data centre capacity due to energy shortages. The country currently operates more than 70 data centres, accounting for 60% of Southeast Asia's total data centre capacity. It stopped further approvals between 2019 and 2022, with those under review subject to capacity restraints.

AI leaders acknowledge this challenge. At the 2024 World Economic Forum in Davos, Switzerland, OpenAI CEO Sam Altman stated: "An energy breakthrough is necessary for future artificial intelligence, which will consume vastly more power than people have expected." AI is essential to the future, automating repetitive tasks and driving progress.
However, for AI to be widely adopted across industries, its energy consumption must be drastically reduced. Delivering sustainable AI requires both a refinement of the technology, which is already underway, and creative, ambitious policy-making. Startups such as Groq, DeepSeek and DeepX are pioneering energy-efficient AI technologies that could shape an AI-driven, super-intelligent society.

Software optimization can help reduce power usage in servers, but on-device AI, where AI processing happens directly on the device rather than in a cloud data centre, is the most promising solution to this challenge. Currently, AI systems rely heavily on cloud computing, which consumes significant energy to transmit data between edge devices and data centres. Additionally, high-performance GPUs and central processing units (CPUs) in data centres require substantial power. By contrast, on-device AI processes data locally, eliminating the need for energy-intensive data transmission. AI chips designed for on-device processing prioritize energy efficiency over sheer computing power, resulting in a 100 to 1,000-fold reduction in energy consumption per AI task compared to cloud-based AI. As a result, on-device AI is emerging as a game-changing technology that maximizes AI adoption while minimizing environmental impact. Governments must act quickly to create policies that accelerate its adoption.

To promote energy-efficient AI, a global "energy credit trading system" could provide financial incentives for companies that adopt low-power AI solutions. Under this system, businesses implementing energy-saving AI could trade energy usage credits, benefiting financially while reducing their environmental footprint. DeepX presented this proposal at the 2024 World Economic Forum Annual Meeting in Dalian, China. A similar precedent exists in the electric vehicle (EV) industry.
In the 2010s, government subsidies and tax incentives drove rapid EV adoption, spurring growth in battery technology and charging infrastructure. AI energy credits could play a similar role in managing the power consumption of AI models, ensuring a sustainable future. The AI era is here to stay, but its success depends on making AI technology energy-efficient. Governments, businesses and innovators must work together to ensure that AI's rapid growth does not come at an unsustainable cost. With the right policies and technologies, we can create an AI-powered future that benefits society and the planet.
Researchers develop innovative methods to significantly reduce AI's energy consumption, potentially revolutionizing the industry's environmental impact and operational costs.
As artificial intelligence (AI) applications expand rapidly, the technology's massive energy consumption has become a pressing concern. AI models, particularly large language models and neural networks, require extensive computational resources, leading to a significant rise in global energy usage. Currently, AI-driven data centers consume more electricity than entire nations, including South Africa and Indonesia [2].
The scale of this issue is staggering:
- AI data centers' electricity use is projected to nearly double, from 260 terawatt-hours in 2024 to 500 terawatt-hours in 2027 [2].
- A single ChatGPT query consumes about 2.9 Wh of electricity, roughly 10 times that of a Google search [2].
- Data centers already account for approximately 1% of Germany's total energy consumption, projected to reach 22 billion kWh by 2025 [1].
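Taking the reported per-query figure at face value, a back-of-the-envelope calculation shows how quickly per-query energy compounds. The daily query volume below is an assumed round number for illustration only, not a reported statistic:

```python
# Per-query energy figures from source [2]; query volume is an assumption.
WH_PER_CHATGPT_QUERY = 2.9      # reported figure, in watt-hours
WH_PER_GOOGLE_SEARCH = 0.3      # implied by the "roughly 10x" comparison
QUERIES_PER_DAY = 100_000_000   # illustrative assumption

daily_kwh = WH_PER_CHATGPT_QUERY * QUERIES_PER_DAY / 1_000
annual_gwh = daily_kwh * 365 / 1_000_000
ratio = WH_PER_CHATGPT_QUERY / WH_PER_GOOGLE_SEARCH

print(f"~{ratio:.0f}x a Google search; "
      f"{daily_kwh:,.0f} kWh/day, {annual_gwh:.1f} GWh/year")
```

At 100 million queries a day, the assumed load alone exceeds 100 GWh per year, which is why per-task efficiency dominates the discussion below.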
Scientists at the Technical University of Munich (TUM) have developed a groundbreaking system that could revolutionize AI training processes. Led by Felix Dietrich, the team introduced a novel probabilistic training method that optimizes the selection of key parameters [1].
Key features of the new system [1]:
- A probabilistic approach that optimizes the selection of key parameters, rather than iteratively refining all parameters across multiple training cycles.
- Targeted identification of critical points where large and rapid value changes occur, allowing the model to converge on an optimal solution much faster.
- A fraction of the energy consumption of conventional training, with accuracy that remains comparable.
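The general idea of replacing iterative refinement of all parameters with sampling can be sketched on a toy problem: fix a hidden layer by sampling its weights once, then solve only the output layer in closed form with a single least-squares step. This is a minimal illustration of sampling-based training in that spirit, not the TUM team's actual implementation; the network width, scale factor, and task are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = sin(3x) from 200 samples.
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0])

# Sample the hidden-layer weights once, instead of refining them iteratively.
n_hidden = 100
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)

# Fixed nonlinear features from the sampled layer (the scale 4.0 spreads the
# tanh transition regions across the input range).
H = np.tanh(4.0 * (X @ W) + b)

# Only the linear output layer is fitted: one least-squares solve replaces
# many gradient-descent epochs over all parameters.
coef, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = H @ coef
mse = float(np.mean((pred - y) ** 2))
print(f"train MSE: {mse:.2e}")
```

The single linear solve is where the energy saving comes from in this style of method: no backpropagation through the hidden layer is ever performed.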
Another promising approach to reducing AI's energy footprint is on-device AI processing. This method involves performing AI tasks directly on the device rather than in cloud data centers, offering several advantages [2]:
- Data is processed locally, eliminating energy-intensive transmission between edge devices and data centers.
- AI chips designed for on-device processing prioritize energy efficiency over sheer computing power.
- The result is a reported 100 to 1,000-fold reduction in energy consumption per AI task compared with cloud-based AI.
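The claimed reduction range compounds quickly at fleet scale. A small sketch makes the comparison concrete; only the 100 to 1,000-fold range comes from the source, while the per-task cloud figure and task count are assumptions for illustration:

```python
# Only the reduction range is from source [2]; other figures are assumed.
CLOUD_WH_PER_TASK = 2.0           # assumed cloud-inference energy per task
REDUCTION_FACTORS = (100, 1_000)  # range cited for on-device AI
TASKS = 1_000_000                 # one million inference tasks

cloud_kwh = CLOUD_WH_PER_TASK * TASKS / 1_000
for factor in REDUCTION_FACTORS:
    on_device_kwh = cloud_kwh / factor
    print(f"{factor:>5}x reduction: "
          f"{cloud_kwh:,.0f} kWh -> {on_device_kwh:,.2f} kWh")
```

Under these assumptions, a million cloud tasks costing 2,000 kWh shrink to between 2 and 20 kWh when run on-device.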
Companies like Groq, DeepSeek, and DeepX are pioneering these energy-efficient AI technologies, which could shape the future of AI-driven societies [2].
Recognizing the urgency of the situation, governments and industry leaders are taking action:
- Singapore has restricted data-center capacity due to energy shortages, halting new approvals between 2019 and 2022 [2].
- OpenAI CEO Sam Altman has said an energy breakthrough will be necessary for future AI [2].
- DeepX has proposed a global energy credit trading system that would financially reward companies adopting low-power AI [2].
These initiatives aim to balance the rapid growth of AI technology with environmental sustainability, drawing parallels to how government incentives drove electric vehicle adoption in the 2010s [2].
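The proposed credit-trading mechanism can be made concrete with a toy ledger in which verified energy savings mint tradable credits. Everything here is hypothetical: the class, the 1,000 kWh-per-credit rate, and the company names are illustrative, as no such scheme yet exists:

```python
# Toy sketch of an energy-credit ledger; purely hypothetical mechanism.
from dataclasses import dataclass, field


@dataclass
class CreditLedger:
    balances: dict = field(default_factory=dict)

    def mint(self, company: str, kwh_saved: float,
             kwh_per_credit: float = 1_000.0) -> None:
        """Convert verified energy savings into tradable credits."""
        earned = kwh_saved / kwh_per_credit
        self.balances[company] = self.balances.get(company, 0.0) + earned

    def trade(self, seller: str, buyer: str, credits: float) -> None:
        """Transfer credits between companies, rejecting overdrafts."""
        if self.balances.get(seller, 0.0) < credits:
            raise ValueError("insufficient credits")
        self.balances[seller] -= credits
        self.balances[buyer] = self.balances.get(buyer, 0.0) + credits


ledger = CreditLedger()
ledger.mint("EfficientAI Co", kwh_saved=50_000)      # mints 50 credits
ledger.trade("EfficientAI Co", "LegacyCloud Inc", 20)
print(ledger.balances)
```

As with EV incentives, the design question is who verifies the savings and sets the credit rate; the ledger itself is the simple part.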
As AI continues to evolve, these advancements in energy efficiency will be crucial in ensuring a sustainable future for the technology. The combined efforts of researchers, policymakers, and industry leaders promise to shape an AI-powered world that benefits both society and the environment.
Summarized by Navi