New L-Mul Algorithm Promises 95% Reduction in AI Energy Consumption


Researchers at BitEnergy AI have developed Linear-Complexity Multiplication (L-Mul), an algorithm that could reduce AI energy consumption by up to 95% without significant performance loss, addressing growing concerns about AI's energy demands.



Researchers at BitEnergy AI have developed a groundbreaking algorithm that could potentially slash AI energy consumption by up to 95%. The new technique, called Linear-Complexity Multiplication (L-Mul), addresses growing concerns about the escalating energy demands of artificial intelligence applications [1].

The Energy Challenge in AI

As AI applications have gone mainstream, their energy requirements have skyrocketed: ChatGPT alone consumes approximately 564 MWh per day, enough to power about 18,000 American homes. Industry projections suggest that AI could consume between 85 and 134 TWh annually by 2027, rivaling the energy consumption of Bitcoin mining [2].
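The household comparison can be cross-checked with a quick back-of-the-envelope calculation. The per-home consumption figure below is an assumption on my part (roughly the EIA's reported average for US households), not a number from the article:

```python
# Rough cross-check of the cited figures (not from the article);
# assumes ~10.7 MWh average annual electricity use per US household.
daily_mwh = 564
annual_mwh = daily_mwh * 365       # ~205,860 MWh per year
homes = annual_mwh / 10.7          # household-year equivalents
print(round(homes))                # roughly 19,000, close to the cited 18,000
```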

How L-Mul Works

The L-Mul algorithm tackles this energy challenge by reimagining how AI models handle calculations:

  1. It replaces costly floating-point multiplications with simple integer additions.
  2. Each multiplication is approximated by adding the operands' exponents and mantissa fractions, plus a small fixed correction term.
  3. This cuts the bit-level complexity of mantissa multiplication from quadratic to linear, sharply reducing the energy per operation [3].
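The core trick can be sketched in a few lines of Python. This is an illustrative version operating on ordinary floats with continuous mantissas (the published method quantizes mantissas to a few bits), and `l_m` stands in for the paper's offset exponent l(m), which is 3 for 4-bit mantissas:

```python
import math

def l_mul(x, y, l_m=3):
    """Approximate x * y in the L-Mul style: add exponents and
    mantissa fractions instead of multiplying mantissas.
    Assumes nonzero inputs; zeros would break log2 below."""
    sign = math.copysign(1.0, x) * math.copysign(1.0, y)
    ex = math.floor(math.log2(abs(x)))
    ey = math.floor(math.log2(abs(y)))
    fx = abs(x) / 2.0**ex - 1.0   # mantissa fraction in [0, 1)
    fy = abs(y) / 2.0**ey - 1.0
    # Exact product would be (1 + fx)(1 + fy) * 2^(ex + ey).
    # L-Mul drops the fx*fy cross term and adds a fixed 2^-l(m) offset.
    return sign * (1.0 + fx + fy + 2.0**-l_m) * 2.0**(ex + ey)

print(l_mul(3.0, 5.0))   # → 15.0 (exact here; generally within a few percent)
```

In fixed-width hardware the exponent and mantissa additions become plain integer adds on the packed bit patterns, which is where the energy savings come from.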

Impressive Results

Initial tests of the L-Mul algorithm have shown promising results:

  • 95% reduction in energy cost for element-wise floating-point tensor multiplications
  • 80% reduction in energy cost for dot products
  • Precision matching or exceeding current 8-bit floating-point formats in some cases
  • An average performance drop of just 0.07% across a range of AI tasks [4]
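The precision side of these results can be probed directly. The NumPy experiment below is my own illustration, not the paper's benchmark: it applies the L-Mul-style approximation element-wise to random tensors and measures the relative error against exact multiplication.

```python
import numpy as np

def l_mul_elementwise(x, y, l_m=3):
    # Element-wise L-Mul-style approximation: add exponents and mantissa
    # fractions, replacing the mantissa product with a fixed 2^-l(m) offset.
    ex = np.floor(np.log2(np.abs(x)))
    ey = np.floor(np.log2(np.abs(y)))
    fx = np.abs(x) / 2.0**ex - 1.0
    fy = np.abs(y) / 2.0**ey - 1.0
    return np.sign(x) * np.sign(y) * (1.0 + fx + fy + 2.0**-l_m) * 2.0**(ex + ey)

rng = np.random.default_rng(0)
a = rng.uniform(0.5, 8.0, size=(64, 64))
b = rng.uniform(0.5, 8.0, size=(64, 64))
rel_err = np.abs(l_mul_elementwise(a, b) - a * b) / np.abs(a * b)
print(f"mean relative error: {rel_err.mean():.3%}")   # typically a few percent
```

The small per-element error explains why downstream task accuracy barely moves: neural networks are robust to noise of this magnitude in individual multiplications.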

Potential Impact on AI Models

The L-Mul technique could have far-reaching implications for various AI applications:

  1. Transformer-based models, including large language models like GPT, could benefit significantly from L-Mul integration.
  2. Tests on popular models such as Llama, Mistral, and Gemma have shown potential accuracy gains in certain vision tasks.
  3. The algorithm's efficiency extends beyond neural networks, potentially impacting hardware design for broader energy efficiency [5].

Challenges and Future Developments

While L-Mul shows great promise, there are some challenges to overcome:

  1. The algorithm's energy savings depend on specialized hardware support, which is not yet widely available; today's GPUs are not designed for this operation.
  2. Plans for developing this hardware and associated programming APIs are underway.
  3. The response of major players in the AI hardware market, such as Nvidia, could significantly impact the adoption rate of this new technology.

Industry Implications

The introduction of L-Mul could potentially disrupt the AI hardware market:

  1. It may force major chip manufacturers to adapt their designs quickly.
  2. There's potential for new players to enter the market with L-Mul-optimized hardware.
  3. The technology could reshape how hardware is built for neural networks, potentially leading to more energy-efficient AI systems across the board.