New AI-Powered Chip Halves Energy Consumption for Large Language Models

Oregon State University researchers have developed a chip that uses AI principles to reduce energy consumption by 50% in large language model applications, potentially revolutionizing data center efficiency.

Breakthrough in AI Chip Technology

Researchers at Oregon State University's College of Engineering have made a significant advance in addressing the energy consumption challenges posed by large language models (LLMs) such as Gemini and GPT-4. Doctoral student Ramin Javadi and Associate Professor Tejasvi Anand have developed a new chip that reduces energy consumption by 50% compared with traditional designs.

The Energy Dilemma in AI

The rapid growth of data-intensive AI applications has led to a surge in energy consumption in data centers. Professor Anand, who directs the Mixed Signal Circuits and Systems Lab at OSU, explains the core issue: "The energy required to transmit a single bit is not being reduced at the same rate as the data rate demand is increasing. That's what is causing data centers to use so much power."
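To see why that matters, it helps to spell out the relationship Anand is pointing at: the power a link draws is roughly the energy spent per bit multiplied by the data rate. The sketch below uses purely hypothetical numbers, not figures from the article or the OSU work, to show how total power keeps climbing when data rates grow faster than energy per bit falls.

```python
# Back-of-the-envelope link-power arithmetic. All numbers are hypothetical
# illustrations, not measurements from the OSU chip or the article.
PJ = 1e-12  # one picojoule, in joules


def link_power_watts(energy_per_bit_pj: float, data_rate_gbps: float) -> float:
    """Power drawn by one wireline link: energy per bit times bits per second."""
    return energy_per_bit_pj * PJ * data_rate_gbps * 1e9


# Suppose the per-link data rate quadruples while energy per bit only halves.
today = link_power_watts(energy_per_bit_pj=5.0, data_rate_gbps=100)  # 0.50 W
later = link_power_watts(energy_per_bit_pj=2.5, data_rate_gbps=400)  # 1.00 W

print(f"today: {today:.2f} W per link, later: {later:.2f} W per link")
```

Even with energy per bit halved, quadrupling the data rate still doubles the power each link draws, and data centers run enormous numbers of such links in parallel; that mismatch is the squeeze Anand describes.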

AI-Powered Solution

The innovative chip leverages AI principles to optimize signal processing and data recovery. Javadi elaborates on the technology:

"Large language models need to send and receive tremendous amounts of data over wireline, copper-based communication links in data centers, and that requires significant energy. One solution is to develop more efficient wireline communication chips"

2

.

Technical Innovation

The chip's efficiency stems from its novel approach to data correction:

  1. High-speed data transmission often results in corruption at the receiver end.
  2. Conventional systems use power-hungry equalizers to clean up this data.
  3. The new chip employs on-chip AI classifiers to recognize and correct errors more efficiently; a toy sketch of the idea follows below.
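Neither source describes the internals of the OSU receiver, but the general idea of deciding bits with a classifier instead of first cleaning the waveform with an equalizer can be illustrated in miniature. The sketch below is a toy model only: the channel, the nearest-centroid classifier, the window length, and every numeric value are assumptions made for illustration, not details of the published chip.

```python
# Toy illustration of classification-based bit recovery on a distorted link.
# The channel model, classifier, and all parameters are hypothetical; they are
# not taken from the OSU design.
import numpy as np

rng = np.random.default_rng(0)


def channel(bits: np.ndarray) -> np.ndarray:
    """Hypothetical copper link: each symbol smears into later samples (ISI),
    with thermal noise added on top."""
    symbols = 2.0 * bits - 1.0  # map {0, 1} -> {-1.0, +1.0}
    smeared = np.convolve(symbols, [0.7, 0.25, 0.05])[: len(symbols)]
    return smeared + rng.normal(scale=0.1, size=len(symbols))


def train_classifier(rx: np.ndarray, bits: np.ndarray, win: int = 3) -> dict:
    """Learn one centroid per bit value from labeled windows of received samples."""
    windows = np.lib.stride_tricks.sliding_window_view(rx, win)
    labels = bits[win - 1 :]  # each window is labeled by the bit it ends on
    return {b: windows[labels == b].mean(axis=0) for b in (0, 1)}


def classify(rx: np.ndarray, centroids: dict, win: int = 3) -> np.ndarray:
    """Decide each bit by which centroid its sample window is closer to."""
    windows = np.lib.stride_tricks.sliding_window_view(rx, win)
    d0 = np.linalg.norm(windows - centroids[0], axis=1)
    d1 = np.linalg.norm(windows - centroids[1], axis=1)
    return (d1 < d0).astype(int)


bits = rng.integers(0, 2, 20_000)
rx = channel(bits)

centroids = train_classifier(rx[:5_000], bits[:5_000])
decoded = classify(rx, centroids)
ber = np.mean(decoded != bits[2:])  # windows end on bit index 2, 3, ...
print(f"bit error rate of the toy classifier: {ber:.4f}")
```

A conventional receiver would instead run the distorted samples through a power-hungry equalizer filter to reconstruct a clean waveform before slicing it into bits; the classifier route skips that reconstruction and maps distorted samples straight to bit decisions. A real chip would realize the decision with compact on-chip circuits rather than floating-point arithmetic, but the framing is the same.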

Recognition and Support

The groundbreaking research has garnered attention and support from key organizations:

  • Presented at the IEEE Custom Integrated Circuits Conference in Boston
  • Javadi received the Best Student Paper Award at the conference
  • Project supported by the Defense Advanced Research Projects Agency, the Semiconductor Research Corporation, and the Center for Ubiquitous Connectivity


Future Prospects

The OSU team is already at work on the next iteration of the chip, and Javadi and Anand anticipate even greater gains in energy efficiency. This ongoing research could have far-reaching implications for the future of AI infrastructure and data center operations.

Potential Impact

If successfully implemented at scale, this technology could significantly reduce the carbon footprint of AI operations, making large language models more sustainable and cost-effective to run. It may also pave the way for more widespread adoption of AI technologies in various sectors by addressing one of the key challenges in AI deployment – energy consumption.
