Mythic Raises $125 Million to Challenge Nvidia with 100x More Efficient AI Chips


Palo Alto-based Mythic has secured $125 million in funding to develop analog processing units that it says can cut AI energy use by up to 100 times compared with GPUs. The round was led by DCVC with backing from SoftBank, Honda, and Lockheed Martin, and the AI chip startup plans to deploy its compute-in-memory technology across data centers, automotive systems, robotics, and defense applications.

Mythic Secures $125 Million to Advance Energy-Efficient AI Chip Technology

Mythic, a Palo Alto-based AI chip startup, has raised $125 million in a funding round led by deep tech-focused venture capital firm DCVC [2]. The round attracted significant backing from NEA, Atreides Management, SoftBank Group Corp., Honda Motor Co., and Lockheed Martin Corp., bringing the company's total outside funding to more than $175 million [1]. The capital injection comes as the 13-year-old company restructures under chief executive officer Taner Ozcelik, an Nvidia veteran who previously served as a vice president and general manager at the graphics card giant [2].

Source: Bloomberg

Analog Processing Units Promise Revolutionary Performance Gains

Mythic's approach centers on analog processing units designed to cut AI energy use by up to 100 times compared with traditional GPUs [2]. Unlike standard processors, which represent information as individual electrical signals corresponding to 1 or 0, Mythic's APU represents data as fluctuations in a single, continuous electrical signal. The company claims its chip can run AI models with 100 times the performance per watt of traditional graphics cards, a point Ozcelik frames as decisive: "Energy efficiency will define the future of AI computing everywhere" [2]. According to Mythic, its current architecture delivers 120 trillion operations per second per watt [2].
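As a rough illustration of what those figures imply, the short Python sketch below converts the claimed 120 trillion operations per second per watt into energy per operation. The GPU baseline here is a hypothetical round number chosen only to mirror the 100x claim, not a measured value.

# Back-of-envelope arithmetic from the figures quoted above. The APU number
# is Mythic's claim; the GPU baseline is an assumed comparison point.
APU_TOPS_PER_WATT = 120.0    # claimed 120 trillion operations per second per watt
GPU_TOPS_PER_WATT = 1.2      # hypothetical baseline, picked to match the ~100x claim

# Energy per operation in joules is the reciprocal of (ops per second per watt).
apu_joules_per_op = 1.0 / (APU_TOPS_PER_WATT * 1e12)
gpu_joules_per_op = 1.0 / (GPU_TOPS_PER_WATT * 1e12)

print(f"APU: ~{apu_joules_per_op * 1e15:.1f} femtojoules per operation")            # ~8.3 fJ
print(f"GPU baseline: ~{gpu_joules_per_op * 1e15:.0f} femtojoules per operation")   # ~833 fJ
print(f"Implied efficiency ratio: {gpu_joules_per_op / apu_joules_per_op:.0f}x")    # 100x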

Compute-in-Memory Architecture Reduces Energy Loss

The energy-efficient AI chips leverage a compute-in-memory architecture that rethinks how artificial intelligence processors handle data. Mythic's APU is built from memory circuits that not only store information but also process it, with calculations carried out using resistors that inhibit electron flow. This analog in-memory computing design combines memory and processing in a single plane, reducing energy loss during data movement, which the company identifies as the primary source of power consumption in current AI systems [2]. Steve Jurvetson of Future Ventures noted that Mythic's approach unifies computation and memory "as in the brain," improving efficiency [2].
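The general idea behind resistive in-memory computing can be sketched numerically: weights sit in the array as conductances, inputs arrive as voltages, and the summed current on each output wire is the multiply-accumulate result. The Python below is a toy model of that physics under those assumptions, not a description of Mythic's actual circuitry.

import numpy as np

# Toy model of an analog in-memory multiply-accumulate. Weights are stored
# as conductances (1 / resistance); inputs are applied as voltages. Ohm's
# law gives the per-cell current (I = G * V) and Kirchhoff's current law
# sums the currents on each shared output wire, so the array itself does
# the math without moving data to a separate processing unit.
conductances = np.array([[0.20, 0.80, 0.50],
                         [0.60, 0.10, 0.30]])   # siemens; stand-in weight values
input_voltages = np.array([1.0, 0.5, 0.25])     # volts; stand-in activations

cell_currents = conductances * input_voltages   # elementwise I = G * V
output_currents = cell_currents.sum(axis=1)     # current summed per output wire

print(output_currents)   # each entry is one analog multiply-accumulate result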

Source: AIM

Starlight Platform Targets Edge Systems and Data Centers

Mythic introduced Starlight, a sub-one-watt sensing platform that integrates its chips into image sensors [2]. The device contains multiple APUs that together consume less than one watt of power and can be embedded in edge systems such as robots to improve the quality of the data their image sensors collect. The system improves signal extraction in low-light conditions and targets defense, automotive, and robotics use cases [2]. Beyond edge deployments, Mythic sees customers deploying its silicon in data centers, where the company says testing showed APU-powered servers processing up to 750 times more tokens per second than graphics cards.

Complementing Rather Than Replacing GPU Infrastructure

Despite positioning itself to take on Nvidia in the lucrative market for artificial intelligence processors, Ozcelik emphasized that Mythic aims to complement GPUs rather than replace them entirely [1][2]. "Much as GPUs became the accelerated computer of choice next to CPUs, our APUs will become the accelerated computer of choice next to GPUs," he stated [2]. The company claims its chips can run large language models with up to one trillion parameters without requiring the high-speed interconnects used by GPU clusters [2]. Mythic provides a software toolkit that optimizes AI models using quantization and can further boost an LLM's performance by retraining it for the APU.
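The toolkit's internals are not described in detail, so the Python below sketches a generic post-training weight quantization step of the kind the passage refers to; the symmetric int8 scheme and function names are illustrative assumptions, not Mythic's SDK.

import numpy as np

# Generic post-training weight quantization. The symmetric int8 scheme and
# the function names are illustrative assumptions, not Mythic's SDK.
def quantize_int8(weights):
    """Map float weights to int8 values plus a per-tensor scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights, e.g. to check quantization error."""
    return q.astype(np.float32) * scale

weights = np.random.randn(4, 4).astype(np.float32)   # stand-in layer weights
q, scale = quantize_int8(weights)
max_error = np.abs(weights - dequantize(q, scale)).max()
print(f"max round-trip error: {max_error:.4f}")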

Strategic Manufacturing and Commercial Deployment Plans

Mythic's chips are manufactured in the United States and allied countries using standard semiconductor processes [2]. The company plans to use the new capital to expand production, mature its software development kit, and pursue commercial deployments in AI inference markets across data centers, automotive systems, robotics, and defense [2]. Aaron Jacobson, partner at NEA, said the platform "collapses today's limits on energy and cost" and gives the company scope to scale [2]. The funding will also support the company's efforts to rebuild its architecture, software stack, and strategy following its recent restructuring [2].

Source: SiliconANGLE
