Arm Unveils Lumex: Next-Gen Mobile Chips Optimized for On-Device AI

Reviewed by Nidhi Govil

Arm introduces its new Lumex platform, featuring advanced CPU and GPU designs optimized for on-device AI processing. The platform promises significant performance and energy-efficiency gains for smartphones and other mobile devices.

Arm Introduces Lumex: A New Era for Mobile AI

Arm, the UK-based chip designer, has unveiled its latest mobile platform, called Lumex, marking a significant leap in on-device AI capabilities for smartphones and other mobile devices [1][2][3]. This new generation of chip designs aims to revolutionize how intelligence is built, scaled, and delivered in mobile computing [5].

Advanced CPU Architecture

The Lumex platform introduces a new CPU range with four distinct tiers [1][4]:

  1. C1-Ultra: Highest-end core, with 25% greater performance than its predecessor
  2. C1-Premium: The second-most-powerful core
  3. C1-Pro: 12% greater efficiency compared to previous models
  4. C1-Nano: Smallest-footprint core, built for maximum power efficiency

This tiered approach allows for flexible configurations to suit various device types, from high-end smartphones to wearables [4].

AI Performance Boost

At the heart of Lumex is the Armv9.3 C1 CPU cluster, featuring built-in Scalable Matrix Extension version 2 (SME2) units. This architecture enables [3][5]:

  • 5x improvement in AI performance
  • 3x greater efficiency compared to the previous generation
  • 30% increase in standard benchmark performance
  • 15% speed-up in apps and 12% lower power use in daily workloads

Enhanced GPU Capabilities

The Mali G1-Ultra GPU complements the CPU advancements with [1][4]:

  • 20% better performance than its predecessor
  • Twice the ray tracing performance
  • 20% faster inference for AI processing

On-Device AI Focus

Arm is positioning the CPU as the universal AI engine, citing the lack of standardization in Neural Processing Units (NPUs). The Lumex platform aims to run AI workloads efficiently on-device, reducing reliance on cloud processing [2][3].
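Because the choice between on-device and cloud processing is often made at run time, an application can first check whether the CPU actually advertises SME2 support. Below is a minimal sketch, assuming an Arm64 Linux or Android system whose kernel is recent enough to export an "sme2" feature flag in /proc/cpuinfo; the flag name and path are assumptions about the platform, not something specified in Arm's announcement.

```python
def cpu_feature_flags(path="/proc/cpuinfo"):
    """Collect the CPU feature flags the kernel reports (Arm64 Linux/Android)."""
    flags = set()
    with open(path) as f:
        for line in f:
            if line.lower().startswith("features"):
                flags.update(line.split(":", 1)[1].split())
    return flags


if __name__ == "__main__":
    flags = cpu_feature_flags()
    # "sme2" is the hwcap name recent Arm64 kernels use for Scalable Matrix
    # Extension 2; older kernels may not report it even on capable hardware.
    if "sme2" in flags:
        print("SME2 reported: prefer the on-device inference path")
    else:
        print("No SME2 reported: fall back to generic kernels or the cloud")
```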

Software Ecosystem

To support developers, Arm is launching a complete Android 16-ready software stack alongside Lumex. This includes [2][5]:

  • SME2-enabled KleidiAI libraries
  • Telemetry tools for performance analysis and bottleneck identification
  • Integration with popular frameworks such as PyTorch, llama.cpp, LiteRT, and ONNX Runtime (see the sketch below)
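To give a sense of how little application code is involved, here is a minimal sketch of running a model through ONNX Runtime's Python API. The model file and input shape are placeholders, and the idea that the CPU execution provider transparently picks up Arm-optimized (KleidiAI/SME2) kernels depends on how the runtime was built; it is an assumption for illustration, not a claim from Arm's announcement.

```python
import numpy as np
import onnxruntime as ort

# Placeholder model file; any ONNX model with a single dense input will do.
MODEL_PATH = "model.onnx"

# Run the graph on the CPU execution provider; on Arm builds, optimized
# kernels (where available) are used without any change to this code.
session = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])

# Build a dummy input matching the model's first declared input name.
input_name = session.get_inputs()[0].name
x = np.zeros((1, 224, 224, 3), dtype=np.float32)  # assumed input shape

outputs = session.run(None, {input_name: x})
print("output shape:", outputs[0].shape)
```

On a phone, the same model would more likely be driven through LiteRT or ONNX Runtime's Java/Kotlin bindings, but the division of labor is the same: application code stays at the framework level, and any SME2 acceleration happens inside the libraries.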

Market Impact and Future Outlook

The Lumex platform is expected to power a wide range of devices, from flagship smartphones to PCs and tablets. With many popular Google apps already SME2-enabled, the stage is set for improved on-device AI features when next-generation hardware becomes available [5].

As demand for generative AI services such as ChatGPT grows, Arm's focus on efficient on-device processing could significantly impact the mobile industry. The company expects Lumex to appear in smartphones and other devices later this year or early next year, with chips potentially running at clock speeds upwards of 4 GHz [1][2].

This advancement in mobile AI processing sets the stage for a new era of smart devices, promising faster performance, improved power efficiency, and enhanced user experiences across a wide range of applications.
