Apple accelerates AI wearables push with smart glasses, camera AirPods, and pendant in the works

Reviewed by Nidhi Govil


Apple is ramping up development of three AI-powered wearables: smart glasses with dual cameras, AirPods equipped with infrared sensors, and a camera-enabled pendant. All three devices will rely on Visual Intelligence and deeper Siri integration, targeting releases between late 2026 and 2027 as Apple enters the AI hardware race against Meta and Google.

Apple Enters AI Hardware Race With Three New Wearables

Apple is accelerating development of a trio of AI wearables that could reshape how users interact with artificial intelligence in their daily lives. According to Bloomberg reporter Mark Gurman, the company is ramping up work on smart glasses, AirPods with cameras, and an AI pendant, all built around deeper Siri integration and the company's Visual Intelligence feature [2]. The move signals Apple's determination to compete in the AI hardware race alongside Meta, Google, and OpenAI, even as the company plays catch-up in the broader AI landscape.

Source: Geeky Gadgets

During an all-hands meeting earlier this month, CEO Tim Cook hinted at the company's ambitious plans, telling employees that Apple is working on new "categories of products" enabled by artificial intelligence [2]. "We're extremely excited about that," Cook said, adding that "the world is changing fast." The push into AI-powered wearables comes as iPhone sales remain robust but the company faces mounting pressure to demonstrate innovation beyond its core product lines, particularly after the pricey Vision Pro headset failed to resonate with consumers.

Smart Glasses Lead Apple's Wearable Vision

The smart glasses represent the most advanced offering in Apple's planned AI wearables lineup. Code-named N50, the glasses have made significant progress in recent months, with the company distributing broader sets of prototypes within its hardware engineering division [2]. Apple is targeting the start of production as early as December, ahead of a public release in 2027.

Unlike the augmented reality glasses Apple has long been rumored to develop, these AI-powered wearables will feature no display, instead relying on speakers, microphones, and camera sensors for interaction [1]. The design philosophy mirrors that of Meta's Ray-Ban glasses, which have become a hit since launching in late 2023. However, Apple aims to differentiate itself in two key areas: build quality and camera technology.

The glasses will incorporate two camera lenses: one high-resolution sensor for capturing photos and video, and another dedicated to computer vision, similar to technology used in the Vision Pro [2]. This second sensor is designed to give the device environmental context, helping it more accurately interpret its surroundings and measure distances between objects. Early prototypes connected via cable to a standalone battery pack and an iPhone, but newer versions have components embedded directly in the frame using high-end materials, including acrylic elements intended to give the glasses a premium feel.
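
How a second lens turns slightly offset views into distance is standard stereo geometry: depth is focal length times baseline divided by disparity. The sketch below is purely illustrative, since Apple hasn't disclosed how the glasses actually measure distance; the function and its parameters are invented for this example.

```swift
// Pinhole stereo relation: Z = f * B / d, where f is the focal length in
// pixels, B is the baseline between the two lenses in meters, and d is the
// disparity (pixel offset of the same point as seen by each sensor).
// Illustrative only: the glasses' actual ranging method is undisclosed.
func stereoDepth(focalLengthPx: Double, baselineM: Double, disparityPx: Double) -> Double? {
    guard disparityPx > 0 else { return nil } // zero disparity: point at infinity
    return focalLengthPx * baselineM / disparityPx
}

// Example: f = 1400 px, B = 0.12 m, d = 24 px gives Z = 7 m.
```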

Source: Beebom

Apple initially considered partnerships with popular eyewear brands, following the industry trend set by Meta's collaboration with EssilorLuxottica and Google's partnership with Warby Parker. However, the company recently decided to develop its own frames in-house in a variety of sizes and colors, leveraging its retail stores, where customers could try on the glasses and get them sized [1].

Visual Intelligence Powers the AI Experience

Visual Intelligence, Apple's take on computer vision, will serve as the defining feature across the company's AI wearables lineup [5]. First introduced on the iPhone 16 Pro and now available on the iPhone 15 Pro and later, the feature lets users analyze on-screen content and the world around them through the camera. It can recognize objects, surface information about them, answer questions, enable real-time translation, and run instant web searches.
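
For a sense of what on-device recognition looks like today, Apple's shipping Vision framework exposes a general image classifier. This is only a reference point, not Visual Intelligence's actual pipeline, which Apple hasn't documented publicly:

```swift
import CoreGraphics
import Vision

// Classify the dominant objects in a still image with Apple's shipping
// Vision framework: a rough analogue of the on-device recognition
// described above, not Visual Intelligence's own pipeline.
func classifyObjects(in image: CGImage) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // Keep only labels the model is reasonably confident about.
    return (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .map { ($0.identifier, $0.confidence) }
}
```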

During Apple's earnings call, Tim Cook heavily emphasized Visual Intelligence, calling it "one of our most popular features" that "helps users learn and do more than ever with the content on their iPhone screen" [5]. The goal is for the glasses to function as an all-day AI companion, capable of understanding what a user is seeing and doing in real time. Wearers could look at an object and ask what it is, inquire about the ingredients in a meal, or get assistance with everyday tasks [2].

Apple is also exploring more advanced uses, such as reading printed text and converting it into digital data. The technology could provide upgraded turn-by-turn directions, telling users to go past a specific landmark rather than just a certain number of feet [3]. Currently, Visual Intelligence relies on OpenAI's ChatGPT for results, though Apple is reportedly working on its own AI visual models, which could coincide with the eventual release of Siri 2.0 in 2026.
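
The text-reading capability, at least, has a shipping analogue: Apple's Vision framework already performs on-device OCR. A minimal sketch using those existing APIs (again, not Visual Intelligence's own pipeline, which isn't public):

```swift
import CoreGraphics
import Vision

// Read printed text from an image using Apple's existing Vision OCR API,
// the same class of on-device capability the glasses would need.
func recognizeText(in image: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate // favor accuracy over speed
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // Each observation carries ranked candidate strings; take the best one.
    return (request.results ?? []).compactMap { $0.topCandidates(1).first?.string }
}
```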

A potential breakthrough comes from Apple's recent $2 billion acquisition of Q.ai, a startup specializing in machine learning systems for interpreting silent voice input [4]. The company has also researched systems for interpreting micro facial movements, which would let devices understand speech without it being audible at all. If successfully integrated, this technology could address one of the biggest practical limitations of voice-assistant-powered wearables: the need to speak aloud in public or quiet environments.

AirPods With Cameras Target 2026 Launch

The AirPods update expected later this year represents the first piece in Apple's next wearable puzzle. The AirPods Pro 3 design will add infrared cameras focused on gesture tracking rather than photography [1]. The cameras should work similarly to the Vision Pro's near-range hand tracking and will likely function in the dark. These camera-equipped AirPods are envisioned as a simpler offering, with lower-resolution sensors designed to feed the AI rather than to capture photos or videos [2].

Source: Wccftech

If these arrive in 2026, they'll push some AI functions out to the earbuds and enable simple hand gestures to control music or interact with workouts. The gesture-tracking technology could also serve as a testing ground for capabilities that might eventually land on the smart glasses.
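
Apple has published no gesture API for AirPods, so anything concrete here is speculation. Purely as a hypothetical sketch, with every type and name below invented for illustration, a gesture-to-action mapping of that kind might look like:

```swift
// Hypothetical mapping of tracked hand gestures to playback and workout
// actions. None of these types exist in Apple's SDKs; they only illustrate
// how low-resolution gesture events could drive simple controls.
enum HandGesture {
    case pinch, swipeLeft, swipeRight, palmHold
}

enum DeviceAction {
    case togglePlayPause, previousTrack, nextTrack, pauseWorkout
}

func action(for gesture: HandGesture) -> DeviceAction {
    switch gesture {
    case .pinch:      return .togglePlayPause
    case .swipeLeft:  return .previousTrack
    case .swipeRight: return .nextTrack
    case .palmHold:   return .pauseWorkout
    }
}
```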

AI Pendant Remains Uncertain

Gurman reports that a camera-enabled AI pendant is also in the works, with a potential release in 2027, though there's a chance the device might not happen at all [1]. The pendant, which could be pinned to a shirt or worn as a necklace, sounds reminiscent of the failed Humane AI Pin, but it would be positioned as an iPhone accessory rather than an independent device [2].

Several other tech companies have been gravitating toward pendants and pins lately, as evidenced by announcements at CES in January. Pins could be more versatile to wear and might serve as a hedge in case consumers don't embrace glasses. Gurman's report sounds less certain about the pendant's arrival than about the glasses and the new AirPods.

Competitive Landscape Intensifies

Apple's accelerated push into AI-powered wearables comes as the competitive landscape intensifies. Meta's Ray-Ban glasses have already become a hit, demonstrating consumer appetite for camera-equipped eyewear despite privacy concerns [4]. Google is preparing its own lineup of glasses powered by Gemini, with Google I/O, scheduled for late May, potentially serving as the launch platform [1]. OpenAI is also developing a series of devices, including wearables, with the help of ex-Apple design chief Jony Ive and other former Apple executives.

Apple's recent deal with Google to have Gemini power the next wave of Apple's AI features always pointed toward wearables, because Gemini's camera-enabled live modes are central to Google's upcoming glasses [1]. Apple could use Gemini to leap ahead in the camera-driven, context-aware live AI functions that would make these new wearables work.

However, questions remain about whether Apple's approach represents genuine innovation or simply catching up to competitors. As Gizmodo notes, the Visual Intelligence use cases described (identifying food ingredients, providing navigation instructions, offering task-specific guidance) are already available on Meta's Ray-Bans and other smart glasses [3]. Computer vision also remains one of the least reliable features of current smart glasses, often getting details wrong and proving difficult to trust for daily use.

The longer-term implications extend beyond individual devices. AI is expected to change the way consumers use phones, with more activity shifting to peripherals. Apple is looking for a breakthrough with its accelerated push into wearable devices, aiming to keep users locked into its ecosystem [2]. Down the road, Apple could blend a lower-cost Vision headset in a more glasses-like form with these smaller wearables, or it could use them to explore new interfaces before tackling a new wave of wearable displays beyond the Vision Pro.

Odds are consumers won't hear anything about these plans until at least Apple's WWDC developer conference in June, though the company could offer an advance preview, as it did with the Vision Pro and the original Apple Watch [1]. With Google potentially readying its own glasses lineup soon, the race to define the next generation of AI wearables is accelerating.
