4 Sources
[1]
Sketchy rumor suggests Apple Glasses will support Vision Pro-style hand gestures
We're expecting to see the launch of an Apple Glasses product at some point next year, and a sketchy rumor suggests that it may borrow a key feature from Vision Pro. Specifically, the glasses are said to be able to recognize hand gestures as a means of interacting with the wearable device - but there's good reason to doubt the claim.

Early VR headsets relied on hardware like handheld controllers as input devices. Vision Pro greatly streamlined the way users interact with the device by giving it the capability to recognize hand gestures with no additional hardware needed. MacRumors cites an "inside source" in suggesting that Apple Glasses will have the same capability: the AI glasses will include two cameras. A high-resolution camera will be included for capturing photos and videos that can be shared on social media and used like iPhone photos. A second lower-resolution wide-angle lens will read hand gestures and provide visual input for Siri. The site links this to similar rumors about AirPods gaining cameras for the same reason.

Apple Glasses are expected to have either one or two cameras, in contrast to the eight external cameras and four internal eye-tracking cameras on Vision Pro. Relying on a single low-resolution external camera to recognize hand gestures, as the report suggests, would be challenging to achieve with any reliability.

Bloomberg's Mark Gurman has also expressed his own skepticism about the idea: "The technology to do this reliably with a single camera, no neural band and no eye-scanning doesn't exist today as far as I know. I've also heard nothing to suggest the first version has any sophisticated form of gestures as this describes. I am extremely skeptical."

What does seem likely is support for AirPods-style head gestures, like nodding and shaking the head, and it may be that the source for this rumor has mistaken talk of this for hand gesture support.
That said, it's not totally impossible that the wearables could use some combination of head gestures and extremely obvious hand gestures - but we are certainly viewing this report with a very large pinch of salt.
[2]
Apple's AI Smart Glasses Likely to Support Hand Gesture Controls
Apple is developing a set of AI smart glasses to rival products like the Meta Ray-Bans, and MacRumors has learned a few more details about Apple's work on the device from an inside source.

The AI glasses will include two cameras. A high-resolution camera will be included for capturing photos and videos that can be shared on social media and used like iPhone photos. A second lower-resolution wide-angle lens will read hand gestures and provide visual input for Siri. Apple uses hand gesture-based input for the Vision Pro, and rumors suggest the AirPods Pro will be updated with low-resolution cameras and support for gestures as well. Apple appears to be leaning into gesture support, and it's an ideal input method when no screen is available to interact with.

While future versions of the smart glasses could include an integrated display for augmented reality features, the first version will have no display at all. Apple will not include a screen, LiDAR, 3D cameras, or other similar technology because such features are too energy-intensive. Battery life is a major constraint because Apple needs to keep the glasses slim and lightweight. Battery size is the bottleneck behind the hardware decisions that Apple is making, and it's why Apple is opting for a stripped-down feature set.

According to recent rumors, Apple is testing multiple styles for the smart glasses, with plans to use acetate, a lightweight plant-based material that's more flexible than plastic.

Apple's smart glasses will incorporate the smarter version of Siri that Apple plans to introduce in iOS 27. The device will be able to take photos, record video, and make phone calls, plus users will be able to interact with Siri to ask questions about what's around them. The feature set will be similar to the features available in the Meta Ray-Bans that Apple is aiming to compete with.
Rumors suggest Apple could preview the glasses later this year, with a launch to follow in 2027, though it's also possible we won't see them announced until 2027.
[3]
Apple's smart glasses probably won't have a screen, but gesture controls are expected | Stuff
If you've got one of the best iPhones, you may be holding out for Apple's rumoured smart glasses - a pair of AI-focused glasses designed to rival products like the Ray-Ban Meta. And while they're likely a while off yet, new rumours have emerged to help fuel the hype.

According to a MacRumors source, you may be able to control the Apple smart glasses without having to touch them at all. Instead, the glasses are said to rely on hand gesture controls, detected via a secondary camera built into the frame. Along with a main high-resolution camera for photos and video, a lower-resolution wide-angle lens apparently tracks your hand movements and provides visual input for Siri, offering a way to interact without a screen.

That last bit is important, because, at least in this first version, there reportedly isn't one. Unlike the Vision Pro, Apple's glasses are expected to skip a display entirely. Instead, like Meta's glasses, the focus appears to be more on capturing content, asking questions, and getting quick AI-driven responses about the world around you. Apple's version, though, is expected to tie in with a rumoured smarter, next-gen Siri - the one expected to arrive with iOS 27 - suggesting a stronger AI focus from day one.

We've also had hints about the design - Apple is said to be testing multiple styles, including frames made from acetate - a plant-based material often used in premium eyewear that's lighter and more flexible than standard plastics.

As for timing, nothing's locked in. The latest rumours suggest a possible preview later this year, with a launch in 2027 - though, as ever with Apple's more experimental hardware, that timeline could easily slip. Stay tuned.
[4]
Apple AI smart glasses may skip a display but add dual cameras and gesture controls
As we covered a few days back, Apple is reportedly working on a pair of AI-powered smart glasses to take on Meta's Ray-Ban glasses. While we knew the glasses would ditch a display in the first version, a new report by MacRumors reveals the device is expected to prioritise cameras, voice interaction, and even gesture-based input. There won't be any augmented reality (AR) features, though. If you are an Apple fan waiting for its smart glasses, here are the details we have so far.

The report, citing insider sources, indicates that Apple's smart glasses could feature two cameras. A high-resolution camera is expected to handle photos and videos, similar to a smartphone, allowing users to share content directly. A second, lower-resolution wide-angle camera is said to track hand gestures and feed visual data to Siri. This aligns with Apple's broader shift towards gesture-based interaction, already seen in the Apple Vision Pro. The same approach is also rumoured for future AirPods models, where you would get camera-assisted input.

The glasses are also expected to integrate a more advanced version of Siri. You may be able to take photos, record videos, make calls, and ask Siri contextual questions about your surroundings. But this will still be a first-gen product, so it won't include a display, LiDAR, 3D cameras, or AR features, as the aim is to keep the glasses slim and lightweight, and there are some technological limitations at this point in time.

In terms of design, Apple is said to be testing multiple frame styles, including acetate builds. Acetate is a plant-based material known for being lighter and more flexible than conventional plastic, which could help improve long-term comfort. So, Apple seems to be approaching this in a phased manner, and users could get a pair of glasses that acts as an AI companion to the smartphone.
Meanwhile, there is some scepticism around the leak. Mark Gurman from Bloomberg has questioned whether reliable gesture recognition can work with a single low-resolution camera and without additional hardware like neural bands or eye-tracking systems. He also noted that he has not heard of advanced gesture capabilities being ready for the first version. This means some of these features may not be available at launch, or may be pushed to later iterations.

The Apple Glasses are expected to get a preview in 2026, and the launch could take place in 2027. If you want smart glasses now, current options like the Meta AI Glasses are a good alternative, but those in the Apple ecosystem could benefit from tighter integration with Apple's glasses. We will have more clarity over the next 12 to 18 months, so stay tuned.
Apple is reportedly developing AI smart glasses with hand gesture controls and dual cameras to rival Meta Ray-Bans. The first version may skip a display entirely, relying on a low-resolution camera for gesture recognition and an advanced version of Siri. However, Bloomberg's Mark Gurman questions whether the technology can work reliably without additional hardware.
Apple is developing AI smart glasses designed to compete with products like the Meta Ray-Bans, and new details suggest the company is exploring Vision Pro-style hand gestures as a primary input method. According to MacRumors, citing an inside source, the Apple Glasses will feature dual cameras: a high-resolution camera for capturing photos and videos that can be shared on social media, and a second low-resolution camera with a wide-angle lens to read hand gestures and provide visual input for Siri [2]. This approach mirrors the hand gesture-based user interaction already deployed in Vision Pro, which eliminated the need for handheld controllers [1].
The hand gesture controls would allow users to interact with the wearables without touching them, offering a practical solution for a device that will operate without a screen [3]. Unlike Vision Pro, which uses eight external cameras and four internal eye-tracking cameras, Apple Glasses are expected to rely on just one or two cameras, making gesture recognition significantly more challenging [1].
Bloomberg's Mark Gurman has expressed considerable skepticism about the rumored hand gesture capabilities. "The technology to do this reliably with a single camera, no neural band and no eye-scanning doesn't exist today as far as I know," Gurman stated, adding that he has heard nothing to suggest the first version includes sophisticated gesture support [1]. Gurman's concerns highlight a critical technical hurdle: relying on a single low-resolution camera to recognize hand gestures with any reliability remains unproven [4].

The skepticism extends to whether Apple can deliver advanced gesture capabilities without additional hardware like neural bands or eye-tracking systems. While AirPods-style head gestures like nodding and shaking appear more feasible, the source may have confused discussions about head gestures with more ambitious hand gesture support [1].
Apple's decision to exclude a display, LiDAR, 3D cameras, and augmented reality features stems from battery life constraints. The company needs to keep the AI smart glasses slim and lightweight, making battery size the primary bottleneck behind hardware decisions [2]. This stripped-down approach positions Apple Glasses as an AI companion rather than an augmented reality device, focusing on capturing content, making phone calls, and enabling users to ask Siri contextual questions about their surroundings [4].

The glasses will integrate an advanced version of Siri expected to arrive with iOS 27, suggesting a stronger AI focus from launch [3]. This positions the device as a voice-first wearable that extends iPhone functionality rather than replacing it.
Apple is testing multiple frame styles for the smart glasses, including acetate frames. Acetate is a lightweight plant-based material that's more flexible than plastic and commonly used in premium eyewear, which could improve long-term comfort [2][3].

Rumors suggest Apple could preview the glasses later in 2026, with a 2027 launch to follow, though the timeline could easily slip given Apple's cautious approach to experimental hardware [2][3]. The phased approach indicates Apple is prioritizing ecosystem integration and practical functionality over cutting-edge features that might compromise form factor or battery performance [4].