2 Sources
[1]
AppleInsider.com
Apple Glass will be a direct competitor to Meta's Ray-Ban smart glasses, but it will be only part of a larger three-pronged AI wearable strategy for the company. Here's what's coming.

Apple has long been working on its smart glasses, known as Apple Glass. What is anticipated to actually launch will be quite close to what the existing Meta Ray-Bans can already do. In Sunday's "Power On" newsletter for Bloomberg, Mark Gurman writes that Apple Glass will easily handle everyday uses, including photo and video capture, phone calls, notifications from an iPhone, and music playback. There will also be the ability to call upon Apple's digital assistant, Siri, in a hands-free way. However, that will only really be useful as part of an upgraded Siri in iOS 27.

A big AI approach

Apple Glass won't be a product in isolation, aside from its connection to an iPhone. It's really part of an attempt by Apple to take advantage of the power of artificial intelligence and computer vision. This will include two other products in the same category: the previously rumored AirPods with cameras and the more recently rumored AI pendant. The idea is that Apple wants to use all of this hardware to feed a view of the user's surroundings into Apple Intelligence. This data could then be used to create contextual awareness, such as Siri responding based on what it can "see" in the world.
[2]
Apple's Display-Free Smart Glasses To Debut By Early 2027, Might Sport 4 Different Designs
Apple is gearing up to give Meta's Ray-Ban smart glasses a run for their money by launching a superior competitor, replete with a more premium build and greater design versatility, as per the latest Power On newsletter from Bloomberg's Mark Gurman. Gurman has disclosed that Apple's upcoming display-less smart glasses will likely launch by early 2027 and come equipped with integrated cameras, microphones, and speakers, enabling the wearer to interact via an improved version of its bespoke AI assistant, Siri. These new camera-equipped smart glasses will be able to capture photos and videos, sync with an iPhone for post-capture editing and sharing, handle phone calls, keep tabs on notifications, play music, and enable hands-free interaction via Siri.

These smart glasses would complete a troika of new AI-powered devices from Apple, which also includes camera-equipped AirPods Pro and an AI pendant. All of these devices will leverage computer vision to interpret a given user's surroundings and feed contextual awareness directly into Siri and Apple Intelligence, enabling features like improved turn-by-turn map directions and visual reminders. Interestingly, Apple is planning to create a hefty differentiation between its smart glasses and the ones from Meta by implementing a tight, utility-heavy integration with the iPhone. Additionally, Apple appears to be opting for the more premium acetate frame for its smart glasses, along with a host of color and design options.

Gurman concludes by noting: "Despite Meta's early lead and Google's advantages with the larger Android ecosystem, Apple's strengths -- its brand, in-house chips, giant retail presence and deep iPhone integration -- position it well to compete. If executed properly with a functional Siri, these glasses could follow a trajectory similar to the Apple Watch: not first to market, but ultimately dominant."
Elsewhere, Omdia expects Apple to launch its own AR smart glasses - replete with 0.6-inch dual OLEDoS displays - only in 2028, months after Meta would have presumably launched its own competitive offerings. For the benefit of those who might not be aware, OLEDoS, also called Micro-OLED display tech, mounts Organic Light-Emitting Diodes (OLED) directly onto a single-crystal silicon wafer substrate. Unlike traditional OLED screens used in smartphones or TVs that are built on a glass or plastic base, OLEDoS leverages semiconductor manufacturing processes to achieve extreme miniaturization and performance, leading to ultra-high pixel density and an improved power consumption profile, especially as the circuitry is integrated directly into the silicon backplane using CMOS technology.
Apple plans to launch Apple Glass by early 2027 as display-free smart glasses that compete directly with Meta's Ray-Ban offering. The device is part of a broader AI wearable strategy that includes camera-equipped AirPods and an AI pendant, all designed to feed contextual awareness into Apple Intelligence and create a more capable Siri experience.
Apple is preparing to launch Apple Glass by early 2027, marking its entry into the smart glasses market currently dominated by Meta's Ray-Ban smart glasses [2]. According to Bloomberg's Mark Gurman in his Power On newsletter, the display-free smart glasses will be equipped with integrated cameras, microphones, and speakers, enabling wearers to capture photos and videos, handle phone calls, manage notifications, and play music [1]. The device represents just one element of Apple's ambitious AI wearable strategy designed to leverage computer vision and contextual awareness.
Source: Wccftech
The strategy extends beyond Apple Glass to include AirPods with cameras and an AI pendant, forming a troika of AI-powered Apple devices [2]. Apple's goal is to use this hardware ecosystem to feed a comprehensive view of the user's surroundings into Apple Intelligence, enabling contextually aware Siri interactions that respond based on what the system can "see" in the world [1]. This approach could enable features like improved turn-by-turn map directions and visual reminders. However, the hands-free Siri functionality will only reach its full potential with an upgraded version expected in iOS 27 [1].

Apple appears intent on creating substantial differentiation from Meta through premium materials and tight iPhone integration [2]. The company is reportedly opting for acetate frames along with multiple color and design options, potentially offering four different designs to appeal to varied consumer preferences [2]. The glasses will sync seamlessly with an iPhone for post-capture editing and sharing, creating a utility-heavy integration that leverages Apple's existing ecosystem advantages.
Source: AppleInsider
Despite Meta's early lead and Google's advantages within the larger Android ecosystem, Gurman notes that "Apple's strengths -- its brand, in-house chips, giant retail presence and deep iPhone integration -- position it well to compete" [2]. He suggests these glasses could follow a trajectory similar to the Apple Watch: not first to market, but ultimately dominant if executed properly with a functional Siri. Looking further ahead, Omdia expects Apple to launch AR smart glasses with 0.6-inch dual OLEDoS displays only in 2028, months after Meta's anticipated competitive offerings [2]. This suggests Apple is taking a measured, two-phase approach: first establishing a foothold with display-free smart glasses before advancing to more sophisticated AR capabilities.