Sources
[1]
Apple Reportedly Testing Glasses AI in Several Frame Styles
It appears consumers will be able to pick a style of Apple smart glasses that suits their tastes. Bloomberg's Mark Gurman reported Sunday that Apple is testing at least four frame styles to rival those of smart glasses competitor Ray-Ban. The company is also reportedly working on a new camera setup for its first foray into smart glasses.

Apple's design team has created at least four styles that will be available in multiple color options, Gurman reported in his Power On newsletter. According to Gurman, the designs include a pair with a large rectangular frame, like Ray-Ban Wayfarers; a slimmer rectangular frame; a larger oval or circular design; and a smaller oval or circular option.

Apple is also reportedly considering a camera setup with vertically oriented oval lenses with surrounding lights -- another design feature intended to differentiate Apple's product from rivals such as Meta, currently the biggest maker of smart glasses. The goal is reportedly a frame design that is instantly recognizable.

Research released in March by Counterpoint Research suggests the smart glasses category is still in its early stages. The tech market research firm reported that the smart glasses market grew 139% year over year in the second half of 2025, with Meta's AI smart glasses portfolio credited for the growth.

The smart glasses are expected to feature cameras for capturing videos and photos, along with microphones and speakers for handling phone calls, listening to notifications, and playing music. They will also reportedly have multimodal AI that can respond to requests via Siri.

Gurman reported that Apple plans to announce the glasses as soon as the end of the year or early next year, with shipments beginning by the end of 2027. Apple representatives didn't immediately respond to a request for comment.
[2]
Apple testing four frame designs for AI smart glasses ahead of 2027 launch
In short: Apple is testing at least four frame styles for its upcoming AI-powered smart glasses, according to a Bloomberg report by Mark Gurman published 12 April 2026. The designs include a large rectangular style similar to Wayfarer frames, a slimmer rectangular style comparable to those worn by CEO Tim Cook, a larger oval or circular frame, and a smaller oval variant. The frames are being made from acetate rather than standard plastic. The hardware uses the N401 chip, a custom processor based on Apple Watch S-series architecture, alongside two cameras: one for photo and video capture and one for computer vision. There is no display in the first version. Apple is targeting the start of production in December 2026, with a public launch expected in spring or summer 2027.

The four frame designs span different aesthetic registers. The large rectangular option resembles classic Wayfarer-style frames, a shape with broad consumer recognition and mainstream wearability. The slimmer rectangular variant is similar to the frames Tim Cook wears, a choice that suggests one option being tested is explicitly calibrated for professional wear. The two oval formats range from a larger, more expressive shape to a smaller, minimal design. The breadth of options being tested indicates Apple is not yet committed to a single visual language for the product and is gauging which combination of shapes attracts the widest set of potential users, consistent with how Apple approached its first watch designs before settling on a single enclosure direction.

The material is acetate, described in Gurman's report as more durable and luxurious than standard plastic, a distinction that positions the glasses against Meta's existing Ray-Ban line rather than against budget-tier wearables. Colours confirmed in the testing phase include black, ocean blue, and light brown, with many additional options expected by the time a final product is announced.
The camera module on the front of the frame uses a vertically oriented oval arrangement with indicator lights surrounding it, a configuration that differs from the horizontal lens placement used in Meta's Ray-Ban models and is intended to make the device recognisable as an Apple product at a distance. Apple is targeting a sub-50g weight and all-day battery life. A price of approximately $499 has been reported in secondary coverage, though Gurman did not confirm a figure in the April 12 report.

The N401 chip is a custom low-power processor derived from the Apple Watch S-series architecture, optimised for on-device inference within the thermal and battery constraints a pair of frames can accommodate. The glasses include two cameras: the primary handles photo and video capture; the second is dedicated to computer vision, providing Siri and Apple Intelligence with real-time environmental context without requiring the phone to be raised or unlocked. Microphones and spatial sensors are also integrated.

There is no display in the first version, which means all information reaches the user through speakers or through the iPhone screen, and the glasses rely on the iPhone for any computationally intensive processing that cannot be handled on-device. The primary interface is Siri, which will handle notifications, music playback, phone calls, live translation, and visual intelligence queries about the wearer's surroundings. The version of Siri these glasses will run is the overhauled assistant Apple announced in January 2026, powered in part by a custom Gemini model developed through Apple's partnership with Google. The underlying system has been in preparation for some time: Apple Intelligence accidentally launched in China on 30 March 2026 before regulatory approval had been granted by the Cyberspace Administration, a moment that confirmed the readiness of the software while flagging the compliance surface Apple must navigate in markets outside the United States.
The smart glasses category Apple is preparing to enter has been commercially validated by Meta over the past two years. Meta sold more than seven million Ray-Ban and Oakley AI frames in 2025, more than tripling its 2024 volume in a product category that barely existed three years earlier. The most recent iteration of Meta's strategy extended the product into corrective eyewear: Meta launched prescription Ray-Ban smart glasses, a direct bid to convert the roughly 69% of the global eyewear market that requires corrective lenses and which standard smart glasses have not addressed.

Apple updated the MacBook Air with the M5 in March 2026, its most recent major hardware release, continuing a cadence of incremental but commercially significant hardware updates that Apple has maintained across its product lines even as the smart glasses project has moved from research into active engineering prototypes. Google is also preparing to enter the category in 2026 through its Android XR platform, in partnership with Warby Parker and Gentle Monster, with both audio-only and display-equipped variants targeting a launch ahead of Apple.

Apple's 2027 entry means it will arrive after Meta has established commercial viability and after Google's first models have reached the market, a position Apple has occupied before: it arrived after BlackBerry in smartphones and after Fitbit in wearables, and in both cases produced a product that shifted what users expected from the category. The scale of Apple's existing iPhone user base, estimated at more than one billion active devices, gives its glasses a distribution advantage that neither Meta nor Google can replicate from a standing start. The four frame styles and technical specifications are part of a three-pronged AI wearable plan Gurman reported in February 2026.
In addition to the smart glasses, Apple is developing a camera-equipped AI pendant, roughly AirTag-sized, designed to be clipped or pinned to clothing and to feed visual context to Siri continuously. The company is also developing camera-equipped AirPods. All three devices are intended to work together as ambient input channels for Apple Intelligence, creating a distributed sensory layer that extends Siri's awareness across different positions on the body rather than concentrating it in a single device. The pendant and camera AirPods are at earlier stages of development than the glasses, with the pendant a plausible candidate for a 2027 release and the AirPods cameras likely arriving in a subsequent year.

The bet Apple is placing on ambient AI hardware reflects a broader pattern in how the technology industry is allocating resources. In March 2026, Meta cut hundreds of jobs across Reality Labs, recruiting, and sales while simultaneously expanding its investment in Ray-Ban hardware and raising its production targets, an indication that the economics of the AI hardware transition require companies to reallocate from traditional categories rather than simply scaling headcount. TNW reported in August 2025 that the next generation of AI unicorns might not hire anyone at all, analysing a structural shift in which companies are built with significantly smaller teams that accomplish more through AI capability. Apple's wearable strategy is the consumer-facing expression of the same dynamic: hardware that extends individual capability through ambient intelligence, built for an iPhone ecosystem that already provides the processing foundation, and designed to be worn all day rather than carried in a pocket or placed on a desk.
[3]
I'm a smart glasses expert -- and Apple's rumored Meta Ray-Bans rivals could tempt me to switch thanks to one key strength
* Apple is working on smart glasses that should be revealed later this year, according to leaks
* AI AirPods and an AI pendant are also planned
* Apple's privacy-first approach could help it win with AI glasses

Every tech company and its dog is cramming AI into whatever system it can, and while Apple has been slow to get off the starting line, its rumored AI glasses, AI-focused AirPods, and AI necklace could help it snatch victory from the jaws of defeat -- all thanks to one major long-term focus: privacy. While a couple of these gadgets have been teased previously by Bloomberg's Mark Gurman (behind a paywall), who often shares reliable Apple insider info, he has just revealed more details about Apple's plans to create its first smart glasses.

According to Gurman, Apple is developing display-free smart glasses that will compete directly with the Ray-Ban Meta Gen 2. These smart specs, internally code-named N50, will apparently help you capture photos and videos, play music, catch up on notifications, and interact with Siri. The latter will finally, according to reports, get a big upgrade in iOS 27. The rumored AirPods and pendant could similarly rely on tech like cameras and microphones to capture information from your surroundings to provide insight and assistance, such as visual reminders.

Gurman doesn't share release dates for any of the trio, but says the smart spectacles should appear later this year, with a launch due in 2027. He also claims that Apple's smart glasses are being tested with four designs: a large, rectangular frame (like the Ray-Ban Wayfarers), a slimmer rectangular design (like Tim Cook's specs), larger oval or circular frames, plus a "smaller, more refined" version of the latter. But despite these rumors that Apple will try to eclipse its rivals with a "higher-end build", I still think they're going to have their work cut out for them, for a few reasons.
Not plain sailing

For starters, this delayed Apple glasses rollout could damage the perceived utility of its tech compared to its rivals -- especially as by the time it debuts AI specs, Apple's competitors are expected to have a generation or two of AR / display glasses released. From experience, AR and display glasses (which can overlay your vision with various details, including live translation, shared play environments, or HUD elements like a map) are also a significant step up from display-less AI specs in terms of their usefulness and capabilities. The less-than-stellar Apple Intelligence rollout also gives me cause for concern. Apple still hasn't really proved that it knows what it's doing with AI.

The other potential pitfall is that Apple is said to be going it alone design-wise, rather than linking with a brand like Ray-Ban or Warby Parker, as Meta and Google have. As a fashion accessory, the look of smart glasses is almost as essential as their usefulness, and several brands I've seen try to design their glasses in-house have struggled to make something that looks good. That said, if any brand can buck that trend, it's arguably the tech design champion that is Apple. And the Californian tech giant also has one major smart glasses strength that could still win me over, even as an Android fan: privacy.

Privacy, privacy, privacy

Visual reminders are seen as the next big advantage AI wearables can leverage. That is, they can take in all of the information about your life and help you remember things like people's names, where you left your keys, or what needs restocking in your fridge. The issue is, while this level of AI assistance is undeniably useful, it's equally invasive -- essentially requiring the wearable to have an always-on view of your life. Otherwise, it could miss vital context that would make its advice useless.
Meta and Google's practices have come under fire in the past for their data privacy, and what information is or isn't shared with their AI -- most recently, Meta has fallen into hot water over how many more videos and images than people realized are being shared with Meta and reviewed by contractors. Apple, on the other hand, has always made a big effort to promote privacy with its tech. And in the world of AI -- where some tasks require personal information to be processed on servers rather than on your device -- it created Private Cloud Compute to ensure that user data is kept private even when it is used by Apple's remote servers.

I'm firmly in the Android ecosystem, and if you've seen our podcast you know I have a bit of an 'Apple-hater' persona. Still, I'd currently trust Apple glasses over any other brand as things stand -- and with the privacy advantages I expect them to offer I'd even be willing to put up with worse performance and specs if it meant knowing my personal data was secure. We'll have to wait and see what Apple reveals -- as with all leaks, we must take teases and speculation with a pinch of salt -- but I'm uncharacteristically excited to see what Apple has up its sleeve. Meta and Google (and the rest) should watch out.
[4]
Apple smart glasses might avoid the creepy reputation of Meta Ray-Bans with a light trick
Apple's upcoming smart glasses could sidestep one of the biggest issues facing the category - privacy concerns - by rethinking something as simple as the camera indicator light. According to a recent report by Bloomberg, the company is working on display-free smart glasses that focus on everyday functionality, but with a design approach that may make them feel less intrusive than current offerings. The device, internally codenamed N50, is expected to arrive around 2026 or 2027 and will function more like a companion to the iPhone than a standalone augmented reality system. Instead of a display, the glasses will rely on features like photo and video capture, voice interactions via Siri, notifications, and media playback.

A Subtle Hardware Shift With Big Implications

What sets Apple's approach apart is how it plans to handle recording visibility. Unlike existing smart glasses that use small LED indicators, Apple is reportedly experimenting with a more prominent lighting system integrated directly into the camera module. The design includes vertically oriented lenses surrounded by visible lighting elements, making it harder to hide when recording is active. This could address a key concern that has plagued smart glasses since their inception: the fear of being recorded without consent.

The Privacy Problem Others Are Still Facing

The issue isn't theoretical. A report by WIRED highlights how users of Meta's Ray-Ban smart glasses have attempted to bypass privacy safeguards. Third-party sellers have even promoted accessories like "ghost dots," designed to dim or block the recording indicator light. These attempts, while often ineffective due to built-in protections, reveal a broader problem. If users actively try to hide recording signals, the trust required for widespread adoption breaks down.
Even unsuccessful workarounds contribute to the perception that smart glasses can be misused, reinforcing the "creepy" reputation that has limited their acceptance.

Apple's Strategy: Solve Trust Through Design

Rather than relying solely on software restrictions, Apple appears to be addressing the issue at the hardware level. By making the recording indicator more visible and integrated into the design, the company is attempting to remove ambiguity. If successful, this could make it significantly harder to use the glasses in a way that feels covert or deceptive. This aligns with Apple's broader approach to new product categories. As seen with devices like the iPhone and Apple Watch, the company often enters markets later but focuses on refining user experience and addressing key pain points.

Part Of A Larger AI Wearables Push

The smart glasses are not being developed in isolation. Bloomberg notes that they are part of a broader strategy that includes AI-powered AirPods and other wearable devices designed to interpret the user's surroundings. These products will rely on computer vision and Apple Intelligence to provide contextual information, from navigation assistance to real-time reminders. This suggests that Apple's goal is not just to build smart glasses, but to create an ecosystem of devices that make AI more ambient and seamlessly integrated into daily life.

What This Means For Users

For consumers, the success of smart glasses will depend as much on perception as on functionality. If Apple can make its glasses feel transparent and trustworthy, it could overcome one of the biggest barriers to adoption. At the same time, tight integration with the iPhone and Apple's ecosystem may make the device more useful in everyday scenarios.

What Comes Next

Apple's smart glasses are still in development, with a launch expected no earlier than 2026 or 2027. Fully featured augmented reality glasses remain further out, likely toward the end of the decade.
Until then, Apple's focus appears to be on getting the basics right - functionality, usability, and most importantly, trust.
[5]
AppleInsider.com
Apple Glass will be a direct competitor to Meta's Ray-Ban smart glasses, but it will be only a part of a larger three-pronged AI wearable strategy for the company. Here's what's coming.

Apple has long been working on its smart glasses, known as Apple Glass. What is anticipated to actually launch will be quite close to what the existing Meta Ray-Bans can already do. In Sunday's "Power On" newsletter for Bloomberg, Mark Gurman writes that the Apple Glass will easily be able to handle everyday uses, including photograph and video capture, dealing with phone calls, handling notifications from an iPhone, and music playback. There will also be the ability to call upon Apple's digital assistant, Siri, in a hands-free way. However, that will only really be useful as part of an upgraded Siri in iOS 27.

A big AI approach

Apple Glass won't be a product in isolation, aside from its connection to an iPhone. It's really part of an attempt by Apple to take advantage of the power of artificial intelligence and computer vision. This will include two other products in the same category: the previously rumored AirPods with cameras, and the more recently claimed AI pendant. The idea is that Apple wants to use all of this hardware to feed a view of the user's surroundings into Apple Intelligence. This data could then be used to create contextual awareness, such as by Siri responding based on what it can "see" in the world.
[6]
Apple's smart glasses sound a lot more like Ray-Bans than anyone expected - Phandroid
Meta's Ray-Ban glasses spent years being a product people found interesting but didn't actually buy. That changed fast. Last year, EssilorLuxottica sold over 7 million pairs, far more than the combined total from the two years prior. The category has real momentum now, and Apple is clearly paying attention.

According to Bloomberg's Mark Gurman, Apple smart glasses are further along than most people realized. Gurman's latest Power On newsletter reveals Apple is actively testing at least four different frame designs for the project, internally codenamed N50. The designs include a large rectangular frame similar to Ray-Ban Wayfarers and a slimmer rectangular version resembling the glasses worn by Tim Cook. Two oval or circular options round out the lineup, in different sizes.

Rather than partnering with an established eyewear brand, Apple is building the frames entirely in-house. The material of choice is acetate, which Gurman describes as more durable and premium than standard plastic. Color options currently in testing include black, ocean blue, and light brown. The plan, according to Gurman, is to launch multiple styles at once across multiple colors, similar to how Apple debuted the Apple Watch in 2015 with a wide range of combinations from day one.

These aren't AR glasses. There's no display. What Apple is reportedly building sits in the same category as the Ray-Ban Meta glasses: cameras, microphones, speakers, and a direct line to AI features. The glasses will reportedly capture photos and videos, handle phone calls, relay notifications, and enable hands-free Siri interactions. The front cameras are arranged in a vertical oval pattern, surrounded by indicator lights. That last detail is worth noting.
Whether that's a genuine privacy consideration or just a design decision, it sets the Apple smart glasses apart visually from everything else on the market right now. The glasses are meant to tie closely into the iPhone, leaning on a rebuilt Siri expected to ship with iOS 27. Apple shifted engineers away from the Vision Pro revamp to push this project forward, which says a lot about where the company's priorities are heading. Gurman has the unveiling penciled in for late 2026 or early 2027, with actual availability in 2027. The Ray-Ban glasses had a years-long head start. Whether Siri is good enough by then is the only question that really matters.
[7]
Leak: Apple Testing Four Distinct Frame Styles for Upcoming 2027 Smart Glasses
Apple is poised to reshape the landscape of wearable technology with its much-anticipated smart glasses, expected to launch in 2027. These glasses aim to blend state-of-the-art innovation with everyday practicality, offering a seamless fusion of advanced technology and stylish design. With four distinct styles reportedly under development, Apple is working to integrate features such as AI-powered contextual awareness and computer vision into a product that feels as natural as traditional eyewear. If successful, this could represent a pivotal moment in the evolution of wearable devices, setting a new benchmark for the industry.

Apple's smart glasses are being developed with a strong focus on aesthetics and functionality, ensuring they appeal to a wide range of users. Reports indicate that Apple is testing four design variations to cater to diverse preferences: a large rectangular frame, a slimmer rectangular frame, a larger oval or circular frame, and a smaller oval option. These frames are crafted from premium acetate, a material known for its durability and luxurious feel. To further enhance personalization, Apple is exploring a variety of color options, including black, ocean blue and light brown, so users can select a style that complements their personality. By prioritizing comfort, style and usability, Apple aims to create glasses that feel less like a piece of technology and more like an integral part of your daily wardrobe.

At the heart of Apple's smart glasses lies a suite of advanced technologies designed to enhance everyday interactions. Using Apple's expertise in artificial intelligence, these glasses are expected to seamlessly integrate with your iPhone and Siri, allowing hands-free access to information and tasks. For instance, you could walk past a restaurant and instantly receive reviews or compare prices while shopping, all without needing to reach for your phone. The inclusion of computer vision technology further expands the glasses' capabilities.
Potential features include object recognition, real-time text translation and augmented reality (AR) overlays for navigation. Imagine being guided through an unfamiliar city with AR directions displayed directly in your field of vision. These features aim to make the glasses not just a convenience but an indispensable tool for navigating and simplifying daily life.

One of the most intriguing yet sensitive aspects of Apple's smart glasses is the potential integration of a camera system. Early prototypes reportedly feature vertically oriented oval lenses equipped with cameras, raising questions about privacy. To address these concerns, Apple is considering incorporating green indicator lights to signal when recording is active, ensuring transparency and fostering trust among users and those around them. This approach reflects Apple's commitment to balancing innovation with ethical considerations. By prioritizing privacy and social acceptance, the company aims to mitigate the discomfort often associated with wearable cameras. This focus on responsible design could position Apple's smart glasses as a socially conscious and user-friendly device, setting them apart from competitors.

Despite their potential, Apple's smart glasses face several challenges that could impact their adoption. As a premium product, they must deliver flawless functionality while remaining comfortable and unobtrusive. Privacy concerns, coupled with what is likely to be a high price point, may deter some consumers. Additionally, the personal and stylistic nature of eyewear means that the glasses must appeal to a broad audience without feeling overly technical or intrusive. Apple appears to be addressing these challenges by designing glasses that closely resemble traditional eyewear, avoiding the overly futuristic appearance that has hindered similar devices in the past.
By emphasizing user comfort, style and seamless integration, Apple hopes to overcome these hurdles and position its smart glasses as a mainstream success.

The anticipated release of Apple's smart glasses in late 2026 or early 2027 aligns with the company's broader vision of expanding its portfolio of AI-powered wearables. Rumors suggest that Apple is also exploring other innovative devices, such as camera-equipped AirPods and smart pendants, signaling a shift toward screenless interaction models. These devices aim to create a more immersive and intuitive user experience, reducing reliance on traditional screens. Apple's smart glasses could serve as a pivotal step toward this future, offering a glimpse into how technology can seamlessly integrate into daily life. However, their success will depend on several factors, including execution, pricing and consumer adoption. By addressing these challenges, Apple has the potential to redefine how you interact with both your devices and the world around you, paving the way for a new era of wearable technology.
[8]
Apple's Display-Free Smart Glasses To Debut By Early 2027, Might Sport 4 Different Designs
Apple is gearing up to give Meta's Ray-Ban smart glasses a run for their money by launching a superior competitor, replete with a more premium build and greater design versatility, as per the latest Power On newsletter from Bloomberg's Mark Gurman. Gurman has disclosed that Apple's upcoming display-less smart glasses will likely launch by early 2027, and come equipped with integrated cameras, microphones, and speakers, enabling the wearer to interact via an improved version of its bespoke AI assistant, Siri.

These new camera-equipped smart glasses will be able to capture photos and videos, sync with an iPhone for post-capture editing and sharing, handle phone calls, keep tabs on notifications, play music, and enable hands-free interaction via Siri. These smart glasses would complete a troika of new AI-powered devices from Apple, which includes camera-equipped AirPods Pro and an AI pendant. All of these devices will leverage computer vision to interpret a given user's surroundings and feed contextual awareness directly into Siri and Apple Intelligence, enabling features like improved turn-by-turn map directions and visual reminders.

Interestingly, Apple is planning to create a hefty differentiation between its smart glasses and the ones from Meta by implementing a tight, utility-heavy integration with the iPhone. Additionally, Apple appears to be opting for a more premium acetate frame for its smart glasses, along with a host of color and design options.

Gurman concludes by noting: "Despite Meta's early lead and Google's advantages with the larger Android ecosystem, Apple's strengths -- its brand, in-house chips, giant retail presence and deep iPhone integration -- position it well to compete. If executed properly with a functional Siri, these glasses could follow a trajectory similar to the Apple Watch: not first to market, but ultimately dominant."
Elsewhere, Omdia expects Apple to launch its own AR smart glasses - replete with 0.6-inch dual OLEDoS displays - only in 2028, months after Meta would have presumably launched its own competitive offerings.

For the benefit of those who might not be aware, OLEDoS, also called Micro-OLED display tech, mounts Organic Light-Emitting Diodes (OLED) directly onto a single-crystal silicon wafer substrate. Unlike traditional OLED screens used in smartphones or TVs that are built on a glass or plastic base, OLEDoS leverages semiconductor manufacturing processes to achieve extreme miniaturization and performance, leading to ultra-high pixel density and an improved power consumption profile, especially as the circuitry is integrated directly into the silicon backplane using CMOS technology.
[9]
Move Over, Ray-Ban Meta: Samsung's AI Galaxy Smart Glasses Are Finally Coming This Year
The Samsung Galaxy AI Smart Glasses are set to redefine the wearable market when they debut in 2026. By merging cutting-edge augmented reality with the power of Android XR, Samsung is promising an immersive experience that feels natural yet highly functional. Through collaborations with Google for the software and Warby Parker and Gentle Monster for the aesthetics, the Galaxy AI Glasses are designed to be a sleek, modern alternative to traditional bulky headsets.

The Galaxy AI Smart Glasses are set to launch in the second half of 2026, aligning with the release of Samsung's flagship Galaxy Z Fold 8 and Z Flip 8 devices. This synchronized launch underscores Samsung's strategy to create a unified ecosystem of interconnected devices, enhancing the user experience across its product portfolio. By timing the release alongside its flagship smartphones, Samsung aims to deliver a cohesive and polished product lineup that integrates seamlessly into users' daily lives.

Samsung's partnerships with Google, Warby Parker and Gentle Monster reflect its commitment to blending technology with design. These collaborations have resulted in the development of two distinct models of the Galaxy AI Smart Glasses. By working with renowned eyewear brands, Samsung is addressing the needs of both tech enthusiasts and style-conscious consumers. This dual-model approach ensures that the smart glasses appeal to a broad audience, from those seeking innovative technology to individuals prioritizing fashion and practicality.

The Galaxy AI Smart Glasses are expected to operate on the Android XR platform, a specialized version of Android designed for extended reality (XR) applications. The hardware details reported so far highlight Samsung's focus on practicality, ensuring the glasses are both functional and user-friendly.
The combination of advanced hardware and intuitive software positions the Galaxy AI Smart Glasses as a versatile tool for applications ranging from productivity to entertainment. The glasses are also a cornerstone of Samsung's broader vision to advance multimodal AI experiences: by incorporating voice, vision and gesture recognition, they aim to deliver a cohesive and intuitive user experience. This initiative aligns with Samsung's ongoing investment in AI technologies, including the development of HBM4 memory chips designed to support high-performance AI applications. The development of the Galaxy AI Smart Glasses builds on the foundation established by Samsung's Galaxy XR headset, which debuted in October 2025. The XR headset provided valuable insights into user preferences and technical challenges, informing the design and functionality of the new smart glasses. By addressing the limitations of its earlier products, Samsung aims to deliver a more refined and versatile AR experience that caters to a wider range of user needs and scenarios. The release of the Galaxy AI Smart Glasses represents a pivotal moment in the evolution of AR technology: by combining advanced hardware, AI-driven software and stylish design, Samsung is setting a new benchmark for wearable devices. Whether enhancing productivity, enabling immersive entertainment, or simply offering a new way to stay connected, the Galaxy AI Smart Glasses are poised to redefine the possibilities of AR wearables.
Disclosure: Some of our articles include affiliate links. If you buy something through one of these links, Geeky Gadgets may earn an affiliate commission. Learn about our Disclosure Policy.
[10]
Apple Glasses 2026: Everything We Know About the Late-Year Launch and Siri 2.0
Apple is reportedly advancing its efforts to develop augmented reality (AR) glasses, a product that could redefine the wearable technology landscape. Expected to debut as early as 2026, these glasses are anticipated to feature Apple Intelligence, the company's next-generation AI assistant, integrating seamlessly with Apple's ecosystem. This strategic move positions Apple as a formidable competitor in the AR market, aiming to balance innovation with user accessibility. The release of Apple Glasses is closely linked to the introduction of Apple Intelligence, which is expected to launch in 2026. While the glasses may be announced that year, their actual availability could extend into 2027. Apple appears to be prioritizing a meticulous development process, ensuring that both hardware and software are fully optimized before launch. This deliberate approach underscores Apple's commitment to delivering a polished, reliable, and user-friendly product, avoiding the pitfalls of rushed releases and meeting the high expectations of its user base. Apple is rumored to be developing two distinct versions of its AR glasses, catering to a diverse range of users and preferences. This dual-version strategy reflects Apple's intent to appeal to both everyday consumers and those seeking state-of-the-art AR functionality, ensuring a broader market reach. The Apple Glasses are also expected to incorporate a range of innovative features, in line with Apple's reputation for user-centric design and advanced technology, blending practicality with innovation so that the glasses are both functional and forward-thinking. Apple's pricing strategy aims to strike a balance between affordability and its premium brand image.
Current pricing estimates suggest Apple intends to make the glasses accessible to a broader audience while maintaining its reputation for delivering high-quality, premium products. The AR glasses market is becoming increasingly competitive, with major players such as Meta, Google and Samsung vying for dominance, and Apple's entry into this space will directly challenge existing products, particularly Meta's AR offerings. By focusing on competitive pricing, seamless integration with its ecosystem and a robust app library, Apple aims to carve out a significant share of the market. Apple's ability to leverage its existing ecosystem, including devices like the iPhone, iPad and Apple Watch, could provide a significant advantage; this integration ensures a cohesive user experience, setting Apple apart from competitors who may lack such a unified platform. Reports indicate that Apple has redirected resources from its Vision Pro project to prioritize the development of its AR glasses, a strategic shift that underscores the importance of the glasses in Apple's long-term product roadmap and its commitment to making them a cornerstone of its wearable tech lineup. The Apple Glasses represent a significant step forward in wearable technology, combining augmented reality, artificial intelligence and user-focused design. With a potential release as early as 2026, these glasses could transform how you interact with digital content in your daily life. By offering multiple versions, competitive pricing and a focus on accessibility, Apple is positioning itself to lead the AR glasses market. As the launch date approaches, Apple's ability to deliver on its vision for the future of augmented reality will be closely watched.
The success of the Apple Glasses could not only redefine the AR market but also set a new standard for wearable technology, further solidifying Apple's role as an industry leader.
[11]
Apple Glasses latest leak reveals four designs and multiple colour options
Premium acetate material and a new camera layout are reportedly under evaluation. Apple is reportedly working on its first display-free smart glasses, with multiple design options and a distinct camera system. The device would differ from the Apple Vision Pro, lacking augmented reality (AR) features, and would be closer to Meta's Ray-Ban smart glasses. But since it will be an Apple product, it would have deep integration with other ecosystem devices, such as hearing iOS notifications through the glasses' speakers. Apple's upcoming smart glasses, internally codenamed N50, will be simpler in build and not full AR eyewear. As mentioned, they could look similar to Ray-Ban Meta Smart Glasses, which also offer cameras, audio, and voice assistance without a display, and they are designed for everyday use. Apple is reportedly testing at least four frame designs, and the glasses could come in multiple finishes, such as black, ocean blue, and light brown. A key differentiator could be the use of acetate, a higher-end material typically found in premium eyewear, which suggests that Apple is prioritising build quality and aesthetics. Another notable change is the camera system: Apple is said to be testing vertically oriented oval lenses with surrounding lights, reportedly for privacy reasons, so bystanders can tell they are looking at a pair of smart glasses. Apple's approach could rely on tight integration with the iPhone ecosystem. The glasses could be heavily dependent on an iPhone for processing and connectivity, much like the Apple Watch at launch. This aligns with Apple's broader AI wearables strategy, which reportedly includes upgraded AirPods and a camera-equipped pendant.
The goal is to build a network of devices that feed real-world context into Apple Intelligence and Siri. Apple is entering late, but since it controls both hardware and software, its offering could differentiate on user experience. Pricing would be crucial for adoption, too. Meanwhile, fully functional AR eyewear could still be several years away, as there is still a lot of work needed around battery life, processing power and form factor. The report also notes that Apple's foldable iPhone remains on track for a September announcement, while a long-anticipated leadership transition in its AI division is nearing completion. For Apple, the foldable represents one of the biggest design changes to the iPhone in years, and market expectations are high. In the near term, it could be the more impactful product, especially in the premium segment. But over the longer term, Apple's wearable AI strategy, including these smart glasses, may define its next phase of growth, if executed well.
Apple is testing at least four frame styles for its upcoming AI-powered smart glasses, targeting a 2027 launch. The designs range from large rectangular Wayfarer-style frames to smaller oval options, all made from acetate. What sets Apple apart: a prominent camera indicator light system designed to address privacy concerns that have plagued Meta Ray-Bans and other smart glasses.
Apple is testing at least four distinct frame designs for its first AI-powered smart glasses, according to Bloomberg's Mark Gurman [1]. The designs include a large rectangular frame similar to Ray-Ban Wayfarers, a slimmer rectangular style comparable to frames worn by CEO Tim Cook, a larger oval or circular design, and a smaller, more refined oval variant [2]. The frames are being constructed from acetate rather than standard plastic, positioning them as a more premium alternative to existing options in the market [2].
Source: AppleInsider
The smart glasses, internally code-named N50, will feature cameras for photo and video capture, plus microphones and speakers for handling phone calls, listening to notifications, and playing music [1]. Apple is targeting production to begin in December 2026, with a public launch expected in spring or summer 2027 [2]. The breadth of frame designs being tested indicates Apple has not yet committed to a single visual language for the product and is gauging which combination attracts the widest set of potential users [2].
Source: Wccftech
What distinguishes Apple's approach from competitors like Meta is a focus on addressing privacy concerns through hardware design. The company is experimenting with a camera setup featuring vertically oriented oval lenses with surrounding lights, a configuration intended to make recording activity more visible and harder to conceal [1][4]. This design choice directly addresses the "creepy" reputation that has limited smart glasses adoption, particularly concerns about being recorded without consent [4].
Source: CNET
A report by WIRED highlights how users of Meta Ray-Bans have attempted to bypass privacy safeguards, with third-party sellers even promoting accessories like "ghost dots" designed to dim or block recording indicator lights [4]. By making the recording indicator more visible and integrated into the design at the hardware level, Apple appears to be removing ambiguity and building user trust from the ground up [4].

Apple's long-standing focus on data privacy gives it a potential advantage in the AI wearables space. The company has consistently promoted privacy with its tech, including creating Private Cloud Compute to ensure user data remains private even when processed on remote servers [3]. This privacy-first approach could prove decisive, as smart glasses require always-on access to visual information about users' lives to provide contextual AI assistance [3].

The smart glasses represent just one component of Apple's three-pronged AI wearable strategy, which also includes AI-powered AirPods with cameras and an AI pendant [5]. Apple wants to use this hardware to feed a view of the user's surroundings into Apple Intelligence, creating contextual awareness for Siri interactions based on what it can "see" in the world [5].

The glasses will include two cameras: a primary camera handles photo and video capture, while a second is dedicated to computer vision, providing Siri and Apple Intelligence with real-time environmental context without requiring the iPhone to be raised or unlocked [2]. The device uses the N401 chip, a custom low-power processor derived from Apple Watch S-series architecture, optimized for on-device inference within the thermal and battery constraints a pair of frames can accommodate [2].

The version of Siri these glasses will run is the overhauled assistant Apple announced in January 2026, powered in part by a custom Gemini model developed through Apple's partnership with Google [2]. Siri will handle notifications, music playback, phone calls, live translation, and visual intelligence queries about the wearer's surroundings [2]. However, the first version will not include a display, meaning all information reaches the user through speakers or the iPhone screen [2].
Apple is entering a smart glasses category that Meta has commercially validated over the past two years. Meta sold more than seven million Ray-Ban and Oakley AI frames in 2025, more than tripling its 2024 volume [2]. Research from Counterpoint Research shows the smart glasses market grew 139% year-over-year in the second half of 2025 compared with 2024, with Meta's AI smart glasses portfolio credited for the growth [1].
Google is also preparing to enter the category in 2026 through its Android XR platform [2]. Apple's delayed rollout could damage the perceived utility of its tech compared to rivals, especially as competitors are expected to have released a generation or two of augmented reality or display glasses by the time Apple debuts its display-free AI specs [3]. Apple is targeting a sub-50g weight and all-day battery life, with colors confirmed in testing including black, ocean blue, and light brown [2].