28 Sources
[1]
Apple's Trio of AI Wearables Could Arrive as Soon as Next Year
It seems like every big tech company is wading into the AI wearables game, and Apple is joining in. According to reliable Bloomberg reporter Mark Gurman, a trio of Apple wearables is in the works, arriving anywhere between later this year and late next year. The three reported devices are glasses, a pin, and more advanced AirPods. All of these have been reported on previously in various stories, but Gurman's latest update suggests how they'll work in tandem with each other. That's exactly what I was expecting; in fact, I wrote about it a few weeks ago in a story about all the AI and wearable reports that were popping up. Apple's recent deal with Google to have Gemini power its next wave of Apple AI always suggested wearables to me, because Gemini's camera-enabled and live modes are key to the wave of glasses Google is bringing later this year. Apple could be using Gemini to leap ahead in camera and context-aware live AI functions that could make these new wearables work. The AI glasses in Gurman's report are more like existing display-free Meta Ray-Bans, equipped with cameras, microphones and speakers. According to Gurman, two camera sensors will split photo/video and sensory awareness duties. The designs and frames may be Apple-made, something I expected could be the case based on how Apple approaches its other products. Apple also has its retail stores, where you could try on these glasses and get them sized. I'd expect a similar set of features to what Meta and Google are already bringing: recording, assistive features like guidance, translation and audio captioning, and audio features that would make them work like glasses-shaped AirPods. The glasses are expected next year, but if that's the case, I'd expect Apple to give an advance preview of them this year, similar to what it did with the Vision Pro and the original Apple Watch.
It's always made sense for Apple to make glasses, considering its excellence in camera and audio tech. And as I said before, the pieces for glasses are already distributed across a variety of Apple devices, in a sense. The AirPods update expected later this year looks like it'll be focused on adding infrared cameras to the AirPods Pro 3 design. These cameras should work for gesture tracking, not taking pictures: the tracking will probably work a bit like the Vision Pro's near-range hand tracking does, and likely in the dark too. If these do arrive in 2026, they'll be Apple's first piece of the next wearable puzzle. Most likely, they'll push out some of the AI functions on AirPods and allow some simple hand gestures that could control music or interact with workouts. They could also be a way of testing hand tracking technology that could land on the glasses, too. Gurman emphasizes that a camera-enabled AI pin is also in the works, with a release in 2027 ... although there's a chance this pin might not happen at all. This mirrors a report from The Information earlier this year. Most other tech companies have been gravitating to pendants and pins lately, judging by announcements made at CES in January. Pins could be more versatile to wear, and maybe they're a bit of a hedge in case glasses don't catch on with everyone. Maybe Apple doesn't actually make all three. Or maybe it does. The report sounds less certain on the pin's arrival than on the glasses and new AirPods. Apple's pin clearly sounds like an iPhone accessory rather than an independent device like the failed Humane AI Pin, focusing on assistive camera and audio functions (and maybe a hand-tracking interface). None of these devices would have displays, but down the road, Apple could find a way to blend a lower-cost Vision headset in a more glasses-like form with these types of smaller wearables.
Or maybe Apple is exploring new interfaces on wearables first, before solving for a new wave of wearable displays beyond the Vision Pro. Odds are we won't hear anything about any of this until at least Apple's WWDC developer conference in June, but I'm now extremely curious how much of these plans might emerge then ... considering Google could be readying its own lineup of glasses soon, with Google I/O on deck for late May. Apple didn't immediately respond to a request for comment.
[2]
Apple Ramps Up Work on Glasses, Pendant, and Camera AirPods for AI Era
Apple Inc. is accelerating development of three new wearable devices as part of a shift toward artificial intelligence-powered hardware, a category also being pursued by OpenAI and Meta Platforms Inc. The company is ramping up work on smart glasses, a pendant that can be pinned to a shirt or worn as a necklace, and AirPods with expanded AI capabilities, according to people with knowledge of the plans. All three devices are being built around the Siri digital assistant, which will rely on visual context to carry out actions. Each of the products, which will be linked to Apple's iPhone, depends on a camera system with varying capabilities, said the people, who asked not to be identified because the plans haven't been announced. A spokesperson for Cupertino, California-based Apple declined to comment. The AirPods and pendant are envisioned as simpler offerings, equipped with lower-resolution cameras designed to help the AI work rather than for taking photos or videos. The glasses, meanwhile, will be more upscale and feature-rich. In an all-hands meeting with employees earlier this month, Chief Executive Officer Tim Cook hinted that the company would be pushing hard into AI devices, saying Apple is working on new "categories of products" that are enabled by artificial intelligence. "We're extremely excited about that." Cook added that the company was investing in new technology. "The world is changing fast," he said. While iPhone sales remain robust, Apple is playing catch-up in AI. Revamping Siri has been a key challenge: Upgrades to the voice assistant have been plagued by development snags, delaying their rollout. The company is preparing a version of the assistant for iOS 27, due later this year, that will feature a chatbot-like interface. Apple will rely on underlying models co-developed with Alphabet Inc.'s Google. In the longer run, AI is expected to change the way consumers use phones -- with more activities shifting to peripherals. 
Meta's glasses have already become a hit, and OpenAI is developing a series of devices, including wearables, with the help of ex-Apple design chief Jony Ive and other former Apple executives. Apple has been trying to find a winning formula in this area. Its last major push into a new category, the pricey Vision Pro headset, didn't resonate with consumers. The company is looking for a breakthrough with its accelerated push into wearable devices, aiming to keep users locked into the Apple ecosystem.

Smart Glasses

The smart glasses are planned to be positioned as an advanced offering in the company's AI hardware lineup, intended to compete with Meta's camera-equipped eyewear. They would include a high-resolution camera capable of capturing photos and video. Apple has made significant progress in recent months on its glasses, code-named N50, and has recently distributed a broader set of prototypes within its hardware engineering division. The company is targeting the start of production as early as December, ahead of a public release in 2027. Like most of Meta's current offerings, the glasses won't include a display. Instead, the interface will rely on speakers, microphones and cameras -- letting users make phone calls, access Siri, take actions based on surroundings, play music and take photos. Apple aims to differentiate the product in two key areas: build quality and camera technology. Employees say the company initially developed the hardware by embedding electronics and cameras into off-the-shelf frames from a variety of popular brands. Apple at one point even discussed relying on partnerships to launch the product, following a broader industry trend. Meta works with EssilorLuxottica SA, while Google has teamed up with Warby Parker Inc. More recently, however, Apple decided to develop its own frames in-house in a variety of sizes and colors.
Early prototypes of the glasses connect via a cable to a standalone battery pack and an iPhone, but newer versions have the components embedded in the frame. The design uses high-end materials, including acrylic elements intended to give the glasses a premium feel. Apple is already discussing launching the device in additional styles over time. The glasses will include two camera lenses: one for high-resolution imagery and another dedicated to computer vision -- a technology similar to what's used in the Vision Pro. The second sensor is designed to give the device environmental context, helping it more accurately interpret surroundings and measure distance between objects. The goal is for the glasses to function as an all-day AI companion, capable of understanding what a user is seeing and doing in real time. Wearers could look at an object and ask what it is and get assistance with everyday tasks. That could mean inquiring about ingredients in a meal, for instance. Apple is also exploring more advanced uses. The glasses could read printed text and convert it into digital data -- say, by adding the information on an event poster directly to a calendar. The device also could create context-aware reminders, such as prompting a user to grab an item when they're looking at the right shelf in a grocery store. For navigation, Siri could reference real-world landmarks -- rather than just giving more generic instructions. The assistant could tell users to walk past a described building or vehicle before making a turn. Apple already has some visual AI capabilities, including the Visual Intelligence feature for analyzing images on iPhones, but the technology would be more accessible.

Pendant and AirPods

Of course, some users prefer not to wear something on their face -- especially if they don't already have glasses. Apple is aiming to serve that market with its other wearable AI devices: the pendant and camera-equipped AirPods.
Apple's industrial design team hatched the pendant idea while working on the glasses -- before they had settled on a design for that product. The device is reminiscent of the failed Humane AI Pin, but it's designed as an iPhone accessory rather than a standalone product. The pendant would essentially serve as an always-on camera for the smartphone that also includes a microphone for Siri input. Some Apple employees call it the "eyes and ears" of the phone. While Apple's industrial design team is leading the strategy for the product, Apple is also leaning on the Vision Products Group that developed the Vision Pro for the engineering. That group is working on the smart glasses as well. Unlike the Humane AI Pin, the Apple device lacks a projector or a display system. It's also designed to rely heavily on an iPhone for processing. Though it has a dedicated chip, the system is closer in computing power to AirPods than an Apple Watch. One area of debate for the product has been whether to include a speaker, which would allow users to hold back-and-forth conversations with the device directly -- letting them leave the iPhone in a pocket or bag and skip AirPods entirely. Apple is working to allow users to wear the AirTag-sized pendant in two primary ways: with a clip that can attach to clothing or via a necklace threaded through a hole in the hardware. The Information previously reported on aspects of the pin project, which remains early-stage and could still be canceled. If Apple moves forward with the device, it could launch as early as next year. The plans for the other products also remain fluid. The company has previously stopped work on other devices, including updated versions of the Apple Watch with embedded cameras. Testers found the concept impractical due to clothing sleeves and the difficulty of capturing usable camera angles from the wrist.
The AirPods, planned for as early as this year, have been in development for a while, with Bloomberg News first reporting in early 2024 that Apple was exploring camera-equipped earbuds. The company has steadily added AI features to the product, including a new live-translation mode introduced last year. Down the road, Apple aims to create smart glasses with an augmented reality display, giving users access to richer data and visuals. But a potential launch remains many years away. The company stopped development last year of a cheaper and lighter version of its Vision Pro headset dubbed N100. It was meant to be a bridge toward the AR devices, but Apple ultimately chose to focus on glasses rather than a more enclosed headset design. Beyond wearables, Apple is developing a range of AI devices for the home. That lineup includes a smart display built around the company's upcoming Siri revamp and a later version with a larger screen and robotic arm. The company is also working on an updated HomePod speaker and a compact indoor sensor for home security and automation.
[3]
Apple’s AI Gadgets Don’t Sound Groundbreaking at All
Apple seems to be inching toward AI gadgets, and if a recent report from Bloomberg is any indication, lots of them will have one thing in common: they'll use "Visual Intelligence." In case you're not brushed up on your Apple branding, "Visual Intelligence" is the company's version of computer vision -- an AI feature that gives gadgets "sight," so to speak. According to Bloomberg, Apple wants Visual Intelligence to be a defining feature across a range of hardware, including a new generation of AirPods with cameras, Apple's first pair of smart glasses, and even an AI pendant that sounds weirdly reminiscent of a failed Ai Pin made by Humane. What exactly will computer vision be doing in those gadgets? Well, apparently, the exact same stuff that it does in other gadgets. Per Bloomberg: "...the most basic applications could involve taking a plate of food and identifying the items and ingredients. More advanced uses include the device giving specific instructions for conducting a task based on what it sees. That might mean upgraded turn-by-turn directions, with the device telling a user to go past a specific landmark -- rather than just a certain number of feet. The technology also could remind users to do something when they walk up to a certain object or place." If you're at all familiar with computer vision and how it works in gadgets like smart glasses, you probably read the above and got a little déjà vu. Computer vision is a defining feature of popular smart glasses like the Ray-Ban Meta AI glasses and can be used for quite a few things, like translating text on a food menu, identifying objects in your environment, and giving you instructions for a recipe while you're cooking. While I'll concede that the use case for navigation would be novel, Apple seems to be on the exact same track as Meta and other companies squeezing computer vision capabilities into their hardware.
Whether Apple would have any more success in making computer vision -- er, Visual Intelligence -- work in AI gadgets is anyone's guess. While computer vision is arguably among the more futuristic and novel features of smart glasses, it's also one of the least reliable and, oftentimes, the least applicable to your daily use. In my experience using the Ray-Ban Meta AI glasses, computer vision has a habit of getting stuff wrong (you can read my review of the Meta Ray-Ban Display for specific examples), which makes it hard to trust and even more difficult to incorporate into your day-to-day use as a result. I still think the technology could be great for accessibility purposes, but that's not exactly what Apple is pitching here. While there's a chance that Apple is working toward some kind of breakthrough on the computer vision front that would make Visual Intelligence more reliable and useful, it's done little, so far, to show progress. As Bloomberg notes, the existing Visual Intelligence features inside iOS, for example, are reliant mostly on OpenAI's ChatGPT and, in the near future, Google's Gemini. Models offered by those companies, in my experience, are just as fallible as the rest. A lot can happen between now and when Apple finally decides to start rolling out its AI-centric hardware (late this year at the soonest), but for now, it would appear that AI gadgets are a bit stuck on how/when computer vision can be used -- or at least stuck on making those scenarios feel functional. Apple's vision for Visual Intelligence may sound a little more useful than OpenAI's reported smart speaker with a camera, but that's a pretty low bar.
[4]
Apple's upcoming AI smart glasses are starting to sound a lot more exciting - 9to5Mac
Earlier this week, Bloomberg reported that Apple will be "accelerating" the development of three upcoming AI wearables: smart glasses, a pendant, and AirPods with cameras. These three products are all meant to integrate Siri deeper into our everyday lives, and I'll be focusing on the glasses aspect of it. Obviously, Apple has always had an ambition to make glasses. The idea of Apple making AR glasses has been in the rumor mill for ages. For now, that's on pause - and Vision Pro will have to do. In the meantime, the company is pursuing AI glasses, similar to Meta Ray-Bans. Meta Ray-Bans have been a hit since they launched toward the end of 2023. Despite Meta's reputation as a privacy-invading company, loads of people have been excited to have a pair of glasses from Meta with cameras on them. At their core, they have cameras, microphones, and speakers - allowing users to talk to Meta AI about their surroundings, listen to music, or take photos and videos. Starting last year, we began to hear that Apple was working toward releasing its own version of Meta Ray-Bans in the coming year. Now, development has advanced significantly. Apple is reportedly integrating two camera lenses: one for computer vision, and another for taking photos and videos. The company has also figured out how to embed all of the components in the frame, after initially planning to rely on an external battery. With products that rely on voice for communication, it's easy to see how it might not always be practical to speak out loud. This is why I largely don't use any of the voice assistant features on my Meta Ray-Bans. Recently, though, Apple acquired a new startup for $2 billion: Q.ai. While we don't know loads about this company, we do know one thing quite clearly - it specialized in machine learning systems for interpreting silent voice input. Right now, if you want to speak to a voice assistant, it has to be pretty audible.
Even whispers can be hard at times for certain voice models, especially when you aren't in a completely silent environment. This new acquisition could solve that. Q.ai also researched systems for interpreting micro facial movements, enabling them to understand speech without it being audible at all. If this all comes together nicely, I think Apple Glasses will be incredibly appealing to loads of people, and might make people take voice assistants more seriously. Ultimately, I'm sure Meta Ray-Bans will end up being much cheaper than Apple's AI glasses. However, if Apple is truly able to stick the landing with next-level speech recognition technology, I think a lot of people will be willing to overlook the price difference. While we don't know a concrete release date, it seems likely that they'll be released within the next year or so. What do you think of Apple's AI glasses? Are you excited for them, or will you be skipping them? Let us know in the comments.
[5]
Apple smart glasses may have a secret weapon against Meta Ray-Bans -- they can understand what you're looking at
As Apple reportedly doubles down on its efforts to bring in a trifecta of AI wearables, its anticipated smart glasses are at the forefront of its push to take on Meta Ray-Bans. And it already has just the feature to do so: Visual Intelligence. The rumored Apple AI Glasses will rely on the company's visual AI to "see" the world, as pointed out by Bloomberg's Mark Gurman. Part of Apple Intelligence, Visual Intelligence can look at objects, read text and get information about them. And it's fittingly gearing up to be heavily integrated into Apple's first take on smart specs. Visual Intelligence is surely set to power Apple's next era of AI devices, and it's looking like we'll see it in action in AI smart glasses first. If this is the case, it will be the first big stepping stone to take on the best smart glasses on the market today -- catching up to the likes of the latest Ray-Ban Meta glasses.

What is Apple Visual Intelligence?

Apple's Visual Intelligence is an AI feature that lets users analyze on-screen content and the world around them through the camera. It can recognize objects and offer information about them, answer questions about what it sees, and allow real-time translation, instant web searches and more. It debuted on the iPhone 16 Pro, and it's now available on the iPhone 15 Pro and up. It relies on OpenAI's ChatGPT for results, and it's become a quick way to find out details on anything you take a picture of. Essentially, it's Apple's version of Google Lens, and it makes sense that this Apple Intelligence feature would come to the Cupertino tech giant's rumored smart glasses. Interestingly, though, it's looking to be the main feature powering the specs.
Visual Intelligence takes the lead

During Apple's earnings call, Gurman notes that CEO Tim Cook heavily hinted at his interest in Visual Intelligence, which is a clear sign that there's more to come from the feature: "One of our most popular features is Visual Intelligence, which helps users learn and do more than ever with the content on their iPhone screen, making it faster to search, take action and answer questions across their apps," Cook stated. With word of Apple's potential AI wearables all fitted with cameras, including an AI pendant and AI AirPods, it makes sense that Apple would utilize a technology it's already using. And it's also set to improve. According to the report, Apple is working on its own AI visual models, which could coincide with the eventual release of Siri 2.0 in 2026. These would take full advantage of the expected built-in cameras in the smart glasses, which are tipped to be used to snap photos and record video. Specifically, rumor has it that there will be a high-resolution camera in the Apple AI smart glasses for pictures and video, along with one dedicated to Visual Intelligence. If Apple does make Visual Intelligence on its smart specs a priority, it would be interesting to see what other features it could offer in a new form factor that isn't an iPhone.

Catching up to Meta

With the tools that Apple's Visual Intelligence offers on smart glasses, this still wouldn't be a bold new feature coming to specs, as Meta already has its own form of visual AI onboard. As with the Ray-Ban Meta (gen 2) and Meta Ray-Ban Display, these specs are able to "see" what you're looking at -- able to identify objects, bring up information about them and answer questions. They can also help with navigation in maps, which is handy (though there's still room for improvement in that area). And yes, you can expect these features on Google's Android XR glasses, too.
Regardless, bringing Visual Intelligence to the anticipated Apple Glasses is a no-brainer, and it shows the Cupertino team already has the means to deliver smart specs that catch up with the competition when they're tipped to arrive at the end of 2026 or 2027. Cook seems certain that Visual Intelligence will flourish, and Apple's smart glasses look to be the device to show off this vision.
[6]
Visual Intelligence could be Apple's killer AI wearable feature
Visual Intelligence can identify objects and integrate with apps while maintaining privacy through on-device processing and Private Cloud Compute. Mark Gurman's latest Power On newsletter has several interesting tidbits about upcoming Apple products, but perhaps the most fascinating concerns Apple's plans for future AI-powered wearables. We've heard about these before -- Apple is working on smart glasses (similar to the Meta Ray-Bans), AirPods Pro with cameras, and some sort of pin/pendant item. All are at various stages of development, and all of them will apparently lean heavily on Visual Intelligence. That's Apple's brand for the application of AI to things your device's camera sees. It launched as part of the iPhone 16 Pro and then came to other devices with expanded capabilities. You can take a photo of something around you to get contextual information about it, or even take a screenshot and do the same. You can ask ChatGPT about the subject as well, and the system is smart enough to change your options contextually. If you're looking at an event poster with dates and times, you can simply add it to your calendar. If it's a restaurant, you can look up reviews, hours, or the menu. You can identify plants or animals, and do a Google image search to find similar objects online. Apparently, Tim Cook sees this area of AI technology as central to Apple's upcoming AI devices. Apple is building its own visual models and intends to make this technology -- contextual awareness based on what the AI "sees" -- a central pillar of future devices. For example, you could simply look at your plate of food to get information on ingredients, portions, or nutritional info. Turn-by-turn directions could use visual landmarks instead of just street names or distances. Reminders could be triggered by walking up to and seeing something, not just by times and locations. Cook has been singling out the feature in recent appearances.
He gave it a shout-out at the company's last earnings call, and at an all-hands meeting in which he discussed the company's AI ambitions. It's a little odd to bring it up so consistently when it's not exactly new and hasn't changed much in the last year or more. Clearly, the technology is on his mind, likely because he's focused on the company's upcoming new products. Obviously, privacy is central to AI that is processing what it sees around you. And in this area, Apple has an advantage -- strong neural processors across billions of devices enable more on-device processing than most competitors can offer, and the company's Private Cloud Compute architecture ensures that anything processed in the cloud protects your privacy by design, too.
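Mechanically, the "walk up to something, get reminded" behavior described above reduces to matching the labels a vision model recognizes against the user's pending reminders. Here is a toy sketch in Python; the function name and data shapes are invented for illustration, and nothing here reflects Apple's actual (unpublished) implementation:

```python
def due_reminders(seen_labels, reminders):
    """Return the reminders whose trigger object appears among
    the labels the vision model just recognized."""
    # Case-insensitive match between what was seen and each trigger.
    seen = {label.lower() for label in seen_labels}
    return [text for trigger, text in reminders.items()
            if trigger.lower() in seen]

# Hypothetical pending reminders, keyed by the object that triggers them.
reminders = {
    "olive oil": "Buy olive oil",
    "mailbox": "Mail the rebate form",
}

# Glancing at a grocery shelf:
due_reminders(["shelf", "Olive Oil", "pasta"], reminders)  # → ["Buy olive oil"]
```

A real system would also need debouncing (so a reminder doesn't fire on every camera frame) and fuzzier label matching, but the core lookup is about this simple; the hard part is the vision model producing reliable labels in the first place.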
[7]
Apple's AI Wearables Expected to Lean Heavily on Visual Intelligence
Apple's Visual Intelligence is expected to feature heavily in the company's upcoming set of AI wearable devices, which could include smart glasses, a pendant, and more advanced AirPods, according to Bloomberg's Mark Gurman. Writing in his latest Power On newsletter, Gurman said that hints dropped by CEO Tim Cook in recent months suggested the Apple Intelligence feature would be central to the devices, with Cook's comments following a pattern similar to how he foreshadowed the importance of health sensors and augmented reality before the launch of Apple Watch and Apple Vision Pro, respectively. On iPhone 15 Pro and newer models, Visual Intelligence lets you use the camera to learn more about places and objects around you. It can also summarize text, read text out loud, translate text, search Google for items, ask ChatGPT, and more. Gurman has previously reported that Apple's upcoming smart glasses will have an advanced camera system with a high-resolution camera that's able to capture photos and videos, as well as a second camera that provides visual information to Siri and environmental context. Meanwhile, the AI pin - should the device make it to launch - is said to have a lower-resolution camera to provide the AI with visual insight, but it won't be able to take photos or videos. The camera is always-on, recording what's around the wearer. Like the AI pin, the more advanced AirPods will have a low-resolution camera that's designed for information, rather than photo capture. During a discussion about AI and Apple Intelligence on the company's holiday quarter earnings call, Cook touted Visual Intelligence as "one of our most popular features." Cook said it "helps users learn and do more than ever with the content on their iPhone screen, making it faster to search, take action and answer questions across their apps." 
On another occasion, during a recent all-hands meeting with employees about AI, the Apple chief reportedly singled out Visual Intelligence as a standout element of Apple Intelligence - even though the feature relies heavily on OpenAI and Google technologies. Gurman argues that Cook "wouldn't be putting it at the forefront of his remarks if things weren't going to accelerate in that area soon." Apple's smart glasses will compete with the Meta Ray-Bans. Apple is said to have recently provided its hardware engineering team with prototypes, and it is targeting a 2027 launch. Production on the glasses could begin as soon as December 2026. AirPods with cameras are planned for as early as this year, while Apple's work on the AI pin is apparently in the early stages, and it's possible that it could still be canceled. If work continues, the AI pin could launch as soon as 2027.
[8]
Apple's AI Pendant Sounds Like a Watered-Down Humane Ai Pin
If you were tempted to make the comparison between Apple's rumored AI pin and Humane's notorious Ai Pin, let me stop you right there. There's (reportedly) a big difference, and that difference is that Humane's defunct AI gadget actually tried to be something other than an accessory to your phone. According to Bloomberg, sources within Apple say that the company is indeed working on an "AI pendant" that you can pin to your shirt or wear as a necklace. For all intents and purposes, it sounds a lot like the pin formerly made by Humane. It reportedly has a camera for computer vision, a microphone so it can be operated by Siri, and potentially a speaker so that you could have a back-and-forth conversation with Siri as well (this last one is reportedly up in the air, though). There is one thing that's different between the two, though, according to Bloomberg's Mark Gurman, and that's the fact that Apple's AI pendant likely won't do much of anything on its own. Apple's pinnable AI gadget will allegedly "rely heavily on an iPhone for processing" and will include a chip that is "closer in computing power to AirPods than an Apple Watch." Translation? It's not going to be doing any heavy lifting. That's probably fine, considering most people don't have any expectations for an AI pin to begin with, but it is a little deflating to know that Apple is seemingly taking the idea of Humane's Ai Pin and scaling it back. Maybe it's for the best. Humane's Ai Pin, as a quick recap, didn't really do a lot of what the company promised -- it was buggy, prone to overheating, and it also cost a lot of money. On top of the $699 price tag for buying the pin, you were required to pay a $24 monthly subscription for cellular service in order to use the device while out and about. No one said killing the smartphone was going to come cheap, okay?
Theoretically, Apple's version wouldn't be burdened by those issues as heavily, though, since it offloads computing to your iPhone and can also rely on the phone's wireless connectivity for internet. Whether Apple can convince people to buy an AI pendant is another story entirely. And speaking of uphill battles, Gurman also dropped more intel on Apple's upcoming smart glasses. According to the report, Apple will have another archetype in mind for its first pair of smart specs, and this time, that role model is Meta. Apple's alleged smart glasses will have two cameras, one for computer vision and another for pictures and videos. They'll reportedly have speakers and microphones, allowing wearers to make calls, play music, and snap pics and videos by using Siri. On top of that, they'll also apparently be able to use AI to do stuff like interpret text or provide turn-by-turn directions. If you're reading all of this and thinking, "Boy, that sounds awfully familiar," that's because those are mostly all things that Meta's AI glasses can already do. Even if that's true, though, Apple's smart glasses might still take the crown due to their tight integration with the iPhone. As useful as Meta's AI smart glasses can be, the process of interfacing with your phone still feels disjointed at times, through no fault of Meta's. Something tells me an Apple-made pair won't share that problem. Both the smart glasses and the AI pin are tenuous, to be clear. Even Bloomberg reports that the pin is in an "early stage" and "could be cancelled." While smart glasses are much more likely to be released, what form they take is fluid, and so is the release date. For now, though, it looks like what we can expect is a bit more of the same, albeit with some Apple flourishes.
[9]
Apple is reportedly working on AI smart glasses, AirPods that can see, and its own version of those disastrous AI pins
According to recent reports, Apple is going hard on the next generation of AI wearable technology. So far, even the best Apple Watches have been largely limited to health and fitness and a few communication features as an extension of your phone, while its AirPods Pro 3 have been enhanced with heart-rate detection and live translation. Now Apple is taking its wearable AI smarts one step further, according to Bloomberg's Mark Gurman, with a trio of wearables sporting AI cameras to provide 'contextual information' for its revamped Siri AI chatbot. The devices will function like the Meta Ray-Bans AI glasses, which let you ask Meta a question based on the world around you and interpret your request as best they can. Gurman reports that Apple is making its own AI glasses to take the fight to Meta, developing its frames in-house (rather than collaborating with an established glasses maker like Ray-Ban) with "an advanced camera system with a high-resolution camera that's able to capture photos and videos, as well as a second camera that provides visual information to Siri and environmental context" according to MacRumors. It's reportedly focusing on build quality to distinguish itself from Meta, which likely means the glasses will be a premium product. With multiple sizes and colors set to be available, MacRumors says Apple is aiming for an "all-day AI companion". The other two devices are expected to be an AI wearable pin - along the lines of the disastrous Humane AI Pin and other devices that have flopped in previous years - and AirPods fitted with cameras. Both devices are expected to have lower-resolution cameras designed for information rather than taking high-quality photos, with included microphones for speaking with Siri. Gurman adds that a speaker for the AI Pin is being considered, but isn't confirmed. The devices would reportedly act like 'the eyes and ears' of an iPhone, linked to Siri on your phone rather than using on-device AI. 
Apple has struggled to gain ground with splashy launches in areas of innovation in recent years, notably with the Apple Vision Pro. It's not that the device wasn't good; rather, the world wasn't ready for it (and it's expensive). However, if these rumors are true and we see the first hints of these devices in 2027, I have the feeling that Apple's smart glasses or AI wearables will do significantly better. Apple has historically done well in fields others have broken ground in. It's not often been a true trailblazer - it lets others take the initial risks, then aims to produce an improved model. Look at how long Apple has taken to launch a foldable phone, for example: something we know it's currently working on. With its smart glasses, Apple will look to fix the mistakes it sees Meta making, ironing out the pain points of the form factor and launching when potential customers have become used to the new concept of smart glasses. Likewise with an AI pin or pendant. The Humane AI Pin was its own device, and it flopped, with a strange LED-display system on a device designed to completely replace the user's phone. Apple doesn't need to replace the iPhone; by all accounts it wants to add to it, keeping it as a hub in your pocket while reducing the need to take it out all the time. AirPods with AI cameras would be a similar concept. It's all about making technology easier to use, and reducing friction between the user and their gadgets. That's one thing Apple has always excelled at, and I see no reason why these rumored devices wouldn't be successful.
[10]
Apple is developing AI smart glasses, AirPods, and pendant, report says
Tim Cook appears at Apple Park during WWDC 2025. Credit: Screenshot courtesy of Apple / YouTube Bloomberg's Mark Gurman, who seems to have a direct line into goings-on at Apple HQ, reported today that Apple is working on a trio of new AI wearables -- smart glasses, a pendant-style AI device, and AI-powered AirPods. Gurman reports that all three products would be "linked to Apple's iPhone," be built around Siri, and utilize a "camera system." Other companies have released smart glasses and AI pendants with onboard cameras, but wireless earbuds featuring cameras would be highly unusual. According to Gurman's reporting, the AI pendant and AirPods wouldn't use cameras for photography purposes, but only to power AI features. In addition, Gurman reports that Apple CEO Tim Cook told employees that Apple will be focusing heavily on AI devices. Of course, there's no guarantee the new AI devices will eventually land on store shelves, though Gurman's Apple reporting is usually rock solid. Smart glasses come in a variety of form factors, but Apple is rumored to be working on a pair of high-end glasses with a built-in display and camera system. Gurman says the glasses are called N50 internally at Apple, shedding new light on a product that's been rumored since early last year. Even though early AI wearables have largely failed to launch, big tech companies like Apple, OpenAI, Meta, Google, and Samsung are all racing to produce new AI wearables and/or smart glasses. Sales of Meta smart glasses reportedly tripled in 2025, and Google is expected to release its Android XR smart glasses by the end of the year. Recent advances in generative AI have made these types of devices more capable, such as by giving users the ability to translate foreign languages in real time. However, Apple fans may have to wait longer for AI glasses. As Mashable reported earlier today, market research company Omdia predicts that Apple smart glasses won't arrive until 2028. 
However, Gurman's report contradicts that forecast, with Bloomberg reporting that Apple AI glasses could go into production this year, ahead of a 2027 release. Apple has been playing catch-up in the artificial intelligence arms race, and the company recently tapped Google Gemini to power a long-awaited AI revamp of Siri. Even though Apple has fallen behind in developing AI technology, the company is still reporting record sales. On top of that, as competitors ramp up their capital expenditures to never-before-seen levels to build new AI infrastructure, Apple is keeping its new spending flat, putting it in an enviable position in 2026.
[11]
Apple's upcoming smart glasses could get dual cameras and a touch of luxury
New leak suggests Apple smart glasses will blend fashion and advanced imaging Apple is accelerating work on its long-rumored smart glasses, and new reporting suggests the device is shaping up to be one of the company's most ambitious entries in the personal AI era. Early prototypes point toward a premium, fashion-oriented wearable that blends compact hardware, advanced imaging capabilities and Apple's upcoming generation of AI features. Apple tests dual-camera smart glasses with a luxury-forward design According to details shared with Bloomberg, Apple's smart glasses are now in an advanced prototyping phase. The most striking development is the inclusion of dual cameras, a feature rarely seen in consumer eyewear. These cameras are expected to support depth perception, environmental scanning and real-world understanding - crucial for Apple's next wave of AI-driven features that rely heavily on visual context. The design itself leans toward a luxury eyewear aesthetic rather than a tech-heavy headset. Apple is reportedly testing multiple frame styles, including metal and glass combinations, with finishes that echo the premium sensibilities of its high-end Apple Watch models. Rather than positioning the glasses as an alternative to the Vision Pro, Apple sees them as a lightweight, all-day wearable that brings AI into everyday life without the bulk of mixed reality gear. Apple's strategy reflects a broader shift toward building an ecosystem of ambient, AI-enabled devices. The glasses would serve as a more discreet complement to Vision Pro, offering situational intelligence through the wearer's natural perspective. This aligns with the company's parallel development of camera-equipped AirPods and a pendant-style wearable, which together form a network of sensors designed to interpret the environment and enhance Siri's contextual awareness. 
The move signals Apple's intention to make personal AI a seamless, constant presence - much like the transition smartphones once made from occasional tools to everyday companions. Why the glasses matter for future Apple users The appeal for users goes far beyond novelty. A glasses-based form factor has the potential to revolutionise Apple's AI experience by allowing the system to actually see what the user sees. That unlocks capabilities like real-time translation, object recognition, hands-free note-taking, navigation cues and accessibility enhancements, all delivered without lifting a phone or speaking a command into thin air. This is also a pivotal moment for Apple's product roadmap. With smartphone growth slowing and wearables becoming a bigger revenue pillar, smart glasses offer a pathway into the next major computing platform. The device could appeal to users who want the advantages of AI-enhanced vision without adopting the fully immersive, and often socially awkward, experience of a headset. What's next as Apple refines its wearable AI ecosystem Apple has not finalised a release window, and as with all of its long-term hardware projects, the glasses may still undergo substantial changes before entering production. The company is also evaluating battery placement, weight and optical comfort, which have historically been challenges for smart eyewear. What's clear is that Apple is steadily assembling the pieces of a multi-device wearable AI ecosystem - one that includes smart glasses, camera AirPods and sensors that work together to understand the world around you. As the company prepares major updates to iOS and its AI architecture later this year, these glasses could become one of Apple's most influential steps toward a future where personal computing lives quietly on your face rather than in your pocket.
[12]
AppleInsider.com
In the face of competition from OpenAI and Meta, Apple is said to be redoubling its efforts to get a wearable pendant and camera-equipped AirPods right. In a bid to follow the AI marketplace trends, Apple is rumored to be working on a number of new products that take advantage of artificial intelligence. If a Tuesday report on the topic is correct, Apple is keen to make them a reality. According to Bloomberg, Apple held an all-hands meeting earlier in February. During that meeting, CEO Tim Cook said that Apple was working to push hard on AI devices, and that the company is "extremely excited" about the new categories of products. These new categories are beyond the usual expectations of smart glasses, as they include two items that have been brought up before: a pendant and camera-equipped AirPods. Apple Intelligence Pendant The pendant may not have been referred to in that way before, but it seems to be a reworking of the previously rumored wearable AI pin from January. This time, Mark Gurman describes it as a "pendant" that can be pinned onto a garment or worn like a necklace. Allegedly an idea that predates the often rumored Apple Glass smart glasses, it's an idea reminiscent of the Humane AI pin. However, it would be more an extension accessory for the iPhone, which would provide an always-on camera and microphone for Siri and Apple Intelligence to access. The pendant won't handle actual processing. That will still be an iPhone task. The Industrial Design Team is leading work on the product, but with input from the Vision Products Group over its engineering. There's still debate over whether to include a speaker, which could allow two-way conversations with Siri without taking out the iPhone or using AirPods. Despite the extra work, the project is still in its early stages, and may not ever see the light of day. If progress is made, it could potentially launch as early as 2027. 
AirPods with Cameras The AirPods with Cameras, which have been rumored for a few years, are a more plausible launch for Apple. Effectively smart glasses without the glasses element, the AirPods would have cameras to view the outside world. Apple has reportedly added low-resolution cameras to the AirPods, as well as the pendant. Those cameras are intended to give AI a view of the world rather than for photography or videography. As for when they may launch, the report says they could arrive possibly later in 2026.
[13]
Apple accelerating work on three new AI wearables, per report - 9to5Mac
According to a new report from Bloomberg, Apple is "accelerating" its development of three new wearable products, including smart glasses, an AI pendant, and camera-equipped AirPods. These products, the report says, are being "built around the Siri digital assistant." The smart glasses will reportedly feature a high-resolution camera for photo and video capture, designed to compete with Meta's smart glasses. Apple has reportedly made "significant progress" on these glasses in recent months. The glasses won't include a display, but rather will rely on speakers, microphones, and cameras. Apple's prototypes have two camera lenses. There's one for high-resolution images and videos, and another for "computer vision" that will "give the device environmental context." With environmental context, for example, the glasses could mirror functionality offered by Visual Intelligence on the iPhone. That context could also be used for more precise navigation in Apple Maps, Reminders integration, and more. "The goal is for the glasses to function as an all-day AI companion, capable of understanding what a user is seeing and doing in real time," the report says. Apple is reportedly developing its own frames for the glasses in a range of different sizes and colors. It explored partnerships with other companies, but ultimately decided to design everything in house. Mark Gurman reports: Early prototypes of the glasses connect via a cable to a standalone battery pack and an iPhone, but newer versions have the components embedded in the frame. The design uses high-end materials, including acrylic elements intended to give the glasses a premium feel. Apple is already discussing launching the device in additional styles over time. Apple is also working on an AI pendant device. 
The Information reported on Apple's AI-powered, AirTag-size wearable pin last month, describing it as a "thin, flat, circular disc with an aluminum-and-glass shell" with two cameras, a speaker, and three microphones. Bloomberg today adds to that reporting, saying the device is "reminiscent of the failed Humane AI pin," but with several key differences. For instance, it will not have a display or projector system. Whereas the Humane AI Pin tried to be a standalone product, Apple's AI pin will reportedly "rely heavily on an iPhone for processing." The pendant would essentially serve as an always-on camera for the smartphone that also includes a microphone for Siri input. Some Apple employees call it the "eyes and ears" of the phone. One area of debate for the product has been whether or not to include a speaker, which would allow users to hold back-and-forth conversations with the device directly. That means they could leave their iPhone in their pocket or bag or not wear AirPods. Apple envisions that users will wear the AI pin with a clip attached to their clothing or as a necklace. Gurman cautions that the product "remains early-stage and could still be canceled," but could launch as early as next year if Apple sticks to it. Finally, Apple is continuing its work on camera-equipped AirPods that could launch as early as this year.
[14]
Apple's AI wearable roadmap is getting wild: Prepare for AI pendant, smart glasses and AirPods with cameras
In fact, they may be coming as soon as the end of the year, with leaks suggesting the all-new Apple Glasses could arrive by December 2026. However, recent rumors indicate a 2027 launch, with the AI pendant and AI AirPods following suit. Here's a closer look at what to expect from Apple's new range of AI wearables. Previously, Apple's rumored AI device had been expected to be an AI pin, not unlike the Humane AI Pin. However, Gurman suggests it could be worn like a pendant, hence an Apple AI pendant. The AI wearable will reportedly be the same size as the recent AirTag 2, possibly taking the form of a flat aluminum-and-glass disc. With a clip, this will allow users to attach it to their clothing or bag, but reports also note there's a hole so it can be used as a necklace. As for what's under the (likely small) hood, the AI pendant will feature a camera for visuals, a microphone for sound and a dedicated chipset -- all of which will be used for users to interact with Siri via an iPhone. Notably, the cameras won't be used for taking pictures or video, but will be always-on to record the user's surroundings. According to a leak from The Information, the pendant will come equipped with two cameras, three microphones and a small speaker, along with a physical button. This would allow for voice and gesture controls, but it isn't certain if Apple will incorporate all these components. As Gurman states, the Apple AI pendant is looking to be the "eyes and ears" of the iPhone, so more of an accessory and less of a product that can be used by itself. For now, there's no telling when Apple plans to release its AI pendant, or how much it will cost, as it's reportedly in the early stages of development. If its rumored smart glasses end up arriving by 2027, we could see the pendant launch soon after. And, as an accessory, fingers crossed it won't cost as much as the now-dead $699 Humane AI Pin. 
Apple plans to catch up with Meta's Ray-Ban and Google's Android XR in the smart glasses race, and it's looking like we'll finally see its first pair of AI specs by the end of 2026, or in 2027. Apparently, production on these glasses could begin in December 2026. Apple's AI smart glasses are expected to be fitted with built-in speakers and cameras, depending on Siri for AI capabilities. These aren't likely to receive a display like the Meta Ray-Ban Display specs have, as displays should be reserved for Apple's AR glasses predicted to arrive by 2028. As with its other AI wearables, the specs will connect with a user's iPhone, which will handle the processing. With its built-in cameras, speakers and microphone, the glasses will be able to take and receive phone calls, snap photos and record video. On top of that, you'll be able to play music and use Visual Intelligence to look at objects, read text and get information about them. Plus, it's expected to have live navigation and translation. We're expecting high-resolution cameras in the Apple AI smart glasses for pictures and video, along with one dedicated to Visual Intelligence. As for design, as with the Ray-Ban Meta glasses, Apple is tipped to deliver a range of sizes and styles, but it's uncertain what these could be. According to the report, the AirPods with cameras are real, but are at an earlier level of development compared to the AI pendant. We've been hearing rumors of these for a few years, and Apple may finally be taking the plunge to launch these AI-powered earbuds. Again, this AI wearable will come with cameras, but they will be used to record information about the user's surroundings rather than capture photos. From previous rumors, these would be IR (infrared) cameras, offering advanced gesture controls and support for Siri. There's still a lot to learn about these AI wearables and how they could support other Apple products, but it sounds like they will take more advantage of Apple Intelligence and Siri features. 
With Apple pushing forward with its AI plans, we're now expecting to see AI-dedicated products come into the fold, and it's only a matter of time until we see if these devices are worth the wait.
[15]
Rumors Suggest Apple and Meta Are Betting Big on AI Wearables
These are all rumors as of now, but both companies seem to be betting big on AI wearables. The next generation of Meta's Display smart glasses might come with a smart watch. According to a report from The Information, Meta's watch, codenamed Malibu 2, could feature fitness tracking and AI, but its real purpose is to replace the Display's neural band and act as a controller for the smart glasses. If the reports are accurate, Meta Display smart glasses with a smart watch could be available in 2026. There aren't any other details on the smart watch, so we don't know the price or what features it may have -- but I'd be surprised if this rumor doesn't pan out eventually. Meta has discussed the idea of a smart watch before, and it makes sense: If you're going to have a wrist-controller for your glasses, why not give it smart watch features as well? Especially if a glasses-and-watch combo potentially gives users a reason to switch away from their Apple Watches. Speaking of Apple, if the rumors about the company are true, Apple is pushing to release its own suite of AI-powered wearable devices. According to a report in Bloomberg, Apple could roll out smart glasses in early 2027. The company is also reportedly developing an AI-powered pendant that can "be pinned to a shirt or worn as a necklace," as well as AirPods with expanded AI capabilities. The AirPods and pendant will be equipped with cameras designed to "help the AI work" as opposed to taking photos. Apple's smart glasses will reportedly not feature a display, but will feature a higher end camera and superior build quality to Meta's smart glasses. All of Apple's devices are reportedly designed to work with iPhones. None of this is confirmed, of course. The closest Apple has come to announcing these plans is CEO Tim Cook mentioning "categories of products" enabled by artificial intelligence at an all-hands meeting. 
However, everything points to Meta and Apple betting that consumers want a collection of connected AI-wearables. Each company is taking a different approach to hooking users into their ecosystem. Apple seems to be betting on devices integrated with iPhones and controlled with the kind of camera-based tech that powers the Apple Vision Pro headset. Meta seems to be aiming at replacing phones with an in-glasses display, and a biometric control scheme that works with muscle movements, like the existing neural band. Both Meta and Apple seem to be competing to go beyond a screen or smart glasses to become the next interface for your life -- but do people want that? Are consumers excited enough by the prospect of always-available AI and tied-together devices to buy them? That's the big question, and the answer is anything but certain. Both Apple and Meta have made big bets on virtual reality, and, despite both companies' VR devices being excellent, neither seems to have captured the market in the way these firms would have liked. So, as they say, stay tuned.
[16]
Apple racing to launch an AI pendant to serve as your iPhone's eyes and ears
The AI hardware industry is moving fast, and it seems Apple wants in on the trend. As per a report by Bloomberg, the company is working on an AI pendant that will come equipped with a camera and might serve as the "eyes and ears" for a paired iPhone. In addition to the pendant, the company is also said to be working on AI-powered smart glasses and AirPods with an onboard camera. What's the big shift? Apple's upcoming AI wearable will reportedly miss out on a display, unlike the doomed Humane AI Pin, which was actually the brainchild of former Apple executives. "It's also designed to rely heavily on an iPhone for processing. Though it has a dedicated chip, the system is closer in computing power to AirPods than an Apple Watch," reports Bloomberg. Apple's team is still debating whether to equip the AI pendant with an onboard speaker. With a built-in speaker, users will supposedly be able to hold back-and-forth conversations with the device. And since it comes equipped with a camera with AI-enabled world understanding (aka computer vision), it should work somewhat similarly to the Visual Intelligence feature that is already available on iPhones as part of the Apple Intelligence package. The pendant will reportedly be similar in size to the AirTag, which makes sense. Numerous companies are building wearable AI devices, an expanding category that includes names such as Friend, Limitless, and NeoaSapien, all of which broadly serve as somewhat of a digital memory bank to collect your thoughts. What comes next? Apple's AI necklace might hit the shelves early next year, as per Bloomberg. "Apple is working to allow users to wear the AirTag-sized pendant in two primary ways: with a clip that can attach to clothing or via a necklace that can be placed through a hole inside the hardware," the outlet reports. 
The AI wearable plans come at a time when Apple is ramping up its AI efforts, after nearly two years of lagging behind with Siri, while the likes of Gemini and ChatGPT raced far ahead. Earlier this year, Apple inked a deal with Google that would allow the company to use Gemini's underlying infrastructure for an "AI brain transplant" to enhance Siri with a heavy dose of customization. In addition to AI wearables, Apple is reportedly working on a tabletop robotic device, as well. Looking at the competition, OpenAI is also gearing up to launch an AI wearable device, designed in partnership with a firm started by legendary Apple designer, Jony Ive. Meta, on the other hand, acquired the company behind the Limitless AI wearable.
[17]
Apple's N50 smart glasses targeting 2027 launch
Apple is reportedly accelerating the development of three AI-powered wearable products. These devices include an AI pendant, AI-powered smart glasses, and AirPods with new AI capabilities. The information was reported by Bloomberg, following an earlier report from The Information. The AI pendant is described as being AirTag-sized and featuring cameras. It can be pinned to a user's shirt. The smart glasses are code-named N50. They will include a high-resolution camera. Production is targeted to begin as early as December for a public release in 2027. Meta and Snap (which plans to release "Specs" this year) are also developing similar products in this market. The glasses are characterized as "more upscale and feature-rich" compared to the AI pendant and AirPods. All three items are designed for connection to the iPhone. Siri, Apple's virtual assistant, will be a central component of the user experience for these wearables.
[18]
Apple's AI Pendant Said to Use In-House Visual Intelligence Models
Apple's AI wearables could be able to provide context-based results Apple is developing new AirPods models with cameras, while also foraying into new AI-powered wearable categories, a report recently highlighted. The company is said to be working on an AI-enabled pendant that could be worn around the neck, which might feature a suite of sensors and cameras to answer user questions based on their surroundings. Now, Bloomberg journalist Mark Gurman has shared more details about the AI models that will power these wearables. The Cupertino-based tech giant is reportedly working on its own visual models for the upcoming devices. First introduced with the iPhone 16 Pro series in 2024, Visual Intelligence is part of the Apple Intelligence suite of tools. Apple's AI Pendant, Smart Glasses Could be Able to Take Context-Based Action According to the latest edition of Gurman's Power On newsletter, the Cupertino-based tech giant is working on its own AI visual models to enable the Visual Intelligence features on the rumoured AI pendant, AI smart glasses, and AirPods model with cameras. This will enable the wearables to provide environment-based answers to users and take context-based actions. Gurman adds that Apple intends to make Visual Intelligence and visual models integral to its upcoming wearables. Coming to Apple's purported AI pendant, the device will reportedly be worn around the neck or body. It is said to be equipped with a suite of sensors and "computer vision cameras", allowing it to answer questions such as what is on a user's plate. Released with the iPhone 16 Pro lineup in 2024, Visual Intelligence currently leverages multimodal capabilities of OpenAI's ChatGPT to generate results. However, Apple's new visual models could change this. Additionally, Gurman claims that Apple has shelved the idea of launching the Apple Watch model with cameras. 
This comes soon after Apple CEO Tim Cook, in an internal meeting, revealed that the tech giant plans to launch new AI-powered product categories and services. However, the launch timeline, names, and other details about these unspecified devices remain under wraps. Recently, a report highlighted that Apple's purported AI smart glasses will add to the company's smartphones' functions. The new wearables are said to be thin and light, which Apple will reportedly achieve by offloading AI processing to the paired iPhone models.
[19]
Apple Is Racing to Build Smart Glasses, AI AirPods and a Pendant You Wear Around Your Neck
Bloomberg reports the company is accelerating three camera-equipped wearables to compete with Meta and OpenAI -- with production starting as early as December. Apple seems to be betting its future on wearables. The company is accelerating development of three camera-equipped devices, according to Bloomberg. First, eyewear with high-resolution cameras targeting production as early as December for a 2027 release. Code-named N50, it will compete with Meta's glasses and let users make calls, access Siri, and take actions based on surroundings. Second, camera-equipped earbuds planned for as early as this year. The cameras let Siri see what you're looking at, not capture images. Third, a clip-on pendant reminiscent of the failed Humane AI pin but designed as an iPhone accessory -- some Apple employees call it the "eyes and ears" of the phone. Read more
[20]
Apple's AI pendant could be the 'eyes and ears' of the iPhone | Stuff
The report in January that Apple is planning an AirTag-like wearable AI pendant came from a credible source, but also out of left field. Now the report has been corroborated by a noted Apple watcher. In a wider update about Apple's wearable ambitions, Bloomberg's Mark Gurman has an update about the in-development device, which he says Apple employees are calling the "eyes and ears" of the iPhone. There's no display or projector, unlike the failed Humane AI pin, and it'll rely on the iPhone for all of the heavy lifting. So the report from The Information that it's effectively an AirTag-like device with additional cameras and a mic appears to be on the money. According to this latest report, it'd be worn around the neck on a necklace and may have a speaker so users could conduct phone conversations without having to grab the iPhone - sort of like AirPods do now. "The pendant would essentially serve as an always-on camera for the smartphone that also includes a microphone for Siri input. Some Apple employees call it the "eyes and ears" of the phone," Gurman's report on Tuesday says. "One area of debate for the product has been whether or not to include a speaker, which would allow users to hold back-and-forth conversations with the device directly. That means they could leave their iPhone in their pocket or bag or not wear AirPods." Gurman says the product is still in the early stage and could yet be cancelled. If Apple decides this un-Apple-like product is a goer then it might arrive as soon as next year, alongside Apple's first smart glasses. Gurman's report also says Apple is progressing well on its efforts to place cameras inside AirPods.
[21]
Did Tim Cook hint at Apple developing AI glasses and camera wearables? What we know so far
Apple is reportedly gearing up for a new era of AI wearables. CEO Tim Cook's recent comments suggest a strong push towards devices leveraging Visual Intelligence. This technology, which analyses a user's surroundings, could power future AirPods, smart glasses, and even a pendant. Apple is usually tight-lipped about its product launches or generally anything that fans can expect from the tech giant. However, CEO Tim Cook seems to be dropping hints about the category of products that are next in line in the company's portfolio. According to Bloomberg's Mark Gurman, Cook is suggesting that Apple is pushing towards AI wearables, built around Apple's Visual Intelligence feature that scans the user's environment for context to take action -- sort of how Meta's AI glasses work. On Apple's recent earnings call, he singled out Visual Intelligence as "one of our most popular features". "One of our most popular features is Visual Intelligence, which helps users learn and do more than ever with the content on their iPhone screen, making it faster to search, take action and answer questions across their apps," he said. In a company-wide meeting, he also reportedly doubled down, saying Apple "unquestionably" has a "huge advantage" in AI, pointing to its 2.5 billion-device installed base and again spotlighting Visual Intelligence. What's interesting is that this is not the first time Cook has dropped breadcrumbs before a major product launch, which makes this theory more plausible. Back in 2013, years before the Apple Watch was unveiled, Cook said the "whole sensor field is going to explode." At the time, Apple was quietly building what would become its first major new product category, hiring aggressively from the medical sensor industry.
The watch didn't launch as a medical lab on your wrist, but over time Apple added heart-rate monitoring, ECG, hypertension detection and sleep apnea alerts. The company is still reportedly working on noninvasive blood glucose monitoring. He did something similar ahead of the Vision Pro. For years, Cook talked up augmented and virtual reality, even saying in 2016 that AR would one day be as essential as eating three meals a day. Apple eventually spent billions developing Vision Pro, unveiling it in 2023 and launching it in 2024. The device hasn't become mainstream yet. But it is clear that when Cook repeatedly emphasises a technology, it's rarely accidental. The company first rolled out Visual Intelligence on the iPhone 16 Pro in 2024 under Apple Intelligence, before expanding it to other devices. Right now, the feature lets users snap a photo or take a screenshot and ask questions about it via OpenAI's ChatGPT. You can also run a reverse image search through Google if you're trying to identify something. Gurman, in his latest newsletter, notes that Apple is developing its own visual models and intends to build this capability into a new crop of AI-first hardware. That includes upgraded AirPods, smart glasses and even a pendant-style wearable equipped with cameras and sensors that can sit around your neck. Earlier plans to add cameras to the Apple Watch have reportedly been put on hold. At the basic level, think identifying what's on your plate -- breaking down ingredients and items from a quick snap. At a more advanced level, the device could guide you through tasks based on what it sees. Instead of telling you to walk 200 feet and turn right, it could tell you to go past that cafe and turn at the red building. It could also nudge you with reminders when you approach a specific object or place.
[22]
Apple's Secret AI Wearables: Everything We Know About the New Glasses, Pin, and AirPods
Apple is reportedly developing a new lineup of AI-powered wearable devices, including AR glasses, an AI communication pin, and AI-enhanced AirPods. These devices are designed to transform how you interact with technology by using artificial intelligence to deliver smarter, more intuitive experiences. With features such as contextual awareness, voice command integration, and seamless compatibility with Apple's ecosystem, these wearables aim to enhance productivity, connectivity, and convenience in your everyday life. By embedding advanced AI capabilities into these devices, Apple is setting the stage for a future where technology adapts to your needs in real time. Apple's AR glasses, anticipated to launch by 2026, represent a significant step forward in combining augmented reality (AR) with advanced AI functionality. These glasses are expected to act as an extension of your Apple devices, allowing you to perform tasks like identifying objects, receiving real-time notifications, or issuing voice commands through Siri. Imagine walking into a room and asking the glasses to identify a painting or provide reminders about your upcoming schedule. While initial models may lack a heads-up display (HUD), future iterations could incorporate this feature, allowing digital overlays to seamlessly integrate with your physical surroundings. This would enable you to view directions, notifications, or other digital content directly in your field of vision. Additionally, Apple is likely to offer prescription lens options, ensuring accessibility for users with vision needs. Other potential features, such as hands-free video recording, could make it easier to capture moments or document your surroundings effortlessly. The AR glasses are expected to integrate seamlessly with Apple's ecosystem, including iPhones, iPads, and Macs, making them a central tool in your digital life.
Whether you're navigating a new city, managing your daily tasks, or exploring creative projects, these glasses could redefine how you interact with both the digital and physical worlds. The AI communication pin is envisioned as a versatile and compact wearable designed to assist with information capture and organization. Equipped with advanced audio recording capabilities, this device could summarize conversations, lectures, or meetings, making it particularly valuable for students, professionals, and anyone who needs to keep track of important details. For example, during a brainstorming session, the pin could record key points and generate a concise summary for later reference. A built-in camera might further enhance its functionality, allowing you to take quick snapshots for visual note-taking or to support contextual queries. Whether you're capturing a diagram during a lecture or snapping a photo of a product label for research, the AI communication pin could streamline your workflow. This wearable is expected to sync effortlessly with other Apple devices, allowing you to organize and access your data with ease. Its compact design and AI-driven capabilities make it a practical tool for managing both work and personal tasks. Whether you're attending a conference, planning a project, or simply organizing your day, the AI communication pin could serve as a reliable and efficient assistant. Apple's AI-enhanced AirPods aim to redefine audio experiences by incorporating environmental awareness and AI-driven sound optimization. Using built-in sensors or cameras, these AirPods could analyze your surroundings and automatically adjust noise cancellation to suit specific environments. For instance, they might reduce background noise in a bustling café or enhance clarity in a quiet library, ensuring optimal sound quality wherever you are. Beyond audio enhancements, these AirPods could assist with a variety of tasks.
For example, they might record meeting notes during virtual calls, provide real-time translations, or answer contextual questions about your surroundings. Imagine asking your AirPods for information about a nearby landmark or requesting a summary of a recent conversation. By integrating these advanced features, Apple aims to make AirPods an indispensable tool for both work and leisure. These AI-enhanced AirPods are also expected to maintain seamless compatibility with Apple's ecosystem, allowing you to switch effortlessly between devices like iPhones, iPads, and Macs. Whether you're listening to music, participating in a virtual meeting, or exploring new places, these AirPods could elevate your audio experience to a new level of intelligence and convenience. Apple's AI-powered wearables reflect the company's commitment to pioneering innovation and enhancing the user experience. By integrating artificial intelligence with innovative hardware, these devices aim to make Apple's ecosystem more intuitive, efficient, and user-friendly. Each wearable offers unique benefits that cater to different aspects of your daily life. As these devices evolve, they are poised to play a pivotal role in shaping the future of wearable technology. By offering tools that enhance productivity, connectivity, and user experience, Apple's AI-driven wearables could redefine how you engage with the world around you. With their seamless integration into Apple's ecosystem and their potential to adapt to your unique needs, these devices represent a bold step forward in the evolution of smart technology.
[23]
Apple Is Working on an AI Pendant, AI Smart Glasses and AirPods with Cameras
Apple is also working on AI smart glasses as well as AirPods with some sort of AI capabilities. Apple seems to be working on a trio of AI-powered wearable devices, which includes an AirTag-sized pendant, its own AI smart glasses, and AirPods with improved AI functionality. The Cupertino giant is accelerating the development of all three of these products to stay in the competition with brands like Meta and OpenAI. This news comes from a report by Bloomberg's Mark Gurman, who mentions that the Cupertino giant is focusing on building its wearable suite of devices. Starting with an AI pin, which will include a camera. It can be pinned to a shirt or worn as a pendant. As if the failure of the Humane AI pin wasn't much of a lesson. Apple is also working on AI smart glasses that will take on Meta's Ray-Ban and upcoming glasses by Snap called Specs. These smart glasses will feature high-resolution cameras. However, there is no word on whether these will include a built-in display like the Meta Ray-Ban Display or not. Apple has been working on a chip for these spectacles since last year. And according to Bloomberg's report, the company plans to start production by December 2026. This is ahead of the 2027 public release schedule. There are also rumors about AirPods with improved AI capabilities. Likely featuring a camera, but not much is known at this point. Reportedly, all these features will be centered around the new Siri built with Google Gemini. These three devices will be able to connect with your iPhone, offering a much more seamless user experience. The reception of these three AI devices will depend on how much Apple improves Siri over its current iteration.
[24]
Apple Working On An AirTag-Sized AI Pendant As The Race For AI-Enabled Consumer Devices Heats Up
Apple intends to complement its upcoming camera-equipped AirPods Pro with a dedicated AI pendant, with both devices expected to serve a niche segment within the overall wearable AI segment, one where subtlety seemingly reigns supreme, unlike the more in-your-face smart glasses. We noted recently that Apple was developing a dedicated AI pin, with initial sales expectations set at 20 million units. The wearable AI pin would presumably come equipped with multiple cameras, a speaker, microphones, and wireless charging. Now, Bloomberg's Mark Gurman has clarified that the AI pin is more like an AI-enabled pendant around the size of an AirTag. As per additional details, Apple is designing its AI pendant as an accessory to the ubiquitous iPhones. Essentially, the device would serve as an always-on camera, replete with a microphone to act as a conduit for channeling commands to Siri - the Cupertino giant's bespoke voice assistant. It is hardly a surprise, therefore, that some Apple employees have dubbed the planned AI pendant the "eyes and ears" of the iPhone. Of course, unlike the Humane AI Pin, Apple's AI pendant would not sport a projector or a display system. It would also rely heavily on its iPhone counterpart for most processing despite sporting an onboard chip, albeit comparable in power to the one that is housed within the AirPods or the Apple Watch. Even so, Apple engineers are still undecided whether to equip the AI pendant with dedicated speakers. Apple is currently exploring two ways of allowing users to "wear" the device: either a clip that attaches the pendant to one's clothing or a dedicated necklace. Of course, Gurman prefaces this information by noting that the project is still in its nascent stage and could get terminated.
[25]
Apple Said to Be Exploring AI-Centric Wearables Beyond iPhone
Camera-equipped AirPods are also reported to be in development. Apple is said to be developing a new wave of AI-centric hardware. According to a seasoned journalist, the Cupertino-based tech giant's lineup of artificial intelligence (AI) devices includes smart glasses, camera-equipped AirPods, and a wearable pendant device. Apple is said to be exploring new form factors that extend beyond the iPhone and Vision Pro. While the company has yet to confirm these plans, the projects are reportedly in active development, with early prototypes already being tested internally. According to Bloomberg's Mark Gurman, the first device in Apple's AI lineup is a pair of lightweight smart glasses that could eventually complement or succeed the iPhone's functions. Unlike the high-end Vision Pro headset that is geared towards developers, the glasses are expected to be more consumer-friendly. Apple's smart glasses project is reportedly internally codenamed "N50." Instead of being a standalone device, the tech giant's N50 glasses are said to connect wirelessly to an iPhone, offloading compute-intensive tasks while the glasses handle data capture and lightweight functions. The report claims that early prototypes include cameras and microphones that feed environmental data into Apple's AI systems, enabling features such as object recognition, contextual prompts, and live assistance. Beyond smart glasses, the tech giant is also said to be working on AirPods equipped with outward-facing cameras. The purported earbuds would gather visual data about the user's surroundings, integrating with Apple's AI models to improve spatial awareness and context-based interactions. Gurman said that the concept could enable gesture controls, environmental understanding, and more proactive Siri responses. This aligns with previous leaks that hinted towards a possible launch of AirPods with embedded cameras.
Per reports, the "next AirPods Pro can see around you", and the product could launch at the same price as the current model. The last experimental product in development is a wearable AI pendant. The device, reportedly designed to be worn around the neck, is reported to include cameras and sensors that are capable of continuously analysing the user's environment. Similar to other AI-first wearables emerging in the market, the pendant would rely on advanced on-device processing and cloud-based AI to deliver real-time insights and assistance, potentially serving as a digital companion. The report notes that these initiatives are part of Apple's long-term strategy to reduce reliance on the iPhone by introducing new categories of AI-powered devices. None of the reported devices, however, has officially been confirmed. More details about these projects could emerge in the coming years.
[26]
Apple smart glasses sounding ready to go head-to-head with Meta | Stuff
Apple has made "significant progress" on its smart glasses. It all appears to be coming together for a launch in 2027. Apple is reportedly "accelerating" work on its next generation of wearables arriving as soon as 2027... and there isn't a screen in sight. Bloomberg's Mark Gurman reports the three products include the next-generation AirPods Pro with cameras for contextual awareness, the company's first ever smart glasses, and the previously reported AI pendant. The products are all going to be "built around the Siri digital assistant" Gurman says, but we're yet to see the fruits of the rebirth that would make Siri a suitable frontman for such devices. Let's start with the smart glasses, which Apple has made "significant progress on" in recent months, according to Gurman's sources. They'll have a pair of camera lenses - one for taking images and videos and another for the "computer vision" that'll give the specs environmental context. Visual intelligence for your eye line. "The goal is for the glasses to function as an all-day AI companion, capable of understanding what a user is seeing and doing in real time," Gurman adds. According to the reporter, Apple has finalised an in-house frame design having decided against partnering with existing manufacturers. Apple has also found a way to keep the internal components and the battery inside the frame. Gurman adds: "Early prototypes of the glasses connect via a cable to a standalone battery pack and an iPhone, but newer versions have the components embedded in the frame. The design uses high-end materials, including acrylic elements intended to give the glasses a premium feel. Apple is already discussing launching the device in additional styles over time."
[27]
Apple Equipping The Next AirPods Pro With A Camera To Take On Meta's Dominance In The Wearables Segment
Apple is equipping a range of upcoming devices, including the next AirPods Pro, with a camera to imbue them with incremental AI features, thereby hoping to halt Meta's relentless ascendancy in the wearable AI segment. According to Bloomberg's Mark Gurman, Apple has been working for quite a while on an innovative design for its AirPods Pro, aiming to equip the bespoke earbuds with a camera. According to Gurman, the new AirPods Pro might launch as early as this year, and complete a troika of AI-powered devices, which include smart glasses and an AI pendant, to take on Meta's early lead with its Ray-Ban smart glasses. Do note that the prolific Apple-geared analyst, Ming-Chi Kuo, tipped all the way back in 2024 that IR camera-equipped Apple AirPods Pro were in the offing. Moreover, Apple received a patent in July 2025 for leveraging cameras - akin to Face ID's dot projector - in proximity detection and 3D depth mapping use cases, which obviously has tantalizing applications for the rumored earbuds. As we noted recently, a neat little theory has made a connection between Apple's acquisition of Q.ai and the rumored IR-equipped AirPods Pro, positing that the IR camera would allow the new earbuds to accurately grasp the user's silent speech/whispers by analyzing micro facial movements, leveraging Q.ai tech in this process.
[28]
Apple ramps up AI hardware plans with smart glasses, wearable pendant and upgraded AirPods: Report
Pendant and upgraded AirPods may arrive earlier, possibly from 2026. Apple is reportedly accelerating work on three new wearable devices, viz., smart glasses, a clip-on pendant and camera-equipped AirPods, as part of a broader push into artificial intelligence-powered hardware. According to a Bloomberg report citing people familiar with the matter, the products are being built around Siri and designed to interpret visual context using integrated cameras. Apple hasn't verified the plans, but the development signals Apple's attempt to reposition itself in the fast-evolving AI hardware race, where companies such as Meta Platforms Inc. and OpenAI are already investing in next-generation wearables. Apple's most advanced project is a pair of smart glasses, internally code-named N50. Unlike mixed reality headsets like the Vision Pro, these glasses are not expected to include a display. Instead, they will rely on speakers, microphones and two camera systems, one for high-resolution capture and another dedicated to computer vision. The glasses will connect to the iPhone and use Siri to process real-world input. Users could look at an object and ask what it is, scan printed text to add events to a calendar, or receive navigation cues based on visible landmarks. Production could begin as early as December 2026, and a public release is possible in 2027. Apple is said to be focusing on premium build quality and camera performance to differentiate itself from competitors like Meta's Ray-Ban smart glasses. Apple is also said to be developing a smaller wearable device that can be clipped onto clothing or worn as a necklace. It is internally described by some employees as the 'eyes and ears' of the iPhone, and the device would include a camera and microphone to provide always-on contextual input to Siri. The Apple pendant would be different from the now-defunct Humane AI Pin.
It would not include a display or projector, but rather rely on the iPhone for processing. The product remains in early development and could still be cancelled. If approved, it may launch as early as next year. Finally, the report adds that Apple is also preparing upgraded AirPods with camera sensors to enhance AI functionality. These earbuds are expected to assist Siri with spatial awareness and contextual understanding, rather than serve as photography tools. Apple has already introduced AI features such as live translation in recent AirPods models. The next iteration would deepen integration between audio input, environmental awareness and on-device intelligence. Its launch could happen as early as this year. So, like the broader industry, Apple is also accelerating its AI hardware plans. While iPhone sales remain strong, AI is expected to move more interactions away from screens and into ambient devices. Meta has already found early traction with its AI glasses, and OpenAI is reportedly working on new AI-native hardware. But unlike those brands, Apple can leverage ecosystem control and keep users locked into its services. But Apple may be more careful, as the Vision Pro headset, despite its technical strengths, struggled to achieve mass-market adoption due to pricing and use-case limitations. Apple is believed to have scrapped a lighter, cheaper headset variant to focus resources on smart glasses instead. Let's hope Siri and Apple Intelligence are up to snuff, and the company has found meaningful use cases for everyday use and addressed the privacy concerns. We may officially learn more about the Siri revamp later this year and get hints about Apple's hardware plans at the next WWDC.
Apple is ramping up development of three AI-powered wearables: smart glasses with dual cameras, AirPods equipped with infrared sensors, and a camera-enabled pendant. All three devices will rely on Visual Intelligence and deeper Siri integration, targeting releases between late 2026 and 2027 as Apple enters the AI hardware race against Meta and Google.
Apple is accelerating development of a trio of AI wearables that could reshape how users interact with artificial intelligence in their daily lives. According to Bloomberg reporter Mark Gurman, the company is ramping up work on smart glasses, AirPods with cameras, and an AI pendant, all built around deeper Siri integration and the company's Visual Intelligence feature [2]. The move signals Apple's determination to compete in the AI hardware race alongside Meta, Google, and OpenAI, even as the company plays catch-up in the broader AI landscape.
During an all-hands meeting earlier this month, CEO Tim Cook hinted at the company's ambitious plans, telling employees that Apple is working on new "categories of products" enabled by artificial intelligence [2]. "We're extremely excited about that," Cook said, adding that "the world is changing fast." The push into AI-powered wearables comes as iPhone sales remain robust, but the company faces mounting pressure to demonstrate innovation beyond its core product lines, particularly after the pricey Vision Pro headset failed to resonate with consumers.
The smart glasses represent the most advanced offering in Apple's planned AI wearables lineup. Code-named N50, the glasses have made significant progress in recent months, with the company distributing broader sets of prototypes within its hardware engineering division [2]. Apple is targeting the start of production as early as December, ahead of a public release in 2027.
Unlike the augmented reality glasses Apple has long been rumored to develop, these AI-powered wearables will feature no display, instead relying on speakers, microphones, and camera sensors for interaction [1]. The design philosophy mirrors Meta Ray-Bans, which have become a hit since launching in late 2023. However, Apple aims to differentiate through two key areas: build quality and camera technology.
The glasses will incorporate two camera lenses: one high-resolution sensor for capturing photos and video, and another dedicated to computer vision, similar to technology used in Vision Pro [2]. This second sensor is designed to give the device environmental context, helping it more accurately interpret surroundings and measure distances between objects. Early prototypes connected via cable to a standalone battery pack and iPhone, but newer versions have components embedded directly in the frame using high-end materials, including acrylic elements intended to give the glasses a premium feel.
Apple initially considered partnerships with popular eyewear brands, following the industry trend set by Meta's collaboration with EssilorLuxottica and Google's partnership with Warby Parker. However, the company recently decided to develop its own frames in-house in a variety of sizes and colors, leveraging its retail stores where customers could try on the glasses and get them sized [1].
Visual Intelligence, Apple's version of computer vision, will serve as the defining feature across the company's AI wearables lineup [5]. First introduced on the iPhone 16 Pro and available from the iPhone 15 Pro and up, the feature allows users to analyze on-screen content and the world around them through the camera. It can recognize objects, offer information about them, answer questions, enable real-time translation, and conduct instant web searches. During Apple's earnings call, Tim Cook heavily emphasized Visual Intelligence, calling it "one of our most popular features" that "helps users learn and do more than ever with the content on their iPhone screen" [5]. The goal is for the glasses to function as an all-day AI companion, capable of understanding what a user is seeing and doing in real time. Wearers could look at an object and ask what it is, inquire about ingredients in a meal, or get assistance with everyday tasks [2].
Apple is also exploring more advanced uses, such as reading printed text and converting it into digital data. The technology could provide upgraded turn-by-turn directions, telling users to go past a specific landmark rather than just a certain number of feet [3]. Currently, Visual Intelligence relies on OpenAI's ChatGPT for results, though Apple is reportedly working on its own AI visual models that could coincide with the eventual release of Siri 2.0 in 2026.
A potentially breakthrough element comes from Apple's recent $2-billion acquisition of Q.ai, a startup specializing in machine learning systems for interpreting silent voice input [4]. The company also researched systems for interpreting micro facial movements, enabling them to understand speech without it being audible at all. If successfully integrated, this technology could address one of the biggest practical limitations of voice-assistant-powered wearables: the need to speak audibly in public or quiet environments.
The AirPods update expected later this year represents Apple's first piece in the next wearable puzzle. The AirPods Pro 3 design will add infrared cameras focused on gesture tracking rather than taking pictures [1]. The cameras should work similarly to Vision Pro's near-range hand tracking and will likely function in the dark. These AirPods with cameras are envisioned as simpler offerings, equipped with lower-resolution sensors designed to help the AI work rather than for capturing photos or videos [2].
If these arrive in 2026, they'll push out some AI functions and allow simple hand gestures that could control music or interact with workouts. The gesture tracking technology could also serve as a testing ground for capabilities that might eventually land on the smart glasses.
Gurman emphasizes that a camera-enabled AI pendant is also in the works, with a potential release in 2027, though there's also a chance this device might not happen at all [1]. The pendant, which could be pinned to a shirt or worn as a necklace, sounds reminiscent of the failed Humane AI Pin but would focus on being an iPhone accessory rather than an independent device [2].
Most other tech companies have been gravitating toward pendants and pins lately, as evidenced by announcements made at CES in January. Pins could be more versatile to wear and might serve as a hedge in case consumers don't embrace glasses. The report sounds less certain on the pendant's arrival than on the glasses and new AirPods.
Apple's accelerated push into AI-powered wearables comes as the competitive landscape intensifies rapidly. Meta Ray-Bans have already become a hit, demonstrating consumer appetite for camera-equipped eyewear despite privacy concerns [4]. Google is preparing its own lineup of glasses powered by Gemini, with Google I/O scheduled for late May potentially serving as the launch platform [1]. OpenAI is also developing a series of devices, including wearables, with the help of ex-Apple design chief Jony Ive and other former Apple executives.
Apple's recent deal with Google to have Gemini power its next wave of Apple AI always suggested wearables, because Gemini's camera-enabled and live modes are key to Google's upcoming wave of glasses [1]. Apple could be using Gemini to leap ahead in camera and context-aware live AI functions that could make these new wearables work.
However, questions remain about whether Apple's approach represents genuine innovation or simply catching up to competitors. As Gizmodo notes, the Visual Intelligence use cases described (identifying food ingredients, providing navigation instructions, offering task-specific guidance) are already available in Meta Ray-Bans and other smart glasses [3]. Computer vision remains one of the least reliable features of current smart glasses, often getting details wrong and making it difficult to trust for daily use.
The longer-term implications extend beyond individual devices. AI is expected to change the way consumers use phones, with more activities shifting to peripherals. Apple is looking for a breakthrough with its accelerated push into wearable devices, aiming to keep users locked into the Apple ecosystem [2]. Down the road, Apple could find a way to blend a lower-cost Vision headset in a more glasses-like form with these types of smaller wearables, or the company might be figuring out ways to explore new interfaces on wearables first before solving for a new wave of wearable displays beyond Vision Pro.
Odds are consumers won't hear anything about these plans until at least Apple's WWDC developer conference in June, though the company could provide an advance preview similar to what it did with Vision Pro and the original Apple Watch [1]. With Google potentially readying its glasses lineup soon, the race to define the next generation of AI wearables is accelerating rapidly.