4 Sources
[1]
The Future Beyond Meta Quest and Vision Pro Is Coming in Glasses Form. What Will VR Look Like Then?
Meta CEO Mark Zuckerberg recently declared that the gadget of the future is AI-infused glasses. Zuckerberg and Meta have been interested in this type of device for years, but the focus has clearly been shifting lately from VR on our faces to glasses on our faces. Meta's not alone here: Apple sees a future in AR. Google does too. So does Samsung. The list of players goes on and on. As VR shifts toward mixed reality, smart glasses promise augmented reality features to come, and AI evolves more features that see what we can see, the battle for facial gadgets is coming to a crossroads.

As a reviewer of augmented and virtual reality gadgets, I've found it pretty easy over the last few years to pick a headset to recommend. Meta's Quest headsets are the best for the price in my tests, and Meta has added next-level extras like mixed reality and hand tracking, plus practical features like the ability to work both standalone and with PCs. But the certainties of the VR space are in flux. AR and smart glasses are coming fast; I've already seen it in pieces.

It's clear that, entering late 2025, the AR/VR hardware landscape is shifting. I expect Meta's Quest 3 and 3S to continue their reign as my favorite overall headsets through the end of this year, but new challengers will arrive soon, and in forms that won't even feel like what came before. Samsung and Google are expected to finally release their first mixed reality device, called Project Moohan, ahead of a wave of glasses coming in 2026. Apple could have an updated version of its Vision Pro with a new, more powerful graphics chip onboard, along with support for controllers and other accessories. And smart glasses will push extra features that compete for time on our faces, introducing AI functions in new ways. It's already happening.

I believe Meta itself could play the biggest wild card, however. My demos last year of Orion, a prototype pair of AR glasses, could be the prelude to Meta making a new pair of high-end smart glasses this year with a display and a gesture-control wristband -- a starting step toward that Orion game plan. When it comes to its next VR headset, meanwhile, Meta could pivot to a smaller design and, possibly, higher-end hardware made by third-party manufacturers. Finally, Valve -- long dormant in VR since releasing the Index headset back in 2019 -- has been rumored to be readying a new headset that could be both standalone and PC-connected. Will that headset emerge in 2025 or later? While Valve's hardware likely won't be AR-focused, it could easily redefine the VR gaming space for the next few years. Here's the tech I'm keeping both eyes out for in the last months of 2025.

Last December, I demoed the Moohan headset and saw something I hadn't experienced anywhere else: onboard AI that could comment on and analyze the things I was seeing on-screen. Moohan is the long-awaited first product for Google's Android XR platform. Built by Samsung, it will act as a starting vision for where Google's next AR/VR products could be headed. A lot of Android XR looks to be glasses-focused, with AI glasses partnerships already announced for next year, a display-enabled tethered device made by Xreal, and Samsung's plans for full AR glasses waiting in the wings.
In the meantime, Project Moohan -- Samsung's name for the headset for now -- is a high-end VR headset with mixed reality that looks to demonstrate how Google Play apps will run in VR, and how Gemini AI will work in a headset. We know Moohan is coming out in 2025, but when exactly -- and how much it'll cost -- remain a mystery. It looks every bit as premium as Apple's $3,500 Vision Pro, with a high-resolution display, eye tracking and a tethered battery pack. Moohan will be a sign of where Google and Samsung's ambitions are heading in this space, even if it's not necessarily affordable. It could also push both Apple and Meta to advance their work on onboard AI in their own headsets.

Meta often comes out with hardware surprises around Connect, its September developer conference that I attend every year. In 2024, Meta unveiled the Quest 3S and gave me an advance look at prototype AR glasses called Meta Orion. This year, based on reports, Meta might skip VR entirely and emphasize glasses once again. Meta's Ray-Bans have been a hot seller, with partner EssilorLuxottica reporting that smart glasses revenue tripled over the last year. Zuckerberg made glasses a big focus last year, wearing Ray-Bans and Orion glasses on stage during his Connect keynote. And in July, Meta introduced a new Oakley Meta HSTN smart glasses line with improved battery life and camera quality, which I've already test-driven on my face.

Reports say a higher-end set of smart glasses will be unveiled, possibly with a display good enough to show notifications and even play games. These not-quite-AR glasses will also come with a peripheral that could be key to everything that happens next: a neural-input wristband. When I tried Orion last year, I got to wear that wristband and use it to navigate the glasses with gestures. The wristband uses EMG (electromyography) sensors that pick up electrical signals at the wrist, registering small finger gestures that smart glasses can use as input. The end result is something like camera-based hand tracking, or Apple's gesture-based taps on the Apple Watch, except this type of band can read gestures out of camera range and with a wider range of fidelity than anything else available right now. I navigated Meta's Orion glasses with pinches, taps and thumb swipes, but Meta's research using this tech promises even more down the road.

Meta could get a foot in the door on neural tech by releasing this band now and finessing it over time. Zuckerberg told me back in 2022 that he sees the band as a universal input device, meaning it could eventually be used with VR headsets and even other devices, too.

Read more: my conversation with Andrew Bosworth

Smaller input wearables are needed for the smaller glasses to come, and could even be mixed into smartwatch designs. It looks like a missing link to glasses, one that companies like Google and Samsung have already acknowledged as a future direction.

Meta may not totally duck VR this year, though. A year ago, Meta announced hardware partners that were supposed to make additional Horizon OS Quest-alikes sometime soon. Lenovo and Asus are the first known partners, but no headsets have appeared yet. Maybe they'll be announced this year as spin-off, pro-like upgrades while Meta takes a year off from making its own VR hardware. Microsoft's Xbox-branded Quest, which finally went on sale this summer after being announced at the same time as those third-party headsets, could be a sign that the other news is around the corner.
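If you're curious how a band like that turns muscle activity into input, here's a minimal, purely illustrative sketch of the usual signal path: chop the EMG stream into short windows, extract per-channel energy features, and hand them to a small classifier. The channel count, window size and gesture labels below are my own assumptions for the sake of the example, not details of Meta's wristband.

```python
import numpy as np

# Illustrative EMG gesture pipeline (assumed: 8-channel band, 1 kHz sampling).
SAMPLE_RATE = 1000            # Hz, assumed
WINDOW = 200                  # samples per 200 ms analysis window
GESTURES = ["rest", "pinch", "tap", "thumb_swipe"]   # hypothetical labels

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per channel -- a classic, cheap EMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=0))

class NearestCentroid:
    """Toy classifier: label a window by the closest per-gesture centroid."""
    def fit(self, feats_by_label):
        self.centroids = {g: np.mean(f, axis=0) for g, f in feats_by_label.items()}
    def predict(self, feat):
        return min(self.centroids, key=lambda g: np.linalg.norm(feat - self.centroids[g]))

# "Calibrate" on fake feature data: one cluster per gesture.
rng = np.random.default_rng(0)
clf = NearestCentroid()
clf.fit({g: rng.normal(loc=i, scale=0.2, size=(50, 8)) for i, g in enumerate(GESTURES)})

# Classify one simulated 200 ms window of 8-channel EMG.
window = rng.normal(loc=2.0, scale=0.2, size=(WINDOW, 8))
print(clf.predict(rms_features(window)))   # prints "tap" (its centroid sits near 2)
```

Real systems swap the toy classifier for trained neural networks, but the shape of the problem -- features in, gesture label out, many times per second -- stays the same.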
I've been reviewing Apple products for years and was among the handful of journalists who first reviewed the Vision Pro, and even I have a hard time guessing where Apple is headed in the XR space. The company is tight-lipped about future plans, and while plenty of reports point to where Apple might be heading, the actual pace of evolution for the Vision Pro's features has been slow since it launched in early 2024.

Read more: The Vision Pro's future needs to hurry up

Reports say Apple is going to make a smaller headset and smart glasses, but the when is a mystery. Next year? The year after? Maybe the year after that? Reports keep shifting. In the meantime, it seems likely that Apple will update the Vision Pro with a new M-series chip -- either M4 or M5 -- sometime soon, along with a more comfortable head strap, according to reliable Bloomberg reporter Mark Gurman. That could happen this fall or early next year.

What we do know about Apple's Vision plans is that visionOS 26 will support spatial controllers and accessories for the first time, starting with the PlayStation VR2 Sense controllers and Logitech's wireless stylus. What we haven't seen yet are the apps and games that will work with those new inputs, and how many there will be. I've also already tried the updated Persona avatars and mixed reality widgets coming to visionOS this fall. A new M-series processor could allow better graphics for design and gaming apps on the headset. It could also open the door to deeper on-headset AI that uses the multiple cameras to see the world. Apple hasn't announced any plans for Visual Intelligence on the Vision Pro yet, but maybe the hardware pieces will start falling into place this year.

The most intriguing reports, and the ones that could shake up the space even further, point to a new Valve VR headset that could work both PC-connected and standalone. Project Deckard has been rumored for years, and Valve is still showing continued interest in VR despite not having released a new headset since 2019's Index. (Valve's Steam Link already lets Meta Quest headsets work well as PC headsets, too.) Some earlier reports have suggested Deckard could connect to a future Steam Deck portable game console, playing games on the go while tethered to the console to amp up performance. That would make sense, but Valve may not have a new Steam Deck for at least another year or more. Or Deckard could have Steam Deck-level power in its own onboard hardware.

If Deckard does appear and ends up being reasonably affordable -- a big if, since some reports suggest the price could be over $1,000, and the Index was expensive, too -- it could be a more attractive option than the Meta Quest for gaming. Beyond Meta, VR has struggled to offer great new gaming options lately. Sony's PlayStation VR2 tried to become a high-end VR gaming accessory, but Sony's inconsistent support for new games, and the headset's need to be tethered to a PS5, hamper its appeal. If Valve could actually make its next headset work as a self-contained, portable game device like the Steam Deck, it could shift the definition of what "PC VR" even means and where you can play. In the process, maybe, it could lay down ways for future, smaller headsets and glasses to interface with smaller deck-like PCs and tablets.

Or is the future just about glasses? So many companies, like Meta, seem to be pivoting to AI-enhanced wearables as a new model for gadget design.
Smart glasses are taking advantage of AI hype and function, but VR isn't yet. Meta's CTO, Andrew Bosworth, told me a year ago that VR is harder territory for generative AI than glasses at the moment, but that could be changing as more tools emerge for camera-equipped mixed reality headsets and generative AI evolves more models that work with 3D graphics and games. Smart glasses could be the first wave of AI wearables, with next-gen mixed reality headsets after that.

This year doesn't look to have all the answers. In fact, it's looking like a year of scattered pieces. But the final products of the year should help point to where the future's heading. And I can't wait to put them on my head for a closer look.
[2]
VR Is in a Really Bad Place Right Now and Smart Glasses Are to Blame
Remember VR? It's that thing where you put on a headset and... You remember VR, right? If you don't, you couldn't be blamed, because things have taken a turn since way back in 2023. Right now, VR seems to be on the back burner, cold, with the heat turned off, while its sister tech, XR/AR, is front and center, getting all of the chef's Michelin-starred attention.

That's not just a feeling, either; there are numbers to bear that trend out. Meta said last week that Reality Labs sales were down year over year, from $440 million in Q1 2024 to $412 million in Q1 2025. That drop, by Meta's own admission, is due in large part to tanking Quest sales. What makes that trend interesting isn't just the precipitous drop in Quest sales; it's the fact that, at the same time, sales of its increasingly popular Ray-Ban smart glasses seem to be through the roof. According to Meta CFO Susan Li, Reality Labs' revenue loss was "due to lower Meta Quest sales, which were partially offset by increased sales of Ray-Ban Meta AI glasses." That increase in revenue, by the way, is apparently threefold year over year. If I'm reading Li's breakdown right, Quest sales are tanking, and smart glasses are saving the day.

Or, I guess that's one way to look at it. Another way to see that trend is that VR is skidding out, and smart glasses (and the potential for XR/AR inside them) are picking up the pieces. That may sound sensational, but it also makes perfect sense in other ways. Don't get me wrong, I actually like VR/XR headsets; I think the Quest 3S is the perfect headset for the moment because it's light, affordable, and still offers a full-featured XR experience that will satisfy established XR/VR fans and impress beginners. That being said, they're still a burden. Strapping a headset to your face is inherently invasive and not exactly what most would call a pleasant time. Even the Quest 3, which is lightweight and generally comfortable compared to the competition, wears on your eyes, your face, and your poor, sweaty skin after a while. As much as I love the Quest for what it could be, it's hard to reconcile that with what it is right now.

Smart glasses, on the other hand, don't share those problems. Sure, Meta's Ray-Ban glasses don't do even half of what we want them to at this point in time -- they don't even have a display in them yet -- but they're a light touch. If you want a taste of smart features like Bluetooth audio, a voice assistant, and some (very hit-and-miss) AI, then you can snag a pair, slap them on your face, and wear them for hours without feeling like you're wearing anything other than a regular pair of glasses. The lesson here is that it's better to do a little of what people expect (enough to qualify as a good start) than to try to do everything at the cost of a clunky form factor, and that's shining through when it comes to smart glasses versus VR headsets.

Not only that, but smart glasses, though still very much a work in progress, are also potentially a lot more useful in your day-to-day. Because of their light form factor, you can actually bring them into the real world and use them for real stuff like taking pictures or calling your mom. Which is a reminder: you should call your mom, smart glasses or not. No matter where your priorities or allegiances lie, the tides, as they often do in emerging tech, seem to be turning.
On top of that, all of the latest rhetoric around smart glasses and XR/AR from Apple's Tim Cook and Meta's Mark Zuckerberg should give you some indication of which way those waters are flowing. If you're a fan of smart glasses, you've got nothing to worry about, but if you were excited to realize our full VR future, you may have to wait a while longer.
[3]
Smart Glasses Revolution: Inside the biggest tech trend of the next 10 years
Ever since I sprinted across Las Vegas in 2017 to pick up a pair of Snapchat Spectacles from a vending machine, I've watched smart glasses change drastically. From glorified camera glasses to a wearable external monitor, all the way to an AI-infused pair of specs, we've been through it all to make it to this very moment -- and the moment we're in is an interesting one. That's because we can already see everything we want our smart glasses to be -- just in something significantly bigger: VR headsets. Currently, these are very different devices, running along two parallel trajectories of development. But after speaking to Snap, Qualcomm and more, it's clear that the race is on to find the middle ground between the two -- to be first to a truly AI-infused, augmented spatial future of wearables. With significant developments tackling the key challenges, this 10-year race could very well produce the device that kills the smartphone and becomes the next big thing. Every big company you know is in the running, with Meta's Project Orion prototype charging into the lead, Android XR and Snap's new consumer specs catching up, and even Apple "hell-bent" on making its own glasses. Let's take a look at where we are now, why smart glasses are indeed the next big thing, and what it will take to get there.

If you take a look at the best smart glasses you can buy right now, you've got two categories: AI and AR specs. AI glasses like the Ray-Ban Meta Smart Glasses bring the power of multimodality to something that is super wearable. And you can see the real benefits they bring -- from asking quick questions like you would of a standard smart assistant to detailed prompts about the world around you. In fact, sales of Ray-Ban Meta glasses so far this year have more than tripled compared to the same time last year -- more than 200% growth. That's according to EssilorLuxottica, which owns smart glasses brands like Ray-Ban and Oakley. For me, they really come into their own when I'm travelling. Putting ingredients on a counter and asking for a recipe is always a massive help; live translation is a huge move to bridge gaps in understanding; and asking for more information on historic locations gives you new context like a tour guide.

Then you've got AR glasses -- essentially a portable external monitor shrunk down into a pair of glasses. With micro-OLED display tech projecting into prisms in front of the lenses, you can get a 100-plus-inch display wherever you go. That is huge for long-distance travel. Something like the Xreal One Pro specs really come in clutch for reducing the neck strain of looking down at my laptop or Steam Deck. Those prisms don't make them great for walking around, but they are the best realization of a screen in your glasses right now. And the ever-increasing capability to simulate an ultrawide display, or to use six-degrees-of-freedom (6DoF) tracking to anchor something in place, is a sign of far greater things to come. I mean, just take a look at the spatial widgets announced in visionOS 26 -- with 6DoF, that is possible with glasses. It's clear that while the Apple Vision Pro opened the door to spatial computing, a whole lot of software, from Cupertino's AR play to Snap OS and even Meta's OS in the Quest 3, is a preview of what you will get in glasses.
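For a sense of what 6DoF anchoring actually computes, here's a small hypothetical sketch (my own illustration, not Xreal's or Apple's code): a widget pinned in world space gets re-expressed in head coordinates every frame, which is exactly why it stays put while your head moves.

```python
import numpy as np

def make_pose(yaw_deg: float, position) -> np.ndarray:
    """4x4 rigid pose: rotation about the vertical axis plus a translation."""
    yaw = np.radians(yaw_deg)
    pose = np.eye(4)
    pose[:3, :3] = [[np.cos(yaw), 0, np.sin(yaw)],
                    [0, 1, 0],
                    [-np.sin(yaw), 0, np.cos(yaw)]]
    pose[:3, 3] = position
    return pose

# A widget anchored 2 m in front of where the user stood at t=0.
world_T_widget = make_pose(0, [0.0, 1.6, -2.0])

def widget_in_view(world_T_head: np.ndarray) -> np.ndarray:
    """Re-express the world-fixed widget in the current head frame."""
    return np.linalg.inv(world_T_head) @ world_T_widget

# As the head turns, the anchored widget shifts in view space -- unlike a
# plain heads-up display, which would follow the head and never move.
for yaw in (0, 30):
    offset = widget_in_view(make_pose(yaw, [0.0, 1.6, 0.0]))[:3, 3]
    print(f"head yaw {yaw:>2} deg -> widget offset in view: {offset.round(2)}")
```

Glasses-class hardware gets the head pose from camera-and-IMU tracking; the matrix math at the end is the easy part.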
If you wanted to go even more "tin foil hat conspiracy" with me, I'd argue that the new Liquid Glass design motif of Apple's software is subtly training us to get used to smart glasses. That transparency does make things a little harder to read, but users will adapt -- just in time for new specs. But the end goal is far greater than that. The mission for the future is to bring AR and AI together, because the possibilities are huge. Removing the smartphone from the equation to keep people present in the moment is the pinnacle of the digital detox movement that is starting to happen -- smart glasses that bring both AI and AR to the table are key to this.

"I am somewhat worried my kids think I look like this," said Scott Myers, VP of hardware engineering at Snap Inc., holding up a smartphone to his face and talking about how phones have become distraction devices. "Specs are the next generation of computing, and they're a powerful, wearable computer in a lightweight glasses form factor. And because they naturally integrate digital experiences with the physical world and enable me to look up at the world, I'll stop pulling out my phone so much, or maybe I don't need to take my tablet with me on trips anymore."

Imagine that same recipe situation as above, but with an image-based guide supplementing it. Or that same moment of discovering historical monuments, but with map pins identifying every single one to visit. While all these companies have their own ideas of what the dream smart glasses are, all agree that there are fundamental challenges to be solved here.

Displays need to get better

Right now, you've got a pick of two ways to do this: a glass prism that an OLED picture is projected into (commonly called a "bird bath," as seen in the Viture Luma Pros), or a section of the glasses lens etched to refract light from a projector in the arm (known as a "waveguide"). Bird baths have the better, wider picture quality, but the glasses have to be slightly bigger to house them -- looking like the spy glasses you'd get at the Scholastic book fair. Meanwhile, the waveguide is certainly a lot more subtle, but being the size of a miniature postage stamp on one lens does lead to the display being way smaller and worse in quality.

But companies like Lumus are quietly working on this in the background, and working with a lot of big names in the industry. The secret sauce is reflective waveguides. "With the geometric waveguide lenses we're making, you can get a far wider field of view while not compromising on the picture quality or brightness needed to see it in daylight," said David Goldman, VP of marketing and communications at Lumus. "Not only that, but with the liquid crystal display potential, you can actually improve a person's vision too." The challenge is to get the best of both worlds -- ditching the bird baths to provide full clarity of the world around you like a regular pair of glasses, while still offering the same level of screen quality for both full immersion and augmenting your surroundings.

Break the reliance on other devices

This comes down to one thing: getting a chip powerful enough to fit entirely on the glasses without any need to connect to another device.
At the moment, we're either limited to AR glasses with a chip that tricks your laptop into thinking you have a 32:9 ultrawide monitor on your face (I'm typing this on my ultrawide Xreal Ones right now on a plane), or a fast but limited chip that keeps latency sort of low between making an AI request through your specs and the phone doing the heavy lifting (looking at you, Ray-Ban Metas). Looking to the mid-term future, the answer seems to be a puck, like the one you see in Meta's Project Orion -- a dedicated device to fuel the experience. Other companies agree: you see this in Xreal's Project Aura, and Qualcomm believes this concept sits on a spectrum.

"Some operators would love a glass that is connected directly to 5G, and we will work on that. Others want sports glasses for going on a run, and others will just want a general assistant," said Said Bakadir, VP of product management at Qualcomm. "I think we're gonna see that spectrum evolving from something that is minimum possible in the glass to getting rid of other devices." However, if smart glasses are truly going to take off, there can't be any pucks or separate devices. We need it all to work entirely on the glasses for this to be the same truly disruptive, iPhone-esque moment for consumer tech.

Developers, developers, developers!

Speaking of the iPhone, you may not know this given how much of a global icon it is now, but the real breakthrough for Apple's mini slab didn't arrive until the App Store one year later. Opening up a platform for developers to create their own experiences creates an ever-growing list of reasons to buy your device, and AR glasses need that moment. So far, there hasn't really been a shared app marketplace for AR glasses like the App Store. But two things may flip this entirely on its head: Android XR bringing the Google Play Store to specs, and Snap's new consumer glasses channeling the work of devs who have created hundreds of thousands of Lenses over the past few years.

"We're really here to build this with the community because it's an entirely new paradigm," said Snap's Myers. "There's a lot of things that will take time for people to understand and figure out. It's not just going to be, 'oh, here you go, developers -- come build for this!' That's not going to work, in my opinion. It's a community-led discussion and I couldn't be happier with that." The constant stream of new apps to the smart glasses of the future needs to become as synonymous with the category as the App Store is with the iPhone.

All-day stamina guaranteed

Batteries are not ready for prime time in smart glasses -- the longevity of lithium-ion cells is always heavily compromised by the limited capacity needed to keep the glasses from being too heavy on someone's face. The end result is making sure you're careful with the number of interactions you make with your Ray-Ban Meta shades at the moment. Fortunately, Meta is on the right track, with the Oakley Meta HSTN glasses effectively doubling the longevity. That being said, there's still a way to go. What's the answer? Nobody is quite sure yet, but it seems to start with the direction smartphones are heading in: silicon-carbon. This next-generation battery tech packs more power into the same space, meaning it could be a starting point. The other thing the industry has learned, just as Meta did with the Ray-Bans, is that battery life is all about calculating and optimizing the software's usage down to every microwatt.
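Here's a rough back-of-the-envelope sketch of why that microwatt accounting matters; every figure below is an assumption for illustration, not a measured spec of any shipping glasses.

```python
# Toy battery model for a pair of smart glasses (all numbers assumed).
BATTERY_MWH = 600.0                      # ~160 mAh at 3.7 V

loads_mw = {                             # assumed average draw per subsystem
    "standby (MCU + idle radio)": 15.0,
    "camera capture": 700.0,
    "AI query (SoC burst)": 1200.0,
    "audio playback": 60.0,
}

def battery_hours(duty: dict) -> float:
    """Runtime estimate given the fraction of time spent in each state."""
    avg_mw = sum(loads_mw[state] * frac for state, frac in duty.items())
    return BATTERY_MWH / avg_mw

heavy = {"standby (MCU + idle radio)": 0.70, "camera capture": 0.10,
         "AI query (SoC burst)": 0.10, "audio playback": 0.10}
light = {"standby (MCU + idle radio)": 0.96, "camera capture": 0.01,
         "AI query (SoC burst)": 0.01, "audio playback": 0.02}

print(f"heavy use: {battery_hours(heavy):.1f} h")   # ~2.9 h
print(f"light use: {battery_hours(light):.1f} h")   # ~17.3 h
```

Shaving idle draw and shortening camera or SoC bursts moves the needle far more than a slightly bigger cell would -- which is the "every microwatt" point.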
"I worked on smartphones for a very long time, said Myers. "While the battery capacity has grown pretty consistently, it's really the way people are using the software that has gotten much better. We see the same trajectory for Snap OS." If Ray-Ban Meta smart glasses prove one thing, it's that when it comes to AI devices, glasses are the best realization of that vision -- better than Rabbit R1, better than the Humane AI Pin. But even more than that, we've seen multi-modal AI unlock some truly useful features in a pair of smart glasses. Because at the end of the day, you want your glasses to do more than tell you you're looking at a tree. "XR, for me, is the best interface to interacting with the digital world. What happened in the digital world is being transformed with AI. So it just happens that this AI requires multi-modality." said Qualcomm's Bakadir Whether I'm exploring the world and want extra facts about a landmark, or I'm stuck on things to eat and want some assistance on what to make from the things in my fridge, having AI directly on your face is the most natural form factor. "AI will be the core intelligence layer. It will understand context, proactively assist, personalize the interface in real time. Wearables will evolve from tools into true companions -- adaptive, discreet, and intuitive." said David Jiang, CEO of Viture. We've made small steps towards that with Snapdragon AR1+ Gen 1 -- allowing you to run a 1-billion parameter AI model entirely locally. That is significant for the future of smart glasses, but it's only one step forward. Now the next step is moving into agentic AI and personalization -- using data to train your own device around you for more proactive, more agentic assistance that can help before you even think you were going to look for help. Remember when the Apple Watch came out? The real reason for it existing didn't come until a few years in. When those sensors came into their own, it became the go-to health tracker that it is now. I feel that the moment is coming for smart glasses. The use cases are currently limited, but the moment we start sticking sensors on them, not only would you be able to track physical health, you could even track emotional health, too. "We believe that understanding emotions is a force multiplier for AI, in terms of it being effective for you in the context of wearing glasses all day. If you want AI to be really effective for you, it's critical that it understands how you're feeling in real-time." said Streen Strand, Emteq CEO. And why wouldn't you? In a February survey by Sentio University, 96% of AI users reach out for some therapeutic advice. Sensor tech is looking like a key focal point of the future of smart glasses -- fueling not just eye-tracking and hand gestures, but pairing with AI for more personalization. We've done this dance before. Remember Google Glass? There's a reason why the phrase "glassholes" exists, and it's because of the social stigma that came with wearing this advanced piece of tech directly on your face. Every new tech category goes through a settling-in process around the way they disrupt common social cues, as they move from seeming traditionally impolite to just being the way things are. But with display tech in smart glasses, I feel that hump of social acceptance is going to take a bit more time to get used to. A great example is the Halliday glasses, which beams a 3.5-inch projected display into your eye from the top rim of the specs. 
All you have to do is look up at it, which on paper is seriously impressive. However, during my time talking to people wearing them at CES 2025, the number of perceived eye-rolls I got as they glanced up at the screen certainly made me feel like an inconvenience! And then, more broadly, with the display tech of tomorrow, you'll never really know whether someone is actually looking at you. At least with current bird bath panels making for slightly larger specs, you're giving off a big enough "do not disturb" signal. But when those disappear and the transition to waveguides happens, it will take time for society to acclimatize.

"We all lose our time to these black rectangles called smartphones, so I see waveguides on smart glasses as a great thing to just glance at when notifications roll in, without taking my phone out. But my wife is always on edge about whether I am actually paying attention to her," said Lumus' Goldman.

Then, of course, there are the privacy concerns of wearing an always-on device on your face. How do you give permission to be seen by these glasses? What does that look like? We saw these become big issues with Google Glass in the early 2010s, and with a personalized AI assistant that needs to be always running to understand you, the worries will be significant and warranted.

"It's not like smartphones, where it's passive AI. There needs to be an AI actively listening to you that memorizes your routines, your conversations, everything about your day to deliver that efficient lifestyle," said Carter Hou, CEO and co-founder of Halliday.

I know there are significant technical challenges on the road between now and 2035, but more than anything, the cultural one is going to be the bigger mountain to climb. We've already gotten over the "wearing glasses even though you don't need to" hurdle (look at hipsters wearing spectacles with no lenses, for example) -- and surely it'll only be a matter of time before the tech itself becomes a social norm, rather than prompting "Is that a camera in your glasses?"

There is a grand vision for 2035, but the future of smart glasses is a lot closer than you think. I initially thought the race to XR was only just beginning to heat up, but in reality, it's already at fever pitch. With rumored next-gen Ray-Ban Meta smart glasses, the impending launch of Snap Specs in 2026, and, let's not forget, Apple being "hell-bent on creating an industry-leading product before Meta can," we're on the precipice of the next step forward in this space. But what makes this category so fascinating to me is that no one company has all the answers. Every dreamer in this area holds one piece of the puzzle, and I do believe that in ten years' time, these will all come together in that next category-defining product -- that smartphone moment for wearable technology. So buckle up, because it's going to be a helluva ride over the next decade.
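One footnote on the on-device AI thread above: running a roughly 1-billion-parameter model locally, as the Snapdragon AR1+ Gen 1 promises for glasses, is something you can already approximate on ordinary hardware. Here's a hedged sketch using the open-source llama-cpp-python bindings; the model file name is a placeholder, and this is illustrative, not vendor code.

```python
from llama_cpp import Llama

# Load a small quantized model; "tiny-assistant-1b-q4.gguf" is a hypothetical
# stand-in for any ~1B-parameter GGUF file you have locally.
llm = Llama(
    model_path="tiny-assistant-1b-q4.gguf",
    n_ctx=2048,     # small context keeps memory within wearable-class limits
    n_threads=4,    # modest CPU budget, as a glasses-class SoC might allow
)

out = llm(
    "You are a glasses assistant. In one sentence: what landmark is "
    "'a large iron lattice tower on the Champ de Mars in Paris'?",
    max_tokens=48,
    temperature=0.2,
)
print(out["choices"][0]["text"].strip())
```

Four-bit quantization is what makes the memory footprint fit; the open question for glasses is doing this within a thermal and battery budget measured in milliwatts, not laptop watts.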
[4]
The future of wearable wellness tech: 5 wild predictions for 2035 according to experts and industry leaders
In 2035, your Apple Watch Series 36 could arrive with enough battery power to last the lifetime of the wearable; I'm talking years rather than days, i.e., no recharging required. Don't like wearing a watch? All those holistic sensors may come in an assortment of new forms, including flexible and near-invisible stick-on "smart patches" that look kind of like a Band-Aid but stay put for weeks or even months.

As the speed at which health data can be processed and analyzed continues to improve, and physical sensors get tinier and tinier, keeping tabs on your vitals ten years from now will likely not be handled by one piece of dedicated wearable tech, but by an array of health-sensing devices you don't even have to think about: the steering wheel of your car if you commute by automobile, your contact lenses (even if you don't require corrective vision), or even the waistband of your favorite underpants. With so much information to decipher, AI will be the backbone that powers future wearable wellness technology. The promise of early detection of not just chronic disease but everyday illness will be another crucial selling point of tomorrow's wearable devices, and the best of them will offer sensible, actionable steps to follow if something does come up.

To get a clearer picture of our possible wearable/embeddable future, I spoke with an array of experts and industry leaders in the field, including Angela McIntyre, executive director of Stanford's Wearable Electronics Initiative (eWEAR); Amaury Kosman, founder and CEO of the smart ring brand Circular; Jason Russell, vice president of consumer software at Oura; Antoine Joussain, a lead product manager at the consumer health technology brand Withings; Roman Axelrod and Dr. Valentyn Volkov, cofounders of the smart contact lens startup Xpanceo; and Michael Hayes, CEO of the smart contact startup InWith Corp. These conversations pointed to five major trends for wearables and embeddables in 2035: more form factors; batteries that last the life of the device; predictive monitoring for both chronic diseases and everyday conditions; AI connecting the dots between wellness metrics and healthcare; and further incorporation of smart features that make life easier and less stressful.

Battery life, or the lack thereof, is one of the biggest factors holding back today's wearables. Relatively reliable, subscription-free wearables can be picked up for $100 or less (see the Amazfit Active 2), but few last longer than a week on a single charge. Fortunately, in 2035, the need to plug in may be as antiquated as the away message. "Our goal is for [the battery] to last the lifetime of the device," says Joussain. "So if a device is lasting for five years, we'd like [the battery] to last for five years too." This will come through both innovations in battery technology and reductions in power consumption. Nearly everyone I spoke to for this article mentioned flexible or even stretchable batteries; such technology would be crucial for developing a truly Band-Aid-style smart patch, notes McIntyre. More on that below. Some wearables brands, like the smart ring manufacturer Circular, already use bendable batteries in their product designs.
However, at the rate battery technology is developing, the batteries of 2035 will likely look vastly different from today's. "We already have flexible batteries in our rings, and we're trying to max them out. Over the past six years, I've seen three different technologies used in batteries -- different materials that can withstand more and more capacity," says Amaury Kosman, the founder and CEO of Circular. Power-management improvements won't only come in the form of better batteries. "More efficient signal paths and the ability to disable unused sensors will also contribute meaningfully [to improved battery life]," says Jason Russell, Oura's VP of consumer software, when asked what a theoretical Oura Ring 10 might look like.

Another hot topic: energy harvesting. While ten years is likely too soon for our smartwatches to be powered solely by body heat, McIntyre reports that researchers at Stanford and elsewhere are hard at work making the concept a reality. "Motion of a person could be harvested as well," says McIntyre. Of course, self-charging wearables do exist in 2025. The Garmin Instinct 3 Solar, which features a light-sensitive cell behind the device's screen, is a great example. However, by 2035, solar charging might be small enough to fit directly into a contact lens. "We are developing light-harvesting features integrated into the lens surface, allowing ambient sunlight or indoor lighting to contribute to the power supply. While energy harvested this way is modest, the low power demands of contact lenses make even small boosts valuable," says Dr. Valentyn Volkov, the cofounder of Xpanceo. While Circular's Amaury Kosman seemed skeptical of wearables self-generating energy by 2035, Oura's Jason Russell sounds more optimistic: "By 2035, it's plausible that wearables could integrate hybrid energy systems that passively recharge throughout the day, vastly extending runtime and reducing dependency on charging cycles."

Don't expect watches or rings to disappear anytime soon; smart or not, this style of jewelry is likely here to stay. On the flip side, do expect the sensors you already see in smart rings and smartwatches to eventually appear in other wearable products, like earbuds, bracelets, stick-on patches, contact lenses and smart clothing. "The idea is to make it disappear," says Joussain when asked what the future of health-sensing technology looks like for Withings. That's a pretty bold statement for a brand that, in 2025, makes a somewhat chunky metal smartwatch with considerable heft. Ultimately, ten years from now, holistic sensors will be small enough to be installed just about anywhere -- not just in wearables but also in your computer mouse and even your car's steering wheel; essentially, wherever you're most likely to interact with them. "All these new [health tracking] technologies will be implemented in everyday objects. So, you take your car every day; when you are holding the steering wheel, it will monitor your vitals," predicts Joussain. In the long run, Joussain suspects that health sensors will be embedded directly into the user's body. However, he confesses that the concept is almost certainly more than a decade off. Stanford's McIntyre agrees. Instead, she thinks stick-on smart patches packed with holistic sensors are more likely to make an impact in the next ten years.
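To put Volkov's "even small boosts are valuable" line in perspective, here's a tiny energy-budget sketch; the cell area, light level, efficiency and device draw are all illustrative assumptions.

```python
# Rough energy budget for light harvesting on a contact lens (all assumed).
CELL_AREA_CM2 = 0.5               # a thin photovoltaic ring on the lens
INDOOR_MW_PER_CM2 = 0.1           # indoor light is ~1,000x dimmer than sun
EFFICIENCY = 0.15                 # assumed conversion efficiency

harvested_uw = CELL_AREA_CM2 * INDOOR_MW_PER_CM2 * EFFICIENCY * 1000
device_draw_uw = 20.0             # assumed average draw of a low-power lens

print(f"harvested ~{harvested_uw:.1f} uW vs. ~{device_draw_uw:.0f} uW draw")
# ~7.5 uW against a 20 uW draw: not self-sustaining, but it stretches the
# tiny battery meaningfully -- which is exactly Volkov's point.
```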
"There are new sensors that are coming, and with your flexible, stretchable capabilities, they'll be even more that we can do from a sticky patch," says McIntyre. Much to my surprise, Circular's founder, Amaury Kosman, also thinks that smart patches could be the way of the future when it comes to at-home health monitoring. "A patch, which is tiny and anybody can wear, I think that's the future of where we're heading. As time goes by, everything gets miniaturized, everything gets more precise, and it gets cheaper. So it's just a logical next step for me," says Kosman when asked what future wearables will most likely look like. Similarly, Oura's Jason Russell acknowledges that the future of wellness monitoring might go beyond the singular smart ring. "We foresee stretching the boundaries of biometric sensing via the ring while integrating complementary wearables that together enable an even more complete picture of your health," Russell says. Outside of smart patches, what other new wearable health-monitoring tech can we expect to take off in the next decade? "Smart contact lenses, being in direct contact with the eye's surface and tear film, function as a tiny biochemical laboratory on the eye. This close proximity enables continuous, noninvasive monitoring of a variety of health metrics," says Dr. Volkov. The best smartwatches already alert users to potential signs of chronic health issues. Popular models like the Apple Watch 10 and Samsung Galaxy Watch 8 monitor for signs of sleep apnea and heart abnormalities, like AFib. The Google Pixel Watch 3 can even trigger an alert and send for help if a loss of pulse is detected. However, these tools are just scratching the surface. In ten years, your smart wearable may be able to screen for a whole range of chronic conditions, like diabetes, cancer or heart disease. These devices may also be able to give you a 72-hour heads-up to an upcoming cold, or alert you to heightened biomarkers that could indicate elevated stress, with actionable advice to return to your baseline. "As sensors become more advanced and miniaturized, the depth and granularity of data will also increase significantly. But the biggest shift will be in how insights are delivered: instead of just showing you the data, future insights could anticipate changes in your health, offer personalized, real-time guidance, and adapt to your unique physiology and goals -- making the experience more predictive, proactive, and deeply personalized than ever before," says Oura's Jason Russell. Ultimately, the future of disease detection may rely less on developing new sensor technology and more on making the most of the data already coming off the sensors we currently have. Enter, the promise of AI. "AI is getting a lot better, being able to discern what's a 'real' signal out of very noisy data, and then being able to make insights that are more valid for us from that data," says McIntyre. The use of artificial intelligence to analyze health data, effectively replacing manually written code, will exponentially increase the ability for software to sniff out health trends and make personalized recommendations, notes McIntyre. Tomorrow's wearables might even analyse your blood, urine, or sweat, as all three contain a multitude of easily trackable biomarkers that could indicate whether you're dehydrated, stressed, or a whole host of other conditions. Monitoring stress, in particular, is a focus of researchers. 
"Cortisol is another ingredient that people are trying to sense with sensors on wearables. I should say that people have cortisol and sometimes feel very pumped and excited, and other people might have a lot of cortisol and feel very afraid. So it really depends on circumstances and on the individual what putting out cortisol means, " says McIntyre. Future wearables may additionally be able to take the guesswork out of taking medication, says Michael Hayes, the CEO of the smart contact startup InWith. "There's a plethora of health applications with smart contacts. From early warning of disease to therapeutic delivery of drugs to the eyes to prevent certain conditions, to bringing new focus capabilities. The tear fluid is a rich medium for biomarkers," says Hayes. yes. Smart contact lenses could even one day replace today's blood-based health monitoring methods. "Glucose levels in tears can be tracked to assist people with diabetes in managing their condition without the need for finger-prick blood tests. Similarly, fluctuations in hormone or vitamin concentrations in the tear film can offer valuable insights into a person's metabolic or nutritional status," says Dr. Volkov. The concept of a faceless, nameless artificial intelligence interface spitting out wellness advice based on the augmentation of my sleep, workout, dietary, etc., data is beyond unsettling to me. However, everyone I spoke to on the subject assured me that the aggressive AI analysis of my holistic metrics is actually a positive thing. Doctors are busy. Wearable data is useful, but in 2025, there's no conduit to make that data easily accessible to the medical field. And even if there was, the amount of data would likely be entirely overwhelming. This is where AI can help. With more sensors and more users, it will become better at finding patterns that may warrant alerting your doctor or wellness team. In a time-sensitive health emergency, AI could potentially trigger an alert to your medical provider on its own, similar to Google's Loss of Pulse Detection or crash/fall detection. Representatives from Oura, Circular, and Withings all emphasised the importance of wearable data being more accessible to a user's healthcare team in the future, with AI playing a crucial role as the middleman. "In the future, [wearable tech] could support clinical applications like remote patient monitoring, early detection of chronic conditions, or continuous tracking of biomarkers relevant to metabolic, cardiovascular, or hormonal health. They may enable secure sharing of health data with care teams, integrate with electronic health records, or even assist with medication adherence through real-time prompts," predicts Russell. If you're like me, not so hot with remembering names, I've got great news. Tomorrow's wearable tech may make awkward social situations a thing of the past. "The smart contact lens will act as the ultimate personal assistant embedded directly into your vision and capable of analyzing complex social environments in real-time," says Roman Axelrod, the (other) co-founder of Xpanceo. "Yes, at a party, the lenses could scan the room and instantly recognize faces, drawing on your personal contacts and social databases to remind you of people's names, how you met, and important details about them before you even approach," says Alexlrod, though he acknowledges that privacy concerns and regulations for such features are still far from being sorted out. 
You can also expect these next-gen devices to potentially improve our human capabilities, Inspector Gadget-style. For example, InWith CEO Michael Hayes predicts that smart contacts in 2035 will not only offer night vision but potentially even zoom capabilities. "Seeing better in the dark is an advanced function, but we've already made significant progress. We can engineer lenses that enhance low-light vision. Nanoparticles alter the way the lens interacts with incoming light, effectively expanding what the eye can perceive in dim environments," says Dr. Volkov. "The idea of zooming in on distant objects is perhaps the most futuristic, but not impossible. This feature would require smart lenses with materials whose refractive properties can be dynamically controlled. Using electrical signals, the lens could adjust how it focuses light, effectively creating a variable 'optical zoom' function." Although this technology is still in the research phase, Volkov suspects that rapid progress in the field should mean working prototypes well before 2035.
As VR headset sales decline, smart glasses gain popularity, signaling a shift in the wearable tech landscape. Major tech companies are investing in AR and AI-infused glasses, potentially revolutionizing how we interact with technology.
The world of virtual and augmented reality is undergoing a significant transformation, with smart glasses emerging as a potential successor to traditional VR headsets. Recent reports and industry trends suggest that while VR headset sales are declining, smart glasses are gaining popularity, signaling a shift in the wearable tech landscape [1][2].
Meta, the parent company of Facebook, has reported a threefold increase in sales of its Ray-Ban Meta smart glasses compared to the previous year [2]. This surge in popularity comes at a time when sales of Meta's Quest VR headsets are declining. The company's CFO, Susan Li, attributed the drop in Reality Labs revenue to lower Meta Quest sales, which were partially offset by increased sales of Ray-Ban Meta AI glasses [2].
Major tech companies are investing heavily in the development of AI-infused and AR-capable smart glasses. Meta CEO Mark Zuckerberg has declared that AI-infused glasses are the gadget of the future [1]. Other industry giants like Apple, Google, and Samsung are also focusing on AR and smart glasses technology [1].
Smart glasses offer several advantages over traditional VR headsets:
Comfort and wearability: Unlike VR headsets, which can be bulky and uncomfortable for extended use, smart glasses are lightweight and can be worn for hours without discomfort [2].
Real-world integration: Smart glasses allow users to interact with the real world while accessing digital information, making them more practical for everyday use [1].
AI capabilities: The integration of AI in smart glasses opens up new possibilities for real-time information processing and contextual awareness [1].
While VR headsets may be experiencing a temporary decline, the technology is far from obsolete. Companies are working on developing more advanced and user-friendly VR and mixed reality devices:
Project Moohan: Samsung and Google are collaborating on a high-end VR headset with mixed reality capabilities and onboard AI [1].
Meta's AR ambitions: Meta is reportedly working on high-end smart glasses with a display and a gesture-control wristband, building on its Orion prototype [1].
Apple's Vision Pro: Apple is expected to release an updated version of its Vision Pro headset with improved graphics and support for controllers and accessories [1].
The transition from VR headsets to smart glasses presents both challenges and opportunities for the tech industry:
Display technology: Improving the quality and size of displays in smart glasses remains a key challenge [3].
Battery life: Developing long-lasting, compact batteries for smart glasses is crucial for widespread adoption [3].
As the technology evolves, we can expect to see a convergence of VR, AR, and AI technologies in more compact and user-friendly form factors. The race is on to create the perfect balance between functionality, comfort, and style in the next generation of wearable tech [3].