4 Sources
[1]
I Tried Snap's Upcoming AR Glasses (Again). Get Ready for More Apps
Among the flood of smart glasses expected in the next couple of years, Snap is preparing its own new pair of Spectacles. CEO Evan Spiegel told me the new augmented reality glasses will be smaller than the thick, developer-focused set I've tried before. I recently stepped back into those developer Spectacles to test-drive Snap OS 2.0, part of what the company is planning ahead of those glasses arriving. What I realized is that Snap is pushing into territory that Meta and Google haven't fully entered yet, but will. And Snap's news is clearly trying to preempt Meta's expected reveal of display-enabled glasses with gesture controls this week.

Snap is an interesting player in the AR glasses arena because, right now, it's pretty much the only one making a truly self-contained pair that can run a variety of 3D apps with hand-tracking controls. Spectacles in their developer version are rough-edged, and Snap doesn't even make prescription inserts that match my eyes, but the apps they run feel sort of like what Apple's Vision Pro can do, shrunk way down. And developers are already using them a fair amount to workshop real-world outdoor AR experiences that other hardware can't do yet.

Snap's upgraded OS has a better web browser, along with a gallery app for viewing video captures made on the glasses and an app to browse Snap's own vertical social videos and comments. I tried all of those, but the experiences that surprised me were live translation, a generative AI assistive tool called Spatial Tips and a surprise port of a fitness game called Synth Riders.

Snap's Spectacles in their current form are weirdly bulky and have a limited vertical field of view that feels like projecting a phone screen in the air. They can't accommodate my prescription yet, either, so I had to make do with a step-down insert and squint a bit. But their hand-tracking and 3D spatial features feel, at best, like a mini version of the Quest 3 and Vision Pro. There aren't any captures that can show off the demos that impressed me, but I'll explain them as best I can.

Live translation is everywhere now, from Apple's latest AirPods to Meta's Ray-Ban glasses. Snap's spin shows pop-up text boxes with translation captions on the fly, but what made it interesting to me was that these boxes float directly below the speaker's head in 3D space. Because Spectacles can float 3D objects and even flat 2D screens at different depths, following the conversation felt more organic. Instead of feeling like my glasses were interrupting with text, the text seemed embedded in the world around me.

That feeling continued with the assistive AI demo, where I wandered up to appliances in the room and asked what they did...and pop-up instructions with suggested steps were suddenly overlaid on parts of the thing I was looking at. A coffee machine was suddenly labeled with helpful steps. Or a refrigerator. Or a bunch of condiments. I've seen demos like this before, in a sense, but this was apparently done on the fly using generative AI. Would the AI get the steps and information right? I don't know. I only used it a couple of times, and it seemed OK...and the steps seemed to guess relatively accurately at what the parts of each appliance were. Or what the condiments were. I appreciated that the labels were in text large and clear enough to read easily, which could make this an interesting model for future assistive glasses. But I'd want smaller glasses, ones that work with my eyes...and ones I could wear all the time. It's unclear how close Snap will get to those goals.

A final game app I tried surprised me, too. Synth Riders, a popular hand-tracking rhythm-based VR fitness game, was playable on Spectacles. It felt rough to play, and the glasses' limited field of view meant the experience wasn't as expansive as on the Quest 3 or Vision Pro, but it's a small sign of how glasses will aspire toward fitness gaming. Meta might be trying to enter that space soon, too, at Meta Connect. Fitness is already the biggest reason I use VR every week. Putting fitness games into glasses is the obvious next step if someone can find a way to do it without compromising the experience. While Synth Riders on existing Spectacles seemed clunky, I'm extremely curious what Snap has in store for its consumer-edition glasses next year, which could have better processing, size and battery life. Fitness in glasses like those might make a lot more sense.

Snap clearly announced these OS moves to step ahead of Meta's imminent reveal of its own next-gen smart glasses, but it's a sign of how many players are entering the glasses space in the next year. At some point, someone's going to figure out full standalone AR glasses, and Snap's still got its tech in the game.
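The detail about captions floating below each speaker's head comes down to anchoring: the glasses track a head pose in world space and hang the subtitle a fixed offset beneath it, at the speaker's own depth. Here is a minimal TypeScript sketch of that idea; the head-tracking, translation, and rendering helpers are hypothetical stand-ins, since Snap hasn't published the Lens APIs behind this feature.

```typescript
// Minimal sketch of the caption behavior described above: hang each
// speaker's translated subtitle from a world-space anchor just below
// their tracked head. Every type and helper here is a hypothetical
// stand-in, not Snap's actual Lens API.

interface Vec3 { x: number; y: number; z: number; }
interface TrackedHead { speakerId: string; position: Vec3; } // world space, meters

// Hypothetical stand-ins for head tracking, streaming translation,
// and text rendering.
function getTrackedHeads(): TrackedHead[] {
  return [{ speakerId: "host", position: { x: 0, y: 1.6, z: -1.2 } }];
}
function latestTranslation(speakerId: string): string | null {
  return speakerId === "host" ? "Welcome! Let me show you around." : null;
}
function renderCaption(text: string, anchor: Vec3): void {
  console.log(`"${text}" at (${anchor.x}, ${anchor.y}, ${anchor.z})`);
}

const CAPTION_DROP_M = 0.25; // hang the subtitle ~25 cm below the head

// Run once per rendered frame so captions follow speakers as they move.
function updateCaptions(): void {
  for (const head of getTrackedHeads()) {
    const text = latestTranslation(head.speakerId);
    if (!text) continue;
    // Offset straight down in world space: the subtitle sits under the
    // face, at the speaker's own depth, instead of covering it.
    renderCaption(text, {
      x: head.position.x,
      y: head.position.y - CAPTION_DROP_M,
      z: head.position.z,
    });
  }
}

updateCaptions();
```

Keeping the anchor at the speaker's depth rather than at a fixed screen position is what makes the text feel embedded in the scene instead of pasted over it.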
[2]
Snap's next smart glasses get a major OS overhaul to rival Meta Ray-Bans
Do you remember when, not too long ago, everyone was convinced the metaverse was the next big thing? We're at a similar inflection point with smart glasses, with every company gearing up to release its own AI-powered wearable. Snap is in a unique position: it has been releasing wearables for years, and it's now gearing up to compete with the Metas of the world.

On Wednesday, Snap unveiled Snap OS 2.0, the operating system underlying Spectacles, its AR glasses currently available only to developers. The new operating system unlocks experiences that make activities like browsing the internet and consuming and sharing content easier. The best part? The public will be able to experience Snap OS 2.0 soon on Specs, the company's new lightweight smart glasses.

The operating system underlying smart glasses is nearly as important as the hardware, as it determines whether users can get the most out of them. With Snap OS 2.0, the company is improving how users experience the AR layer, starting with a revamped browsing experience. The Spectacles Browser was overhauled with a new minimalist design, faster loading speeds, and optimized power usage for a more intuitive and simple browsing experience. It also features a new home screen with widgets and bookmarks, an updated toolbar that lets users type or speak a website URL, and a new option that lets users resize their window to a preferred aspect ratio, like on a laptop. It also includes WebXR support, so users can access AR experiences from any WebXR-enabled website.

The new Spotlight Lens allows users to overlay their favorite content onto their glasses, including vertical video and comments. Users can anchor the content in one place or have it follow them around, depending on what is more convenient for the activity at hand. And if you're interested in sharing content, Snap unveiled a new Gallery Lens that makes it easier to replay your Spectacles captures. Users can view them in a new interactive layout, scrolling through a curving carousel, zooming in for more detail, and uploading them straight to their Snapchat Story. A new Travel Mode lets users take these experiences everywhere, even on the move.

Of course, the Lenses above are only some of those offered on Spectacles, as hundreds of developers have been creating their own. Snap also has other fun ones to try, such as Finger Paint, Chess, and Imagine Together.

While the fifth generation of Spectacles, released in 2024, was only available to developers, Snap will publicly launch Specs in 2026. Specs are meant to be a lighter take on the Spectacles, powered by Snapdragon. I had the opportunity to try the fifth-generation version myself, and I was very impressed with the realistic AR experience, the vivid colors in the displays, and how well virtual objects were anchored in the three-dimensional world around me. My biggest issue was how heavy the glasses were. The new form factor will let them better compete with the growing number of AI smart glasses on the market, including the Meta Ray-Bans, which have seen lots of interest from consumers. To best compete, the glasses will also have to offer competitive experiences, which is why the Snap OS 2.0 upgrades matter.

Meta is slated to release new glasses of its own at Meta Connect next week. The rumors include a new collaboration with Prada, and even smart glasses with in-lens displays -- a take on AI smart glasses the company has yet to explore.
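For context on the WebXR support mentioned above: any WebXR-enabled page starts an AR session through the standard WebXR Device API, so a browser that implements it can serve AR content from the open web. The sketch below shows that generic handshake (typed via the @types/webxr package); this is standard web-platform code, not anything Snap-specific.

```typescript
// Generic WebXR Device API handshake (typed via the @types/webxr
// package). This is standard web-platform code that any
// WebXR-enabled page runs; nothing here is Snap-specific.

async function startARSession(): Promise<void> {
  // Feature-detect first: not every browser or device exposes WebXR.
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-ar"))) {
    console.log("immersive-ar is not supported here");
    return;
  }

  // Must be triggered from a user gesture, e.g. an "Enter AR" button.
  const session = await navigator.xr.requestSession("immersive-ar", {
    optionalFeatures: ["hit-test", "dom-overlay"],
  });
  session.addEventListener("end", () => console.log("AR session ended"));

  // A real app would now attach a WebGL layer via
  // session.updateRenderState({ baseLayer: ... }) and draw into it
  // from this per-frame callback.
  session.requestAnimationFrame(function onFrame(_time, _frame) {
    session.requestAnimationFrame(onFrame);
  });
}
```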
[3]
Snap OS is finally ready for Snap Specs in 2026 -- I just tested the game-changing update
As we found out a few months ago at AWE 2025, Snap Specs are set to launch for consumers next year -- ending a multi-year developer path and putting them directly in the hands of people like me and you. After talking to Snap's VP of hardware, Scott Myers, it's clear the company is working on the biggest challenges in the physical design of these glasses. But what about software? Well, I got to try out Snap OS 2.0, the first significant consumer-friendly update that will run on these upcoming specs, and to say I've been left a little mindblown would be an understatement.

As I've said before, the real next generation of smart glasses will arrive when AR and AI come hand-in-hand in the same product. After just a few demos of what the consumer Snap Specs will offer, I believe these will usher in that next generation -- thanks to the huge ecosystem of Lenses. Let me tell you about them.

In terms of what you can buy right now, there are plenty of impressive pairs of smart glasses -- most recently, the Rokid Glasses have been impressing me quite a bit (more on those in a future review). However, every pair you see leans on one fundamental crutch: it needs a connected phone. Not that this is a problem at the moment, because everybody walks around with a phone, so they might as well borrow its computing power to do all the thinking and processing. But this can introduce latency, and it's not really my idyllic future of ditching the phone and having the specs do everything. The alternative workaround we're seeing, from the likes of Meta's Project Orion and Xreal's Project Aura, is popping all that silicon into a puck.

Instead, the current developer Spectacles and the incoming Snap Specs look set to change that paradigm by putting everything you need into the glasses themselves. In the current model, Snap pulls this off with dual Snapdragon processors, four cameras (two of them infrared for computer vision), stereo speakers and a six-microphone array. Finish that off with waveguide displays sporting a 46-degree field of view, and you've got a pretty fully loaded package that enables the speed and smarts of a standard VR headset in a pair of glasses. The end result is (in my opinion) a game-changer for spatial computing: augmenting the world around you, with AI smarts to make sure you're the smartest person in any room. I've done developer demos before, but this is the first time I've seen the consumer vision for Snap Specs, and it's mighty impressive.

First off, with Snap OS 2.0, I dipped into something called Spatial Tips: an app that uses multimodal AI to understand what it sees and give you advice about it. It's all well and good having a voice give you instructions on how to do an ollie on a skateboard, but it's an entirely different thing seeing the steps in front of you. Not only that, but those steps are clearly labelled on each individual part of the board to let you know where to push up from and where your feet should be. Of course, this works with more than just skateboarding, and in every scenario, object recognition comes with the AI smarts to put a clearly pinned label on the item you're asking about. This is seriously cool.

When it comes to the explosion of on-device language translation, there's a lot of smart stuff happening to make it possible, but the social interaction is a little weird. Either you're looking at your phone for text to appear, or, if you're on audio-only smart glasses like the Ray-Ban Metas, you're waiting for the voice to tell you what's being said. Snap Specs bring the power of AI and AR together again with the Translation Lens, not just rapidly translating and providing subtitles for whoever is talking, but pinning those subtitles to the person who's talking, too. That means multiple speakers could each have their words highlighted, so you don't miss a beat of the conversation.

Oh, and a shoutout to the visual intelligence around written language translation. This Lens is rapid to use -- just pinch and drag over what you want to translate, the Specs take a picture, and you quickly get everything on that document (be it a menu in Mandarin, as in my case) in English.

Then we get to the three elements that really start to make this an enticing all-in-one system for any consumer to pick up next year. First, an updated Spectacles Browser -- page loading has been sped up considerably, and you can pin your browser anywhere you want. That means you could pop it above the stove with recipe information while you're cooking. And as Qi Pan, Snapchat's director of computer vision, confirmed to me, multiple-tab support is in the works, so you could have an additional browser window pinned elsewhere with a YouTube video playing. Perfect for my batch cooking while I'm waiting for food to finish in the oven.

Next is the Gallery Lens. It's a small addition, but a necessary one for seeing what you've captured, and Snap has really embraced the spatial element by letting you turn around and see your entire camera roll surrounding you.

Moving on to the Snapchat Collection. This is easily the biggest introduction to the Snapchat app in a long time; Spotlight content has seen huge amounts of watch time, so putting it directly into the Specs makes a world of sense for flicking through vertical video -- and possibly even creating first-person content in the future.

Finally, I got to try out Synth Riders -- a rhythm game that requires you to duck under barriers and hit orbs on time. Think Beat Saber, but more about pinching than slashing. Beyond being fun, the game opened my eyes to a massive benefit of Snap Specs. A growing category of apps on the Meta Quest 3 and Quest 3S is focused on fitness. These titles can give you a gamified workout at home, but no matter how many times I try them, it's just not comfortable to exercise with a whole VR headset on my face. That changes significantly when all the computation is in something the size of glasses. The fitness opportunity here is huge, whether it's giving you collectibles on a long walk or something more fixed in place like Synth Riders. For the record, I managed a high score of over 7,000, but the battery went flat just as my playthrough was ending. Yes, I'm a little too competitive at times, and I definitely don't want that lost to stretching the stamina too thin!

Ultimately, this is the biggest obstacle between the likes of Snap and making true smart glasses a reality. These are going to be incredibly visible on your face, so they have to look sleek and stylish and feel comfortable enough to wear all day long. Talking to Snap, it's clear this is the challenge: maintaining the same hardware spec list to fuel all these experiences in Lenses while massively condensing the size of everything at the same time.

And in my mind, there's no doubt the biggest challenge is maintaining battery life. Currently, the developer Spectacles provide up to 45 minutes of use on a single charge, and daily demands will require more than that. But after speaking to the team, it's clear they're up to the challenge and have a plan to get there.

As I left the company's office, I had to take a hot minute to just take stock of what I'd witnessed. So far, any demo of Snap's tech has come with the asterisk of being a developer project -- cool and all, but housed in a giant pair of glasses and purely a proof of concept of what's possible when AR and AI come together. Now I've seen what this could really mean for any consumer who buys the upcoming Snap Specs with Snap OS 2.0, and it's a mighty impressive vision of how we could understand the world around us sans smartphone. We're still set for a 2026 launch of Snap Specs, and I can't wait.
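The Spatial Tips flow described above reads like a straightforward pipeline: capture a camera frame, ask a multimodal model for instructional labels tied to image regions, then project each region into the world and pin a 3D label on the object. A hedged TypeScript sketch of that shape follows; every helper in it is a hypothetical stand-in, since Snap hasn't published the APIs behind Spatial Tips.

```typescript
// Hedged sketch of a Spatial-Tips-style pipeline: capture a frame,
// ask a multimodal model for step labels tied to 2D image regions,
// then raycast each region into the world and pin a 3D label there.
// Every helper below is a hypothetical stand-in; Snap has not
// published the Lens APIs behind Spatial Tips.

interface LabeledStep { text: string; cx: number; cy: number; } // normalized [0..1] image coords
interface Vec3 { x: number; y: number; z: number; }

// Stand-in capture: a real device would return an encoded camera frame.
function captureCameraFrame(): Uint8Array {
  return new Uint8Array(0);
}

// Stand-in model call: returns canned steps instead of querying a
// real vision-language model.
async function askMultimodalModel(frame: Uint8Array, prompt: string): Promise<LabeledStep[]> {
  return [
    { text: "Back foot on the tail", cx: 0.8, cy: 0.6 },
    { text: "Front foot over the bolts", cx: 0.4, cy: 0.5 },
  ];
}

// Stand-in depth query: maps an image point to a world-space hit.
function raycastFromPixel(cx: number, cy: number): Vec3 | null {
  return { x: cx - 0.5, y: 0.4, z: -1.0 };
}

function pinLabel(text: string, anchor: Vec3): void {
  console.log(`pin "${text}" at (${anchor.x}, ${anchor.y}, ${anchor.z})`);
}

async function showSpatialTips(question: string): Promise<void> {
  const frame = captureCameraFrame();
  const steps = await askMultimodalModel(frame, question);
  for (const step of steps) {
    // Anchoring in 3D is what keeps each tip stuck to its part of the
    // object as the wearer moves around.
    const anchor = raycastFromPixel(step.cx, step.cy);
    if (anchor) pinLabel(step.text, anchor);
  }
}

void showSpatialTips("How do I ollie on this skateboard?");
```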
[4]
Watch out, Meta - Snap just gave its AR glasses a big upgrade and I was shocked by how good they are
Meta Connect 2025 is barely two days away at the time of writing, but Snap's latest AR glasses update is threatening to make me forget all about Meta before it even has a chance to showcase its new tech.

Snap's current best model of its Spectacles AR glasses is one of the most technically impressive gadgets I've been able to demo. Unfortunately, they're intended as a developer device, meaning they aren't yet available to regular folks - nor are they the most stylish or cheapest tech out there. Thankfully, Snap promises "to introduce lightweight immersive Specs to the public next year," and ahead of that launch I got to demo the upgraded operating system - complete with some new apps and tools - that will bring its AR glasses to life. Boy, was I impressed.

My Snap OS 2.0 demo began with the Spatial Tips app, a feature powered by AI. It starts out much like the Look and Ask tool on Meta's smart glasses, where you ask the glasses to tell you about the things they can see. Thanks to its displays, Snap's AR glasses could in fact label everything of note in my vision with surprising accuracy - including a custom 'modular couch' in the shape of the Snap logo. It takes things a step further by also providing tips on how to use objects when asked - for example, I asked for a trick I could try using a yellow skateboard the office had as decoration (unfortunately, I wasn't able to follow the instructions, as the demo organiser was, perhaps rightly, worried I might injure myself).

We then moved on to the Specs' AR translation tools. The first demo was cool, but a tad on the basic side. I could drag my hand over a menu written in Mandarin to highlight it and then have the glasses provide me with a translation of what it says. I call this basic because, while the app was very easy to use, the translation appeared in a separate window rather than next to the menu or overlaid on it, like with, say, the Google translation tools on my phone. In the real world, this setup would make it difficult to know which translation corresponded to which item on the menu - though I appreciate this app is still a work in progress, so I hope an update will solve my one gripe.

This contrasted with the live conversation translations, which were a dream. As my host spoke in Mandarin, I saw a line of English text automatically appear below their head like real-life subtitles. What was especially neat was that this app can support more than two users at once, and over 40 languages. So you could have a small group of people all talking in their respective native languages, and each person (provided they're wearing Snap's AR glasses) would see translated subtitles in the language they understand. Seeing the demo in action was like seeing into the future - a feeling Snap's AR glasses gave me the last time I demoed them.

Lastly, I got to experience some new games like Synth Riders - an AR rhythm game ported from the VR version - and browse the internet and videos on Snap's Spotlight platform. Nothing here was especially mind-blowing, but these are the sorts of day-to-day tools that will help make consumer AR specs feel like a generally useful tool rather than a hyper-specialized device. I could imagine myself digitally flicking through the news on my AR glasses' browser while eating breakfast each morning.

This demo reaffirmed my belief that AR glasses will likely be the next big thing - maybe even supplanting smartphones as our go-to gadget in the next decade or sooner.

Sure, the hardware is a little goofy-looking right now, but the utility Snap's Spectacles provided me - and the inventive social interactions they facilitate - more than made up for how silly they might make me look. And as the tech improves, they should only get slimmer and more normal-looking. Snap's rivals should be on notice, especially Meta as Connect fast approaches: the AR revolution is coming, and at least one of the players aiming for the top spot is already bringing its A game.
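Both this review and Tom's Guide describe the same pinch-and-drag flow for written text: select a region, snapshot it, run OCR, translate, and show the result in a panel (which is exactly why the output lands in a separate window rather than overlaid on the menu). Here is a minimal sketch of that flow; the capture, OCR, and translation services are hypothetical stand-ins with canned results.

```typescript
// Minimal sketch of the pinch-and-drag translation flow, assuming a
// simple select -> snapshot -> OCR -> translate -> panel pipeline.
// The capture, OCR, and translation services are hypothetical
// stand-ins with canned results.

interface Rect { x: number; y: number; w: number; h: number; } // normalized [0..1]

function captureAndCrop(region: Rect): Uint8Array {
  return new Uint8Array(0); // stand-in for a cropped camera capture
}
async function runOcr(image: Uint8Array): Promise<string> {
  return "宫保鸡丁"; // canned OCR output (a menu item in Mandarin)
}
async function translate(text: string, target: string): Promise<string> {
  return target === "en" ? "Kung pao chicken" : text; // canned translation
}
function showPanel(lines: string[]): void {
  // Renders in a separate window, matching the reviewers' description.
  lines.forEach(line => console.log(line));
}

// Called when the pinch-drag gesture ends with the selected region.
async function onRegionSelected(region: Rect): Promise<void> {
  const crop = captureAndCrop(region);
  const sourceText = await runOcr(crop);
  // Translate line by line so each menu item maps to one output line.
  const lines = await Promise.all(
    sourceText.split("\n").map(line => translate(line, "en"))
  );
  showPanel(lines);
}

void onRegionSelected({ x: 0.2, y: 0.3, w: 0.5, h: 0.2 });
```

Overlaying each translated line back at its source region, the reviewer's one gripe, would mainly require keeping the per-line bounding boxes from the OCR step instead of collapsing everything into one panel.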
Snap unveils Snap OS 2.0 for its upcoming AR glasses, showcasing advanced features like real-time translation, AI-powered spatial tips, and improved browsing capabilities. The update positions Snap as a strong competitor in the evolving smart glasses market.
Snap has announced a major update to its augmented reality (AR) glasses operating system, Snap OS 2.0, setting the stage for the release of its consumer-focused Snap Specs in 2026 [2][4]. This update introduces several new features and improvements that position Snap as a formidable competitor in the rapidly evolving smart glasses market.
One of the most impressive features of Snap OS 2.0 is the Spatial Tips app, which combines multimodal AI with AR capabilities. This app can recognize objects in the user's environment and provide detailed, visually augmented instructions. For example, when asked about skateboarding, the app can display step-by-step instructions with labels pinned to specific parts of a skateboard [3][4].
Snap's Translation Lens takes language translation to a new level by combining AI and AR technologies. The system not only provides real-time translation of spoken words but also pins subtitles to the person speaking, making multilingual conversations more intuitive and natural [1][3]. Additionally, the visual translation feature allows users to quickly translate written text by simply pinching and dragging over the desired area [3].
The Spectacles Browser has been overhauled to provide a more intuitive and efficient browsing experience [2][3]. Key improvements include:

- A new minimalist design with faster page loading and optimized power usage
- A home screen with widgets and bookmarks
- An updated toolbar that lets users type or speak a website URL
- Resizable windows that can be set to a preferred aspect ratio
- WebXR support for accessing AR experiences from WebXR-enabled websites
Users can also enjoy a new Gallery Lens for reviewing captured content and a Spotlight Lens for overlaying favorite content onto their glasses [2].

In a surprising move, Snap has ported the VR fitness game Synth Riders to its AR platform. While the current implementation may feel rough due to hardware limitations, it hints at the potential for fitness applications in future AR glasses [1][3]. This development could position Snap to compete in the growing VR/AR fitness market.
The current developer version of Spectacles features dual Snapdragon processors, four cameras (including two infrared for computer vision), stereo speakers, and a six-microphone array. The waveguide displays offer a 46-degree field of view [3]. However, Snap acknowledges that the current model is bulky and plans to release a lighter, more consumer-friendly version in 2026 [1][2].
Snap's announcement comes just ahead of Meta's expected reveal of new smart glasses at Meta Connect. With companies like Apple, Google, and others also entering the AR glasses space, Snap's early moves and established ecosystem of Lenses could give it a competitive edge [1][2][4].

As the AR glasses market heats up, Snap's latest developments demonstrate the potential for these devices to become powerful, standalone computing platforms that could eventually challenge the dominance of smartphones in our daily lives [3][4].

Summarized by Navi