Curated by THEOUTPOST
On Thu, 7 Nov, 12:02 AM UTC
4 Sources
[1]
I went shopping wearing the AI-powered Meta Ray-Bans -- here's what I love and what annoyed me
Let's just say I like running errands a little more now. I've used Meta AI a lot, but I had no idea how much it could transform a typical Target run and a grocery shopping errand. Wearing the Meta Ray-Ban glasses, I was eager to test their AI functionality in a real-life scenario. These smart glasses combine iconic style with Meta's cutting-edge features, including AI technology such as voice-command texting, music streaming, video capture and real-time data. At first glance, they look like any regular pair of glasses or sunglasses. Well, on me they look like an oversized pair of glasses -- right now the glasses only come in standard size or large, so if you have a small face like me, it might be a little awkward at first. I opted for the clear version so I could use them indoors. The glasses quickly proved their worth at my local grocery store. As a parent juggling three kids, I'm constantly managing a mental list of what to buy. With the glasses on, I could easily take hands-free notes via voice commands. I could also get texts from my husband such as, "Don't forget trash bags," without ever needing to pull out my phone. The result? My focus stayed on finding everything I needed instead of fumbling with a device. I started off my weekend errands with a trip to Target. Target is notorious for tempting me into impulse buys, so instead of adding products to my cart on a whim, I took quick photos of them using the built-in camera. This feature allowed me to visually track what caught my attention while also asking Meta for alternative products in real time. I really liked talking to Meta and getting immediate answers. For example, I asked Meta if freezer Ziploc bags were really necessary or if regular Ziploc bags would suffice. Along those lines, whenever I needed more information about a particular product, I asked Meta and got a response without pulling out my phone.
At one point, I found myself looking at an apple/pear hybrid fruit and asked Meta what I was looking at and then how the fruit could be used. Additionally, because I connected my Meta Ray-Bans to Meta platforms like Messenger and WhatsApp, I was able to make a call, and my husband could see what I was seeing and add his thoughts. It was great to get his opinion without pausing my shopping flow. The video quality is surprisingly clear and it felt natural to chat without holding a phone. Did I get a few strange looks? Yes, a few. Yet, these days with so many individuals chatting while wearing wireless earbuds, it really didn't feel much different. For fun, I decided to test the glasses' livestreaming feature. While in Target's seasonal decor aisle, I shared a live tour with my social media followers. The Meta Ray-Bans streamed the shopping experience directly to my Instagram, letting my followers see what I was experiencing in real time. Of course, I could have simply made a video and posted it later, because everything from photos and videos to conversations with Meta AI is saved within the app. The hands-free nature of this feature allowed me to engage with my audience while still browsing effortlessly. I'm not typically one for livestreaming, but because it was so easy with the Meta Ray-Bans, I might consider it more often. One unexpected perk of the Meta Ray-Bans was how they made errands more enjoyable. As I strolled through the aisles, I listened to my favorite playlist using the built-in speakers. The sound quality was immersive without being disruptive to others around me. And when I remembered I needed to confirm a schedule with a friend, a quick "Hey Meta, call Sarah" handled the task in seconds. While the Meta Ray-Ban glasses were impressive overall, there are areas for improvement. Battery life is one concern; after about 2 hours of use, the glasses needed recharging.
The battery life is rated at about 4 hours, but with heavy use such as livestreaming, listening to music, and several video and image captures, I felt as though the battery faded faster. This might not be ideal for those who plan to use them for extended outings. Additionally, some features, like the voice-to-text functionality, struggled slightly in louder environments, such as the busy Target checkout line. At times, it took several minutes for Meta to acknowledge me. Something else to keep in mind is that the Meta Ray-Bans require Wi-Fi. Luckily, my local grocery store and Target have guest Wi-Fi. Otherwise, I would have used the hotspot on my iPhone to give them the Wi-Fi they need to be fully useful. Despite these minor drawbacks, the Meta Ray-Bans proved themselves to be a valuable companion for running errands. They seamlessly combine style with practicality, allowing me to shop, capture images and stay connected -- all without ever reaching for my phone. Shopping with the Meta Ray-Bans turned an ordinary errand run into a more enjoyable and productive experience. These smart glasses elevate routine tasks with innovative convenience. While they're not perfect, their potential to streamline and enhance everyday life makes them a worthy investment for anyone looking to boost their productivity.
[2]
I'm convinced Meta's Ray-Ban smart glasses are the most underrated gadget of 2024
Back in my day, smart glasses turned you into a social pariah. In 2013, Google's ill-fated smart glasses, Google Glass, were released unto the world, inadvertently entering the term "Glasshole" into our collective lexicon. Google Glass's ability to record more covertly than a phone or other handheld device wound up labeling them a blatant invasion of personal privacy and, by proxy, branded anyone wearing them an unethical voyeur. Just two years after its release, Google said it would stop producing Google Glass. However, a decade has passed since then, and naturally, public perception and tolerance for potential incursions on privacy have shifted. Smart glasses -- once a third rail of a form factor -- now represent a bright new future, pursued by even Apple, one of the most conservative forces in consumer tech, via its reportedly underway "Atlas" study. So, all these years later, it turns out Google was right -- or more accurately, Meta was right for them. When I got my hands on a pair of Meta's Ray-Ban smart glasses for the first time a couple of months ago, I didn't know what to expect. I wasn't opposed to a pair of camera-clad glasses, but I also wasn't exactly itching for a pair either. I mean, how smart can a pair of smart glasses without a display really be? With them in hand, however, I decided to see what Meta has been pitching, and to my surprise, I've barely left the house without them since. Meta Ray-Ban might not be the explosive future being teased by bleeding-edge prototypes like the Orion glasses -- Meta's lightweight, experimental pair of true AR glasses that can superimpose the digital world onto the world around you -- but they're objectively a good start. Let me be clear: There are lots of things that Meta's Ray-Ban glasses don't do. They don't have a screen, so you can't use them to browse the web, play games, or doom-scroll on social media. The glasses can, however, pair with the most important device in everyone's arsenal: their smartphone. 
The most surprising and exciting utility that I've found in Meta's Ray-Ban glasses is as a Bluetooth speaker. While the glasses don't fully mesh with my iPhone (Apple isn't known for its inclusivity with third-party products), they do play nice with apps like Spotify. A Spotify integration, Meta's voice assistant, and surprisingly good audio quality all combine to make a more-than-pleasant listening experience. Arguably, the best part of that listening experience is that you can listen to your audio of choice while also listening to your surroundings, much like open-air earbuds made by Bose or Nothing. Trust me when I say that you'll feel the benefits of that one-foot-in, one-foot-out experience when you hop on a bike, for example, and can actually hear traffic and pedestrians around you. Add navigation to that equation and the deal sweetens even further. And their usefulness as fashionable speakers isn't restricted to listening to music. In my couple of months of experience, taking calls on them is arguably the best use. There's something that just feels incredibly natural about using Meta's Ray-Ban glasses for audio calls -- no holding a phone and no plugging up your ears with buds. Every caller I surveyed also reported that my audio was crisp and clear, which is more than I can say for other open-air earbuds that I've tested, which can be susceptible to ambient audio bleed from your surrounding environment. And if you're rolling your eyes while reading this, wondering why anyone in their right mind would pay $300 for glasses with speakers, there's still more. While audio quality was a big surprise for me, the Meta Ray-Ban's headlining feature is definitely the camera on the glasses frames. Again, I wasn't dying for a pair of glasses that could take pictures, but once I had the option to record video and take pictures from a device that was already on my face, it felt like a game-changer.
I'm not a massive fan of whipping out my phone and taking pictures, so having the option to press a button on the glasses or tell Meta's voice assistant to start recording felt less invasive and made me more likely to capture a moment. In addition, it helped me take some videos that I might not otherwise be able to capture -- riding a bike, for example, isn't usually an ideal time to whip your phone out. But what excites me most about Meta's glasses is not just what they can do now; it's what the future has in store. Being able to take pictures and videos or play music aren't groundbreaking features in and of themselves, but when packaged into a form factor as appealing as a pair of sunglasses, they feel genuinely exciting. There are suddenly quite a few tech companies trying to smash the status quo of smartphones -- Humane with its Ai Pin or Rabbit with its R1 -- but arguably none do it as effectively as a pair of Meta Ray-Bans. Humane not only overpromised and underdelivered with its Ai Pin, but it also arguably miscalculated the appetite for a new wearable form factor. And sure, Meta's Ray-Ban glasses aren't a standalone computer yet, but they're doing a pretty damn good job of mimicking one while the technology gets there. And the future looks promising. In an update this October, Meta started rolling out Meta AI features that are meant to expand the glasses' capabilities even further, giving them computer vision that can identify objects in your line of sight or translate text into different languages. My experience of those features hasn't been particularly compelling yet -- sometimes Meta AI just seems to be confused by commands and sometimes it's just blatantly misleading. If there's one thing I've learned about AI, however, it's that you can't rule much out. There's a chance that Meta AI never advances beyond its current hit-and-miss nature, but there's an equally big chance that it lays the groundwork for the next-gen smart glasses we want.
In any case, it's time to stop sleeping on smart glasses because if you're anything like me, you might just see something you like.
[3]
The Ray-Ban Meta Glasses Work Great and Don't Look Dorky
If this were 1987, I'd create a syndicated adventure show around a pair of Ray-Ban Meta Glasses. Here's my pitch for CodeName: SPEX: "Sgt. Steve Johnson, accused of a crime he did not commit, liberates a pair of super-intelligent Wayfarers from a secret government lab. Steve and SPEX (Surveillance, Proximity, Enhancement, eXtraction) roam the country using SPEX's powers to solve mysteries, all while keeping one step ahead of the agents pursuing them." All this to say that Meta's smart glasses are 1980s-syndicated-TV-show, living-in-the-future kind of cool. Not just cool for their impressive technology, but cool because they're actually useful; maybe not for outsmarting government agents, but for solving everyday mysteries, like "where did I park my car?" For the three people who haven't seen Meta's ubiquitous advertising campaign: Ray-Ban Meta Glasses are sunglasses/eyeglasses with a built-in camera, speakers, and AI that can be controlled with your voice and simple gestures. They do not have a display screen, though, so you'll need to look elsewhere if that's your bag. Meta is working on true AR smart glasses with a built-in display (Orion), but that's likely far off. The design of the Ray-Ban Meta glasses may prove to be the "killer feature" that elevates them above the competition. With glasses, looks are important -- you're wearing them on your face, after all -- and unlike the infamous Google Glass of years ago or other brands of smart glasses on the market, Ray-Ban Meta Glasses are stylish enough that I'd wear them even if they didn't have built-in technology. They come in three time-tested Ray-Ban frame shapes -- Skyler, Wayfarer, and Headliner -- and offer multiple colors and lens combinations, including the option of prescription lenses. Ray-Ban Metas weigh 49 grams (10 more than my regular specs) and the built-in camera is unobtrusive, so you can wear them all day and not look like a dork (until you say "Hey Meta, what's the score of the Eagles game?"
to yourself on a crowded bus.) There has been a lot of talk in techie circles lately about the possibilities of wearable AI assistants like Humane's AI Pin or the Rabbit R1, but early reviews have not been positive about either. The idea of replacing your phone with a phone-sized gadget (but only for some tasks, so you still need to carry your phone) just isn't appealing to most. But cramming AI into your eyeglasses means there's no extra gadget to take up pocket space, and, because it's voice activated, it can be operated hands-free. The set-up and pairing with the companion app, Meta View, was uneventful. A lot of thought seems to have gone into making the user experience as easy as possible. After a brief tutorial, you're on your own, but if you forget the gesture controls or something, you can ask your glasses to explain them to you again. In basic terms, Ray-Ban Meta's AI assistant can see what you're seeing, translate text, and answer questions. If you're looking at a cool flower, you can say, "Hey, Meta, what kind of flower is that?" Or you can ask, "Hey, Meta, what am I looking at?" And it will describe your view with scary accuracy. Meta's AI can translate signs and other text into multiple languages, tell you what the breakfast hours are of the McDonald's you're looking at, tell you whether it gets good reviews, and give you a suggestion for what to order. (Meta recommends the Egg McMuffin.) You can ask it general questions too, like "When does the new season of Severance premiere?" or "What's the address of Circus Liquor in North Hollywood?" It can even tell jokes -- not funny jokes necessarily, but things that are technically jokes. You can use it to remember things for you, too. Tell it "remember that I have a doctor's appointment on the 12th" or "remember that my car is parked in the orange section in space 435," then later have it recall the information. 
As cool as it would be to say, "Hey, Meta, book me a room at the MGM Grand Hotel for this Saturday," it's not there yet. Complex tasks that would involve potentially using other apps on your phone aren't possible. It also can't give you turn-by-turn directions, identify the song you're listening to, or remember the name of the person you're looking at. Also: It only responds to "Hey Meta," not "Hey SPEX" as I'd prefer. For influencers and other perpetually-online folks, the Ray-Ban Meta's main selling point is likely its ability to capture images and video, then instantly upload them to Instagram or Facebook with a word. You can also livestream, but only to Instagram and Facebook. A click of the button on the glasses arm, or saying "Hey, Meta, take a picture," will take a snapshot of what you're looking at, so you can capture a still or a video while you're riding a bike or driving. The resolution of the Ray-Ban's photos doesn't equal a modern smartphone's, but a 12 MP camera that takes 3024x4032 still images and 1080p video isn't potato-quality, either. It does a fairly nice job with lower-light situations, too. Speaking of the video: I was impressed with the Ray-Ban Meta's image stabilization and the wide field of view, but bummed that it only shoots in one vertical aspect ratio: perfect for TikTok but bad for a feature film. Because there's no viewfinder, it's difficult to frame shots, so it's best used for casual, on-the-fly images instead of careful compositions, and you'll probably need to crop everything later for best results. Conversations with your eyeglasses are cool and all, but if you want to interact with other humans, you can use Ray-Bans to send texts, make and answer voice calls, and make video calls.
You can switch between your glasses-camera and your phone's camera in a video call on WhatsApp and Messenger, so if you need to show someone something, your pal can see the world through your eyes. (It won't work on FaceTime or other non-Meta platforms.) Kind of creepy, but kind of cool. This all worked exactly as expected, with little hassle -- all I really want out of tech gadgets. Along with taking snapshots and telling jokes, the Ray-Ban Meta glasses pair directly with Spotify, Apple Music, Amazon Music, and Calm through a connected device, and can be used as a Bluetooth speaker to play whatever you like. With a command of "play music" you can start the tunes, and then skip ahead with a tap on the glasses or a "skip song" command. The volume can be controlled the same way. Like the video quality, the audio is fine, but not near the level of a decent set of headphones or earbuds. The highs and mids are clear; the bass is weak, but it's adequate overall. The glasses boast a battery life of "up to four hours," but this varies based on usage. While it might seem short, especially if this is your everyday wear, the Ray-Ban Meta case contains additional battery power, allowing for eight more charges on the go. Any discussion of what something is worth is subjective, but $329 for the base Wayfarer model is less than I paid for my last set of frames, and they don't ever answer me when I talk to them. For comparison, the cost of the cheapest Humane AI pin is $499 and requires a monthly subscription, while the Rabbit R1 runs $199. With their retro design, practical AI capabilities, and hands-free operation, Ray-Ban Metas are the kind of glasses Q would have given to James Bond. While there are limitations, like the lack of a display and some complex task constraints, overall, Ray-Ban Meta glasses are an "I didn't know I always needed this" gadget that makes many things I do anyway, like taking pictures and sending texts, easier and cooler.
[4]
Meta Orion AR glasses hands-on: The first AR glasses I actually want to wear
Meta's AR glasses not only have a good fit, but they function well, too. Meta's Orion AR glasses may still be in the prototype stage, but they've already managed to pull off something few other mixed reality headgear devices have been able to do. This is the first time in a long time that I've slipped on this kind of device and didn't start the mental countdown on how long it would be before I could take it off. Spatial computing headsets may be the future, but if so, it's a future full of devices that I do not like to wear. As a rule I find headsets uncomfortable -- even the few that aren't too heavy on my head make me sweat around the strap on the back and the viewfinder up front. I also don't care for the sensation of feeling cut off from the world around me, even on visors with pass-through capabilities. AR and VR glasses certainly don't feel as heavy, but they can still feel hot the longer I wear them, and they tend to slide down my nose at inopportune times. It's just not a fun experience no matter the device. But the Meta Orion glasses didn't pose these problems at all during a recent hands-on demo I experienced at Meta headquarters in Menlo Park, California. The prototype I wore felt fairly lightweight and certainly didn't heat up as my 20-minute demo went on. I could always see and hear the world around me, without my peripheral vision getting cut off. Even better, the glasses didn't slide down my face, keeping the AR visuals right in front of me at all times. That's encouraging since Meta would be the first to tell you that the version of the Orion glasses that I wore isn't quite ready for prime time. The glasses themselves are still pretty thick -- not Buddy Holly-thick like the Snap Spectacles AR glasses I tried out last month, but still bulky enough to make people think you'd been cast in a "Revenge of the Nerds" reboot if you wore them in public. Among other improvements, Meta wants to slim down the form factor before its AR glasses launch commercially.
"For all of this to work well, it really has to be socially acceptable," said Ming Hua, Meta's vice president of wearable devices. "You feel comfortable wearing the glasses all day long, with your friends and also when you're going out and about." That's a ways down the road, though not as far off as you might think. (Meta has merely said that a shipping version will be ready in "the near future.") And while the company works on perfecting the design of the glasses, improving the display and bringing the cost down to what you'd spend on a high-end smartphone, Meta can at least take comfort in knowing that it's already accomplished a much trickier task -- it's actually come up with some compelling use cases for AR glasses. The world got its first look at the Orion AR glasses during an introduction at September's Meta Connect conference. What we saw at that time was the culmination of five years of work in which a series of ridiculously oversized prototypes gradually shrunk down into the current Orion setup -- a pair of glasses that connects wirelessly to a small puck and a wristband that's used to help navigate through the AR interface. (More on those controls in a bit.) Instead of glass, Meta uses silicon carbide for Orion's lenses. The lightweight material reduces optical artifacts and also provides a high refractive index, which Meta says is necessary for a wide field of view. The Orion glasses boast around a 70-degree field of view, which felt a lot less cramped than other mixed reality eyeglasses I've used, where images often get cut off. For instance, when I had a video chat via the Messenger app on my Orion glasses, I could see a full-screen view of the person I was conversing with, as easily as if the chat were taking place on a phone or computer screen. The big difference is that with the glasses, the conversation took place right in front of me, making it feel more immersive than your typical video chat on a flat display.
There's another nice effect to the silicon carbide lenses. While people looking at me might see some bluish-purple streaking on the lenses from the AR images appearing before me, they can still see my eyes, instead of an opaque screen -- or worse, a digital recreation of my eyeballs. It should allow more natural interactions with anyone sporting a pair of Orion glasses. The glasses' frames are made out of magnesium -- a material that's rigid as well as lightweight. Hua tells me the rigidity is important to prevent any misalignment between the two display engines on the glasses. Magnesium also helps with heat dissipation, which is one of the things that helps the glasses feel so comfortable on my head even after prolonged periods. And that's important because there are a lot of components hidden in the frames of those glasses. Besides uLED projectors -- they're very small and power efficient -- you've got multiple custom chips and seven cameras embedded in the frame. "We roughly need to reduce the power consumption for each of those operations to, like, a tenth of what's in the form," Hua said. The processors on board the glasses are handling things like simultaneous localization and mapping, eye tracking, hand tracking and AR world-locking graphics algorithms, while the elongated puck uses dual processors to take care of apps. The puck also manages low-latency graphics rendering and AI tasks. The glasses and puck connect wirelessly, and they don't need to be right next to each other. Meta says you can slip the puck into a backpack as you're using the glasses without having to worry about a loss of connectivity. During my demos, I never needed to carry around the puck as I was using the glasses. At this stage the puck has enough battery power to get through a day, while the glasses are good for three to four hours of use, depending on what kind of activities you're engaging in.
There's no input device for Orion other than the gaze of your eyes and the fingers on your hand, though that wristband I mentioned earlier is there to help with controls. It's an electromyography, or EMG, wristband, and it senses the electrical signals of your muscle movements, beaming that gesture to the glasses. "When you're making gestures, your brain is sending electric signals to the hand," Hua said. "So we use sensors... to capture the voltage change when you're making gestures." The idea behind the wristband is that you can keep your hand at your side, making subtle gestures to control the glasses instead of waving your hands in your field of view and attracting the stares of curious onlookers. Did that stop me from raising my hand into my field of view and making those gestures anyhow? It did not, but perhaps with more practice I'd get used to keeping my hand out of sight while working the controls. The wristband fit snugly but comfortably on my arm, and I really took no notice of it while using the Orion glasses. That's significant since I hate having anything strapped to my wrist -- I won't even wear a smartwatch for this reason -- so the fact that I could don Meta's EMG band without a whimper of complaint implies that the company's done a pretty good job at making it feel light and natural. The gesture controls are pretty natural, too. You pinch with your index finger and thumb to select things, with your eyes acting as a sort of cursor, as the glasses detect what button you're looking at. A middle finger/thumb pinch takes you to the app launcher, and repeating that gesture hides the same control. Make a fist and flick your thumb forward and back when you want to scroll through something like an Instagram Reel. There was a brief tutorial at the start of my demo session to acquaint me with those controls, but the fact that Meta has kept things so simple makes it easy to retain what it is you're supposed to be doing to find your way around Orion's menus.
It's ideal that the controls require nothing more than hand gestures and a steady gaze, as the big appeal of the Orion glasses is the ability to use them hands-free. That's what struck me during a cooking demo meant to showcase the image recognition features of the glasses by having Orion identify different ingredients and whip up something incorporating those same ingredients. That's all well and good, but as someone who does a lot of cooking and has to refer back to recipes from time to time, I appreciate having the instructions floating in front of my vision while I use my hands to scroll forward and back. It can be a problem if you're spatchcocking a chicken, for example, and you then have to touch your iPad screen to advance to the next step. With Orion, you won't have to wash off those chicken-covered hands -- just flick your thumb forward, and keep cooking. If there's something I dislike about headsets and glasses almost as much as their comfort level -- or lack of same -- it's how they cut me off from the rest of the world. Just as the Orion glasses proved to be surprisingly comfortable, they also raise the possibility of a more collaborative experience with AR. One of Meta's demos allowed me to play Pong with another Orion-wearing participant. We stood a few feet apart, sending a virtual ball ricocheting back and forth by moving our paddles up and down or left and right. It was certainly a fun way to pass the time, but it also illustrates how those augmented images appearing in front of you don't have to be for just you alone. To be fair, the Snap Spectacles also featured a collaborative demo involving finger painting when I tried out those AR glasses prior to my Orion test drive. But I don't think it's unfair to Snap to say that the fifth generation of its smart glasses isn't as far along, graphics-wise, as Orion is at this point.
And the wider field of view of the Orion glasses makes collaboration and cooperation a little bit easier. This fits in with Meta's overall hope for Orion, as it wants its AR glasses to be the successor to the smartphone as the device we use to interact with the world around us. "We're hoping to make it so that with glasses, a lot of what you're doing today with your phone, like checking messages, notifications, making a phone call, can be more seamless and hands free," Hua said. Certainly, that makes sense on some levels. Use a phone, and your gaze is locked on a screen, limiting your ability to be a part of what's happening around you. AR glasses let things unfold in front of you -- and Meta argues that its approach would let you still see your surroundings and remain a part of conversations and interactions with the world around you. But not every demo I saw made me ready to trade in my phone for a pair of souped-up specs. To show off Orion's multitasking capabilities, Meta ran a demo where I was watching Instagram Reels when a message came in. I switched over to the Messages app and fielded a video call, with all three panels appearing relatively clearly in front of me. The demo was supposed to showcase not only the wide field of view on the Orion glasses, but also how you can use the glasses to multitask. To me, however, it felt a bit overwhelming. I get the same feeling when Apple shows off all the floating workspaces that can hover around your head when you're wearing a Vision Pro headset. Maybe some people find that convenient, but to me it's just a reminder of my ever-expanding to-do list, only floating directly in my face. Less of that, please. It'll be some time before Orion glasses are ready to be worn out in the field by civilians like you and me. Meta is showing off the glasses at this point so that the company's devices team can get feedback from both Meta employees and external partners on what features to develop and which functions to leave on the cutting room floor.
I'd also wager that app makers are getting a chance to build AR versions of their apps optimized for Orion so that there will be plenty of options ready to run once the glasses do hit the market. I don't know how long it will take Meta to improve the Orion displays and shrink down the form factor of the current version, but judging by the progress that the company has made evolving from prototype to prototype, it may not be all that long. As recently as 2019, Meta's stab at holographic AR glasses featured a backpack and a headset that looked like you were about to perform some heavy welding. Meta's desire to get the glasses to the same price level as a high-end phone may be a tougher roadblock. Assuming we're talking about conventional phones, that would be in the $1,199 to $1,299 range of the iPhone 16 Pro Max and Galaxy S24 Ultra; your top foldable phones are in the neighborhood of $1,799 to $1,899. I haven't priced components for smart glasses lately, but given the kind of tech Meta is packing into the Orion specs, I imagine it's going to be a challenge getting to that range. But -- and I never thought I'd be saying this about any kind of AR product -- I hope Meta gets there. The Orion glasses in their current form hold a lot of promise -- not just for fit, but for functionality as well.
Meta's Ray-Ban smart glasses combine AI capabilities with stylish design, offering features like hands-free photography, AI assistance, and audio playback. While current models have limitations, future versions promise more advanced AR functionality.
Meta's collaboration with Ray-Ban has produced a line of smart glasses that are making waves in the wearable technology market. These AI-powered glasses combine iconic style with cutting-edge features, offering a glimpse into the future of everyday tech 1.
The Meta Ray-Ban glasses come in three classic frame shapes -- Skyler, Wayfarer, and Headliner -- with multiple color and lens combinations 3. Weighing just 49 grams, they're only slightly heavier than regular glasses, making them comfortable for all-day wear. The built-in camera is discreet, allowing users to blend in seamlessly 3.
One of the standout features is the integration of Meta's AI assistant. Users can activate it with a simple "Hey Meta" command, allowing for hands-free operation. The AI can identify objects, translate text, answer questions, and even remember important information for later recall 3.
The glasses feature a 12 MP camera capable of taking 3024x4032 still images and 1080p video 3. Users can capture moments hands-free, either by voice command or by pressing a button on the frame 1. The built-in speakers provide surprisingly good audio quality for music playback and calls 2.
The glasses pair with smartphones and integrate with apps like Spotify, Instagram, and Facebook. Users can easily livestream or share photos and videos directly to social media platforms 1 2.
While impressive, the current model has some limitations. Battery life is around 4 hours with heavy use, and the glasses require a Wi-Fi or hotspot connection for full functionality 1. They also lack a display screen, which sets them apart from true AR glasses.
However, Meta is already working on more advanced versions. The prototype "Orion" AR glasses promise a wider field of view, silicon carbide lenses for better optics, and more powerful onboard processing 4. These future models aim to offer a more immersive AR experience while maintaining comfort and style.
Users report that the glasses make everyday tasks more enjoyable and productive. From hands-free shopping assistance to easy note-taking and communication, the Meta Ray-Ban glasses are proving to be a valuable companion for many 1 2.
As Meta continues to refine and expand the capabilities of these smart glasses, they may well represent the future of wearable technology, potentially changing how we interact with digital information in our daily lives.