Curated by THEOUTPOST
On Mon, 30 Sept, 4:02 PM UTC
3 Sources
[1]
Last week, Mark Zuckerberg stood on a stage in California holding what appeared to be a pair of thick black eyeglasses. His baggy T-shirt displayed Latin text that seemed to compare him to Julius Caesar -- aut Zuck aut nihil -- and he offered a bold declaration: These are Orion, "the most advanced glasses the world has ever seen." Those glasses, just a prototype for now, allow users to take video calls, watch movies, and play games in so-called augmented reality, where digital imagery is overlaid on the real world. Demo videos at Meta Connect, the company's annual conference, showed people playing Pong on the glasses, their hands functioning as paddles, as well as using the glasses to project a TV screen onto an otherwise blank wall. "A lot of people have said that this is the craziest technology they've ever seen," Zuckerberg said. And although you will not be able to buy the glasses anytime soon, Meta is hawking much simpler products in the meantime: a new Quest headset and a new round of software updates to the company's smart Ray-Bans, which have cameras and an AI audio assistant on board, but no screen in the lenses. Orion seems like an attempt to fuse those two devices, bringing a fully immersive computerized experience into a technology that people might actually be comfortable putting on their face. And it is not, you may have noticed, the only smart-glasses product to have emerged in recent months. Amazon, Google, Apple, and Snap are all either officially working on some version of the technology or rumored to be doing so. Their implementations are each slightly different, but they point to a single idea: that the future is about integrating computing more seamlessly into everyday life. Smartphones are no longer exciting, and the market for them has been declining for the past few years. The primary new idea there is foldable screens, which effectively allow your phone to turn into a tablet -- though tablet sales have slowed too. The virtual-reality headsets that companies have spent billions developing aren't being widely adopted. These companies are betting big that people want to be able to check the weather without pulling out a smartphone -- and that they are more willing to wear a pair of Ray-Bans with cameras than spend hours in the metaverse. And after years of false starts on the glasses front, they're betting that AI -- despite some high-profile flops -- will be what finally helps them achieve this vision. Tech companies have been working on smart frames for decades. The first real consumer smart glasses started appearing in the late 1980s and '90s, but none broke through. At last, in 2013, Google released its infamous Glass eyewear. A thin metal frame with a camera and tiny screen above one eye, Glass could be used to check emails, take photos, and get directions. They were advanced for their time, but the general public was spooked by the idea of face-cameras constantly surveilling them. In 2015, Google abandoned the idea that Glass might ever be a consumer product, though the frames lived on as an enterprise device until last year. Glass's failure didn't deter other companies from taking a swing. In 2016, Snapchat launched its first generation of Spectacles, glasses that allowed users to capture pictures and videos from cameras mounted above each eye, then post them on their account. In 2019, Amazon jumped in, teasing its Echo Frames -- camera-less smart glasses with Alexa built in -- which went on sale to the public the following year. 
Meta, then called Facebook, launched the first iteration of its collaboration with Ray-Ban in 2021, though the frames didn't catch on. Then there are the virtual-reality headsets, such as Meta's Quest line. Last summer, after Apple announced the Vision Pro, my colleague Ian Bogost deemed this the "age of goggles," pointing out that companies have been spending billions developing immersive technology, even though the exact purpose of these expensive headsets is unclear. In some ways, this glasses moment is something of a retreat: an acknowledgment that people may be less likely to go all in on virtual reality than they are to throw on a pair of sunglasses that happens to be able to record video. These devices are supposed to look and feel more natural, while allowing for ambient-computing features, such as the ability to play music anywhere just by speaking or start a phone call without having to put in headphones.

AI is a big part of this pitch. New advances in large language models are making modern chatbots seem smarter and more conversational, and this technology is already finding its way into the glasses. Both the Meta and Amazon frames have audio assistants built in that can answer questions (How do whales breathe?) and cue up music (play "Teenage Dirtbag"). Meta's Ray-Bans can "look" using their cameras, offering an audio description of whatever is in their field of vision. (In my experience, accuracy can be hit or miss: When I asked the audio assistant to find a book of poetry on my bookshelf, it said there wasn't one, overlooking an anthology with the word poetry in the title, though it did identify my copy of Joseph Rodota's The Watergate when I asked it to find a book about the Washington landmark.)

At Connect, Zuckerberg said that the company plans to keep improving the AI, with a couple of big releases coming in the next few months. These updates will give the glasses the ability to do translation in real time, as well as scan QR codes and phone numbers on flyers in front of you. The AI will also, he said, be able to "remember" such things as where you parked your car. One demo showed a woman rifling through a closet and asking the AI assistant to help her choose an outfit for a theme party.

But whether AI assistants will actually be smart enough to realize all of this is still something of an open question. In general, generative AI struggles to cite its sources and frequently gets things wrong, which may limit smart glasses' overall usefulness. And, though the companies say the technology will only get better and better, that's not entirely certain: The Wall Street Journal recently reported that, when Amazon attempted to infuse Alexa with new large language models, the assistant actually became less reliable for certain tasks.

Products such as Orion, which promise not just AI features but a full, seamless integration of the digital world into physical reality, face even steeper challenges. It's really, really difficult to squish so many capabilities into eyewear that looks semi-normal. You need to be able to fit a battery, a camera, speakers, and processing chips all into a single device. Right now, even some of the most state-of-the-art glasses require you to be tethered to additional hardware to use them. According to The Verge's Alex Heath, the Orion glasses require a wireless "compute puck" that can be no more than about 12 feet away from them -- something Zuckerberg certainly did not mention onstage.
Snap's newest Spectacles, announced earlier this month, don't require any extra hardware -- but they have a battery life of only 45 minutes, and definitely still look big and clunky. The hardware problem has bedeviled generations of smart glasses, and there still isn't a neat fix.

But perhaps the biggest challenge facing this generation of smart glasses is neither hardware nor software. It's philosophical. People are stressed right now about how thoroughly technology has seeped into our everyday interactions. They feel addicted to their phones. These companies are pitching smart glasses as a salve -- proposing that they could, for example, allow you to handle a text message without interrupting quality time with your toddler. "Instead of having to pull out your phone, there will just be a little hologram," Zuckerberg said of Orion during his presentation. "And with a few subtle gestures, you can reply without getting pulled away from the moment."

Yet committing to a world in which devices are worn on our face means committing to a world in which we might always be at least a little distracted. We could use them to quietly read our emails or scroll Instagram at a restaurant without our partner knowing. We could check our messages during a meeting while looking like we're still paying attention. We may not need to check our phones so much, because our phones will effectively be connected to our eyeballs. Smart glasses walk a thin line between helping us be less obsessively on the internet and tethering us even more closely to it.

I spent some time this spring talking with a number of people who worked on early smart glasses. One of them was Babak Parviz, a partner at Madrona, a venture-capital firm, who previously led Google's Glass project. We discussed the history of computers: They used to be bulky things that lived in research settings -- then we got laptops, then smartphones. With Glass, the team aimed to shorten the time needed to retrieve information to seconds. "The question is, how much further do you need to take that? Do you really need to be immersed in information all the time, and have access to much faster information?" Parviz told me he'd changed his mind about what he called "information snacking," or getting fed small bits of information throughout the day. "I think constant interruption of our regular flow by reaching out to information sources doesn't feel very healthy to me."

In my conversations, I asked experts whether they thought smart glasses were inevitable -- and what it would take to unseat the smartphone. Some saw glasses not as a smartphone replacement at all, but as a potential addition. In general, they thought that new hardware would have to give us the ability to do something we can't do today. Right now, companies are hoping that AI will be the thing to unlock this potential. But as with so much of the broader conversation around that technology, it's unclear how much of this hype will actually pan out.

These devices still feel more like sketches of what could be, rather than fully realized products. The Ray-Bans and other such products can be fun and occasionally useful, but they still stumble. And although we might be closer than ever to mainstream AR glasses, they still seem a long way off. Maybe Zuckerberg is right that Orion is the world's most advanced pair of glasses. The question is really whether his big vision for the future is what the rest of us actually want. Glasses could be awesome. They could also be just another distraction.
[2]
'Smart Glasses Will Replace Phones By 2030,' Says Meta Chief Mark Zuckerberg
It's 2030. You wake up and, instead of reaching for your phone, slip on sleek smart glasses. Throughout the day, you interact with a powerful AI assistant that uses the glasses to prep you with useful information, seamlessly merging your physical and digital worlds. Using advanced display technology, it projects holograms directly into your field of vision. Sounds exciting? This is what Meta CEO Mark Zuckerberg predicted five years ago while announcing the AR glasses. If that projection holds, smart glasses will replace phones by 2030 and become the next major computing platform.

At Meta Connect this year, Zuckerberg unveiled a range of new tech, including the $300 Quest 3S VR headset. However, the real highlight was Project Orion, a pair of AR smart glasses. Many companies have attempted to create AR glasses, but the results have often been bulky or tethered by cables. Meta's Project Orion stands out by avoiding these issues. Despite all the high-end technology packed into the frame, these hologram-generating glasses almost resemble regular eyewear, bearing a striking similarity to Meta's Ray-Bans.

The Orion glasses pack in a host of features like eye-tracking, hand-tracking, voice controls, and even a neural interface, although it reads signals from your wrist rather than your brain. The glasses also come with a wireless compute puck that resembles a sleek power bank. While they don't require a laptop or phone to operate, the puck must be within a few feet for them to function, which means you'll likely need to carry it in your pocket.

Several publications tested the advanced hardware at the event. CNBC said the holograms "felt totally normal and very natural" thanks to the high-quality displays. NVIDIA CEO Jensen Huang, renowned for his AI advocacy, got a chance to try on the Orion glasses. "The head tracking is good, the brightness is good, the colour contrast is good, and the field of view is excellent," he said enthusiastically. It comes at a busy time for Meta, which continues to expand its reach in the tech industry.

While Google Glass may have been an expensive failure for tech giant Google, it was a valuable case study for the rest of the tech industry. It taught developers a lot about what people do not want, and those insights were instrumental in shaping the current crop of smart glasses. To make matters worse, Glass also looked like a prop from 'The Matrix', whereas most of today's smart glasses have a much more low-key design. This helps wearers avoid being stigmatised as modern-day "glassholes" trying to film others without their consent (even if it doesn't actually stop this from happening).

Meta envisions that within the next ten years, up to 2 billion people who wear regular glasses will transition to the smart ones, with even those who don't need them medically eventually adopting them. While the smart glasses market is growing, there's still no guarantee that the device will ever become mainstream, let alone replace smartphones as our go-to personal gadget. The industry is still searching for its "killer app" - a feature that would allow people to access everything with just one tap - and this might be the one thing Google got right about smart glasses with Google Glass. Zuckerberg's vision of 'normal-looking glasses with a camera, microphone and great audio that can stream video and capture content without a display' already exists as Ray-Ban Meta, which was jointly developed by Meta and Ray-Ban.
However, he noted that AI was the most important feature for smart glasses and described Ray-Ban Meta, which has access to Meta AI, as a step 'on the way to building a complete holographic pair of glasses'.

As impressive as Orion may appear, the company has admitted that this specific model will never reach the consumer market. The estimated production cost of the glasses is around $10,000 per unit, which is prohibitively expensive for consumer sales. The CEO indicated that substantial work is required to make the glasses commercially viable and affordable for the average consumer. "It may be really difficult to make normal-looking glasses that can do holograms at an affordable price but you can now get normal-looking glasses with a camera, microphone, and good audio that can stream video and capture content, even if they don't have a display," Zuckerberg said.

He also emphasised that the technical hurdles involved in developing AR glasses capable of delivering a seamless experience are immense. These challenges include creating a compact form factor while integrating features like holographic displays and interactive capabilities. Meta's team believed there was less than a 10% chance of successfully achieving their ambitious goals when they began this project nearly a decade ago.

Currently, the glasses are a prototype available only to Meta employees and select partners, not slated for consumer release. Meta envisions that future iterations could potentially be available by 2027, contingent on technological advancements that lower production costs and enhance functionality. For now, the company is focusing on internal development and demonstrating the technology rather than rushing to market with an unready product.
[3]
Meta has launched the world's 'most advanced' glasses. Will they replace smartphones?
Humans are increasingly engaging with wearable technology as it becomes more adaptable and interactive. One of the most intimate forms gaining acceptance is augmented reality (AR) glasses. Last week, Meta debuted a prototype of the most recent version of its AR glasses - Orion. They look like reading glasses and use holographic projection to allow users to see graphics projected through transparent lenses into their field of view.

Meta chief Mark Zuckerberg called Orion "the most advanced glasses the world has ever seen". He said they offer a "glimpse of the future" in which smart glasses will replace smartphones as the main mode of communication. But is this true or just corporate hype? And will AR glasses actually benefit us in new ways?

Old technology, made new

The technology used to develop the Orion glasses is not new. In the 1960s, computer scientist Ivan Sutherland introduced the first augmented reality head-mounted display. Two decades later, Canadian engineer and inventor Stephen Mann developed the first glasses-like prototype. Throughout the 1990s, researchers and technology companies developed the capability of this technology through head-worn displays and wearable computing devices. Like many technological developments, these were often initially focused on military and industry applications.

In 2013, after smartphone technology emerged, Google entered the AR glasses market. But consumers were disinterested, citing concerns about privacy, high cost, limited functionality and a lack of a clear purpose. This did not discourage other companies - such as Microsoft, Apple and Meta - from developing similar technologies.

Looking inside

Meta cites a range of reasons for why Orion is the world's most advanced pair of glasses, such as its miniaturised technology with large fields of view and holographic displays. It said these displays provide "compelling AR experiences, creating new human-computer interaction paradigms [...] one of the most difficult challenges our industry has ever faced".

Orion also has an inbuilt smart assistant (Meta AI) to help with tasks through voice commands, eye and hand tracking, and a wristband for swiping, clicking and scrolling. With these features, it is not difficult to agree that AR glasses are becoming more user-friendly for mass consumption. But gaining widespread consumer acceptance will be challenging.

A set of challenges

Meta will have to address four types of challenges: ease of wearing, using and integrating AR glasses with other glasses; physiological aspects such as the heat the glasses generate, comfort and potential vertigo; operational factors such as battery life, data security and display quality; and psychological factors such as social acceptance, trust in privacy and accessibility.

These factors are not unlike what we saw in the 2000s when smartphones gained acceptance. Just like then, there are early adopters who will see more benefits than risks in adopting AR glasses, creating a niche market that will gradually expand. Similar to what Apple did with the iPhone, Meta will have to build a digital platform and ecosystem around Orion. This will allow for broader applications in education (for example, virtual classrooms), remote work and enhanced collaboration tools. Already, Orion's holographic display allows users to overlay digital content on the real world, and because it is hands-free, communication will be more natural.
Creative destruction

Smart glasses are already being used in many industrial settings, such as logistics and healthcare. Meta plans to launch Orion for the general public in 2027. By that time, AI will likely have advanced to the point where virtual assistants will be able to see what we see, and the physical, virtual and artificial will co-exist. At this point, it is easy to see that the need for bulky smartphones may diminish and that, through creative destruction, one industry may replace another. This is supported by research indicating the virtual and augmented reality headset industry will be worth US$370 billion by 2034.

The remaining question is whether this will actually benefit us. There is already much debate about the effect of smartphone technology on productivity and wellbeing. Some argue that it has benefited us, mainly through increased connectivity, access to information, and productivity applications. But others say it has just created more work, distractions and mental fatigue.

If Meta has its way, AR glasses will solve this by enhancing productivity. Consulting firm Deloitte agrees, saying the technology will provide hands-free access to data, faster communication and collaboration through data-sharing. It also claims smart glasses will reduce human errors, enable data visualisation, and monitor the wearer's health and wellbeing. This will ensure a quality experience, social acceptance, and seamless integration with physical processes. But whether or not that all comes true will depend on how well companies such as Meta address the many challenges associated with AR glasses.
Meta unveils advanced smart glasses, sparking debate on the future of smartphones. CEO Mark Zuckerberg predicts smart glasses will replace phones by 2030, as the technology rapidly evolves.
Meta, the parent company of Facebook, has unveiled what it claims to be the world's most advanced smart glasses, igniting discussions about the future of personal computing devices [1]. The new product, part of Meta's ambitious foray into augmented reality (AR), represents a significant leap in wearable technology and could potentially reshape how we interact with digital information.

The smart glasses, developed under the codename "Orion," boast an impressive array of features. They incorporate advanced AI technology, allowing users to perform tasks such as real-time language translation and object recognition, and even to control smart home devices through voice commands or subtle hand gestures [2]. The glasses also feature high-resolution displays that can overlay digital information onto the user's field of vision, effectively blending the physical and digital worlds.

Meta's CEO, Mark Zuckerberg, has made a striking prediction about the future of personal computing. He believes that smart glasses will replace smartphones by 2030, marking a significant shift in how we access and interact with digital information [3]. This forecast underscores Meta's commitment to developing AR technology and positions the company at the forefront of what could be a revolutionary change in consumer electronics.

The introduction of these advanced smart glasses raises questions about their potential impact on various aspects of daily life. While they offer unprecedented convenience and connectivity, concerns about privacy, data security, and the social implications of always-on AR technology have been voiced by experts and consumers alike [3].

Meta's unveiling of these smart glasses intensifies the competition in the AR and wearable technology market. Other tech giants, including Apple and Google, are also investing heavily in similar technologies, setting the stage for a fierce battle for market dominance in the coming years [1].

As Meta continues to refine and expand the capabilities of its smart glasses, the technology is expected to evolve rapidly. The company is likely to face both technical and societal challenges as it works towards realizing Zuckerberg's vision of smart glasses replacing smartphones. The success of this endeavor will depend not only on technological advancements but also on user adoption and the ability to address concerns surrounding privacy and social norms [3].
References
[1] The Next Big Thing Is Still ... Smart Glasses
[2] Analytics India Magazine | 'Smart Glasses Will Replace Phones By 2030,' Says Meta Chief Mark Zuckerberg
[3] Meta has launched the world's 'most advanced' glasses. Will they replace smartphones?