Curated by THEOUTPOST
On Fri, 13 Dec, 12:03 AM UTC
43 Sources
[1]
Google's New AI Glasses - Android XR powered by Gemini 2.0
Google, in collaboration with Samsung and Qualcomm, has unveiled Android XR, a new platform that merges artificial intelligence (AI) with extended reality (XR) to transform wearable devices like glasses and headsets. Central to this innovation is the Gemini 2.0 AI model, which redefines how you interact with technology. By combining innovative hardware and software, Android XR provides an open platform for developers, manufacturers, and users to create applications that seamlessly integrate into everyday life, enhancing productivity, communication, and convenience.

Imagine a world where your glasses do more than just help you see -- they guide you through unfamiliar streets, translate foreign languages in real time, or even teach you how to cook a new recipe, all without lifting a finger. Whether it's glasses or headsets, these devices promise to make technology more intuitive, accessible, and, most importantly, useful. At the heart of this breakthrough is Gemini 2.0, Google's advanced AI model, which powers natural, real-time interactions tailored to your needs. But this isn't just about flashy tech -- it's about solving real problems and enhancing everyday experiences. From navigating a busy city to managing tasks hands-free, Android XR is designed to simplify life in ways that feel effortless. In this article, we'll explore how this platform works, the exciting possibilities it unlocks, and what it means for the future of wearable technology. Get ready to see the world -- and your tech -- in a whole new way.

Android XR establishes a robust foundation for the next generation of wearable technology. Using Google's existing augmented reality (AR) tools, including ARCore, Lens, and geospatial APIs, the platform enables precise spatial computing. This capability allows devices to interpret and interact with the physical world in real time, opening up new possibilities for immersive and practical applications. The platform is designed to be open and collaborative, encouraging developers and device manufacturers to contribute to a diverse ecosystem of applications. By fostering innovation, Android XR aims to unlock new opportunities for both personal and professional use. Whether it's enhancing workplace efficiency or simplifying daily tasks, this platform is poised to redefine how technology integrates into your life.

At the core of Android XR is the Gemini 2.0 AI model, a sophisticated system capable of processing complex inputs and delivering intuitive, real-time interactions. This AI model is designed to adapt to your needs, ensuring a seamless and responsive experience across various scenarios. For instance, imagine navigating a bustling city with turn-by-turn directions displayed directly in your field of view. The AI not only guides you but also adapts to your surroundings, providing contextual assistance. Similarly, when cooking a new recipe, Gemini 2.0 can offer step-by-step instructions, adjusting to your pace and helping you stay on track. This level of real-time support simplifies complex tasks, making them more efficient and manageable.

Android XR supports two primary categories of wearable devices, glasses and headsets, each tailored to specific use cases and user preferences. For example, XR glasses can enhance your travel experience by offering real-time language translation and navigation assistance.
Whether you're exploring a foreign city or attending a business meeting, these devices ensure you stay connected and informed. The development of Android XR is the result of a strategic partnership between Google, Samsung, and Qualcomm, each contributing its unique expertise. This collaboration ensures that Android XR is not only technologically advanced but also user-friendly and accessible. By pooling their resources and expertise, these industry leaders aim to accelerate the adoption of XR technologies and create a unified ecosystem that benefits developers, manufacturers, and users alike.

The versatility of Android XR opens the door to a wide range of practical applications, from hands-free navigation and real-time translation to guided task assistance, enhancing both personal and professional experiences. These applications demonstrate how Android XR can transform everyday activities, making them more efficient, engaging, and accessible. The launch of Android XR marks a pivotal moment in the evolution of wearable technology. By transitioning from traditional screens to XR-enabled glasses and headsets, Google envisions a future where technology becomes an unobtrusive and integral part of your daily life. This shift has the potential to redefine how you interact with the digital and physical worlds, creating new opportunities for innovation and integration. As AI and XR technologies continue to advance, the Android XR platform is expected to expand the Android ecosystem, encouraging broader adoption across industries such as healthcare, education, and entertainment. By fostering collaboration and innovation, Google, Samsung, and Qualcomm are paving the way for a future where wearable devices are not just tools but essential companions in your everyday life.
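The spatial computing this source attributes to ARCore and Google's geospatial APIs is already available to ordinary Android apps today. As a rough illustration, the sketch below uses ARCore's publicly documented Session and Frame API in Kotlin to pin a virtual anchor where the user taps. It is a minimal sketch, not Android XR-specific code: camera permission handling, the ArCoreApk availability check, and GL rendering are deliberately omitted, and enabling geospatial mode additionally requires an ARCore API key and location permissions.

```kotlin
import android.app.Activity
import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Minimal, illustrative ARCore setup. A real app must also request the CAMERA
// permission, call ArCoreApk.requestInstall(), and drive session.update() from
// a GL rendering loop; all of that is omitted here for brevity.
class SpatialDemo(private val activity: Activity) {

    private lateinit var session: Session
    private val anchors = mutableListOf<Anchor>()

    fun start() {
        session = Session(activity)
        val config = Config(session).apply {
            // Geospatial mode ties anchors to real-world latitude/longitude
            // (needs Google Play services for AR plus location permissions).
            geospatialMode = Config.GeospatialMode.ENABLED
        }
        session.configure(config)
        session.resume()
    }

    // Called from the render loop after a screen tap at pixel (x, y).
    fun onTap(x: Float, y: Float) {
        val frame = session.update()
        if (frame.camera.trackingState != TrackingState.TRACKING) return

        // Hit-test against detected geometry and pin an anchor to the first
        // hit, so virtual content stays locked to that real-world spot.
        frame.hitTest(x, y).firstOrNull()?.let { hit ->
            anchors += hit.createAnchor()
        }
    }

    fun stop() {
        anchors.forEach { it.detach() }
        session.close()
    }
}
```

On glasses or headsets, the same kind of anchoring would presumably be handled by the platform and its Gemini integration rather than written by hand, but the sketch shows the building blocks the article is referring to.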
[2]
Google AI Glasses and Android XR: The Future of Augmented Reality
Imagine a world where your glasses do more than just help you see -- they guide you through unfamiliar streets, translate foreign languages in real time, or even assist you with cooking dinner. Sounds like something out of a sci-fi movie, right? Well, Google, in partnership with Samsung and Qualcomm, is turning that vision into reality with their latest innovation: Android XR. This new platform combines augmented reality (AR) and artificial intelligence (AI) to seamlessly integrate digital tools into your everyday life. Whether it's navigating a new city or tackling a home improvement project, Android XR promises to make these tasks more intuitive, immersive, and efficient. But what exactly does this mean for you? Think AI-powered glasses that feel like an extension of your smartphone or headsets that transport you into fully immersive digital environments. Android XR isn't just about flashy tech -- it's about making technology work for you in ways that feel natural and practical. This guide by AIGRID explores how the platform, powered by tools like Gemini AI and ARCore, is set to redefine the way we interact with the world around us.

Android XR is Google's latest platform, purpose-built to deliver enhanced augmented reality experiences via wearable devices. By building on existing AR technologies such as ARCore and geospatial APIs, it creates a robust ecosystem for developers and users alike. The platform bridges the gap between the physical and digital realms, allowing you to interact with your surroundings in more intuitive and meaningful ways. Whether you are navigating a new city, collaborating on a project, or exploring creative ideas, Android XR provides tools that make these tasks more immersive and efficient. This platform is not just a technological upgrade but a reimagining of how AR can be integrated into daily life. By focusing on usability and adaptability, Android XR ensures that its applications are both practical and engaging, catering to a wide range of needs.

The Android XR platform introduces two distinct categories of hardware, headsets and glasses, each designed to cater to specific use cases and preferences. This hardware ecosystem ensures that Android XR can cater to both immersive and practical applications, making it versatile enough to suit a variety of user needs.

At the core of Android XR lies Gemini AI, Google's advanced artificial intelligence engine. This technology processes real-world inputs in real time, delivering contextual assistance that is both relevant and actionable. For example, while traveling, Gemini AI can overlay navigation directions, translate foreign text, or provide insights about landmarks and local attractions. In your daily life, it can guide you through complex tasks such as cooking, assembling furniture, or troubleshooting appliances. By combining real-time reasoning with AR capabilities, Gemini AI ensures that the information you receive is not only accurate but also tailored to your specific needs. This integration of AI and AR creates a dynamic user experience that adapts to your environment and activities, making technology feel more intuitive and accessible.

Android XR is designed to enhance various aspects of your life by offering practical and innovative solutions, from navigation and translation to guided task assistance and entertainment. These applications highlight the versatility of Android XR, showcasing its potential to improve both practical tasks and leisure activities. The development of Android XR is a collaborative effort between Google, Samsung, and Qualcomm.
Each partner brings unique expertise to the table, ensuring a well-rounded and powerful platform. This partnership fosters a cohesive ecosystem that supports developers, designers, and manufacturers in creating innovative applications and devices. By working together, these companies ensure that Android XR integrates seamlessly with existing technologies while paving the way for future advancements.

Google's roadmap for Android XR begins with the launch of immersive headsets, starting with Samsung's Project Moohan device, which is set to debut next year. This initial rollout marks the beginning of a broader adoption of AR and AI technologies in wearable devices. As the platform evolves, it is expected to introduce even more features and applications, further enhancing its utility and appeal. By integrating AR and AI into everyday interactions, Android XR has the potential to redefine how you experience and interact with technology. Whether through immersive headsets or AI-powered glasses, this platform offers a glimpse into a future where digital tools are seamlessly embedded into your daily life, making tasks more intuitive, efficient, and engaging.

Android XR stands out with several innovative features that enhance its usability and functionality, underscoring the platform's commitment to delivering a user-friendly and versatile AR experience and making Android XR a promising addition to the world of wearable technology.
[3]
Android XR: Google and Samsung bet big to finish Vision Pro
Google and Samsung have announced the development of Android XR, a new operating system aimed at enhancing virtual and augmented reality experiences through headsets and glasses. This collaboration marks a significant entry into the wearable tech market, directly competing with Apple's Vision Pro and Meta's Quest 3. The first headset, code-named Project Moohan, will be launched in 2025, although pricing details remain undisclosed. Android XR seeks to leverage advancements in AI to deliver seamless interactions between users and devices. The platform is designed for headsets that provide immersive environments while maintaining interface options for real-world engagement. Gemini, Google's AI assistant, will be integrated, enabling conversational interactions and supporting user tasks such as planning and research. The launch of Android XR highlights Google and Samsung's efforts to create a supportive ecosystem for developers and device manufacturers. Tools like ARCore, Jetpack Compose, and OpenXR will facilitate app development, ensuring a range of content is available at launch. Companies including Lynx, Sony, and XREAL are collaborating on hardware options powered by Android XR, indicating a broad intent to meet the diverse needs of consumers and businesses alike. The anticipated Project Moohan headset aims to provide a versatile viewing experience, integrating popular Google applications like YouTube, Google Photos, and Google Maps into the XR environment. Users will have access to multitasking features with Chrome on virtual screens, while Circle to Search will offer immediate information through simple gestures. Android applications currently available will also be compatible, allowing users to enjoy their favorite mobile experiences. Looking towards the future, Android XR will extend beyond headsets to include stylish glasses designed for daily wear. These glasses will integrate Gemini for easy access to real-time information, such as directions and translations, presented seamlessly in the user's line of sight. Initial testing is set to begin soon with a small group of users, aiming to refine product features and address privacy concerns. The partnership between Google and Samsung reflects a strategic push against established competitors like Apple and Meta, with both companies using their respective strengths in AI and consumer technology. Analysts expect Samsung to price its XR headset below Apple's $3,499 Vision Pro, positioning itself favorably in a market still considered niche. This market dynamic creates opportunities for growth and experimentation, making XR products a focus for ongoing technological development. Despite the excitement around these advancements, experts caution that immediate demand may remain low, with wearable tech categorized as "nice to have" rather than essential. The history of Google's past ventures into wearable tech, such as Google Glass, serves as a reminder of the challenges faced in gaining widespread adoption. However, the integration of Google's AI capabilities into the Android XR operating system may provide a new avenue for success as the competition heats up. Google's Ted Mortonson believes the Android XR platform could influence how users interact with technology, potentially changing consumer expectations. As the launch date approaches, interest in how these developments will reshape the landscape of wearable technology continues to grow. 
The industry is watching closely, understanding that both AI and XR have transformative potential, and they may soon play a greater role in everyday technology interactions. Currently, the focus lies on shaping Android XR's ecosystem and preparing for its market introduction, with open invitations extended to developers and creators to contribute to its advancement.
[4]
How Google's AI Glasses Are Redefining Augmented Reality with Android XR
Google, in collaboration with Samsung, is transforming the landscape of augmented reality (AR) and artificial intelligence (AI) with the introduction of the Android XR platform. Powered by the Gemini 2.0 AI engine, this advanced system seamlessly integrates AR and AI into daily life through wearable technology. Focusing on accessibility, real-time functionality, and intuitive interaction, these AI glasses aim to redefine how you engage with both digital and physical worlds. By merging innovative technology with practical design, Google is paving the way for a more connected and immersive future. Imagine a world where your glasses do more than help you see -- they help you navigate, communicate, and interact with your surroundings in ways you never thought possible. Whether you're walking through a bustling city, managing a packed schedule, or staying connected without constantly reaching for your phone, technology that blends seamlessly into your daily life feels like a dream. But what if that dream is closer to reality than you think? Google, in collaboration with Samsung, is on the brink of transforming how we experience technology with the Android XR platform and AI-powered glasses. These aren't just gadgets -- they're tools designed to make life simpler, smarter, and more intuitive. At the heart of this innovation lies the Gemini 2.0 AI system, a powerhouse of intelligence that promises to transform how we interact with both digital and physical worlds. From projecting real-time navigation onto your lenses to offering contextual information at a glance, these glasses are more than just a tech upgrade -- they're a glimpse into a future where technology works with you, not against you. What makes these AI glasses truly exciting isn't just their features but their focus on accessibility and practicality, making sure they fit seamlessly into your everyday routine. So, how exactly does Google plan to make this vision a reality? The Android XR platform is a next-generation AR and AI ecosystem co-developed by Google and Samsung. Specifically tailored for headsets and glasses, it provides developers with the tools to create immersive and practical applications. This platform minimizes user friction by allowing intuitive interactions, making AR and AI more accessible than ever before. Whether you're navigating a crowded city, managing complex tasks, or simply enhancing your daily routines, the Android XR platform serves as the foundation for a smarter, more connected experience. By using this platform, developers can create applications that seamlessly blend digital content with the physical world. The result is a system that not only enhances productivity but also enriches your overall interaction with technology. At the heart of the Android XR platform lies the Gemini 2.0 AI system, a sophisticated engine capable of processing multiple inputs such as voice commands, visual data, and contextual cues. This system delivers natural, immersive experiences by using spatial computing to ensure that digital overlays are contextually relevant and seamlessly integrated into your environment. For example, Gemini 2.0 can project real-time navigation directions or display contextual information directly onto your glasses' lenses. This capability enhances your interaction with the world around you, allowing you to stay informed and connected without disrupting your focus on the physical environment. 
By combining advanced AI processing with intuitive design, Gemini 2.0 represents a significant step forward in wearable technology. Google's AI glasses are designed with both practicality and comfort in mind, offering a lightweight form factor that resembles traditional eyewear. This design ensures that the glasses are suitable for extended use, making them a versatile tool for various scenarios. This combination of functionality and comfort positions the glasses as a practical solution for both personal and professional use, bridging the gap between technology and everyday life.

The Android XR platform supports a wide range of applications, catering to both episodic and continuous use cases, and demonstrating the versatility of XR technology and its potential to enhance various aspects of your life. For instance, while navigating a busy city, the glasses can project step-by-step navigation instructions onto your lenses, helping you stay oriented without the need to check your phone. Similarly, in professional settings, they can provide instant access to critical information, streamlining workflows and improving efficiency.

Google's partnership with Samsung and other hardware providers, such as Sony XR, underscores the importance of a unified ecosystem. By collaborating with these industry leaders, Google ensures seamless integration between hardware, software, and user experience. This collaborative approach accelerates innovation and fosters a robust developer community, enabling the creation of diverse applications tailored to your needs. Through these partnerships, Google is not only advancing the capabilities of AR and AI but also making sure that these technologies are accessible and practical for a wide range of users. This emphasis on collaboration highlights the company's commitment to delivering a cohesive and user-friendly experience.

The potential applications for AI glasses are vast and varied, offering solutions that simplify and enrich your daily life, from real-time navigation to contextual information at a glance. These use cases illustrate how AI glasses can bridge the gap between the digital and physical worlds, making technology more intuitive and accessible. By integrating AR and AI into wearable devices, Google is enabling a new level of interaction that enhances both convenience and functionality.

Google's vision for AI glasses aligns with similar efforts by competitors like Meta and Apple, who are also exploring the integration of AR and AI into wearable devices. Meta's AR glasses and Apple's Vision Pro share the goal of creating immersive and practical experiences. However, Google's emphasis on accessibility and real-world functionality sets it apart. By prioritizing lightweight designs and practical applications, Google aims to make AI glasses a mainstream tool rather than a niche product. This focus on usability ensures that the technology is not only innovative but also relevant to everyday life, positioning Google as a leader in the wearable technology space. Currently in the late prototype stages, Google's AI glasses are not yet available to the public. However, the company's commitment to innovation and collaboration suggests that these glasses could soon become a staple of everyday life. By combining AR, AI, and wearable technology, Google envisions a future where computing is more natural, intuitive, and seamlessly integrated into your surroundings.
As the development of these glasses progresses, the potential for new applications and use cases continues to grow. From enhancing productivity to simplifying daily tasks, Google's AI glasses represent a significant step forward in the evolution of wearable technology.
[5]
Android XR -- everything you need to know about Google's answer to Apple's visionOS
Google is introducing its own spin on headsets and smart glasses with the announcement of Android XR: a new operating system built for extended reality (XR) devices. As announced, Google will fully use Gemini AI and collaborate with Samsung and Qualcomm to bring "helpful experiences to headsets and glasses" through Android XR. The first device to show off what it can do is Samsung's Project Moohan -- which is set to be available in 2025. With Android now entering the XR space, the Apple Vision Pro and visionOS will see some heavy competition. But what is Android XR, and what can it do for a new generation of AI-powered headsets and glasses? Here's everything we know so far for all your burning questions about Android XR.

Android XR is Google's new operating system for extended reality devices, including XR headsets and glasses. It extends the tech giant's Android operating system, which already spans smartphones, tablets and TVs, to this new category of devices. The operating system aims to allow developers and device makers to create new XR experiences for headsets and glasses using familiar Android apps and tools. This includes Google's suite of apps, such as Google Photos, Google Maps, Google TV, Chrome, YouTube, and more. This means you'll get to use these apps in a virtual landscape, similar to what the Meta Quest 3 and Quest 3S mixed-reality (MR) experiences have to offer. And, of course, the Apple Vision Pro.

As Google states, Android XR allows developers to build apps and games for its devices. The initial tools include ARCore, Android Studio, Jetpack Compose, Unity and OpenXR. Along with partnering with Samsung to develop the operating system, Google will get support from chip maker Qualcomm to power these devices. Qualcomm partners like Sony, Lynx, and XREAL can make their own devices using Android XR. "Advancements in AI are making interacting with computers more natural and conversational," Shahram Izadi, Google's VP of XR, said in the blog post. "This inflection point enables new extended reality (XR) devices, like headsets and glasses, to understand your intent and the world around you, helping you get things done in entirely new ways."

It will be a while before we see Android XR in action on headsets, but we're already excited about Android XR in smart glasses. If you're familiar with the capabilities of the Apple Vision Pro or Meta Quest 3 headsets, imagine Android XR adding a "Google" spin on things. While developers will create different XR experiences based on Android XR, we already have a sneak peek at what it can do. For example, headsets can switch from a fully immersive virtual environment to a real-world setting. As with Apple and Meta headsets, it allows you to "fill the space around you" with different apps and content -- layering them on top of what you can see in the real world and a virtual setting. Google Gemini will also be a highlight. If asked, the AI assistant will tell you what you see and allow you to control the device. The Ray-Ban Meta smart glasses have a similar approach with AI capabilities, and Google is following suit.

Then there are the apps. As mentioned, Google will bring many of its apps straight to an Android XR device, but in a "reimagined" way. So, you can watch YouTube videos and shows on Google TV on a virtual big screen, see your Google Photos pictures and albums in 3D, and create several virtual screens when surfing the web on Google Chrome. Plus, Circle to Search is added, allowing you to look up anything you see with a "simple gesture" (which I expect will be a circular motion with a finger).
As another perk, mobile and tablet apps on the Google Play Store will be available for Android XR headsets, with Google announcing that more XR-focused apps, games and content will be available next year. As for glasses, Gemini will take center stage, with Google wanting its AI to be "one tap away." As expected, Android XR glasses will be able to translate what you see and hear in real time, offer directions to wherever you need to go and show summaries of messages you received on your phone. Google expects Android XR to "seamlessly" work with other Android devices, and since it runs on the same operating system, I can imagine it doing so. As with the Ray-Ban Meta glasses, Google wants Android XR glasses to be "stylish, comfortable glasses you'll love to wear every day."

There's still much to learn about Android XR, but we already know what Google's new operating system will bring to XR headsets and glasses. It aims to be a unified platform for developers and manufacturers to create new XR experiences and devices. We will surely see even more headsets and glasses targeting the Apple Vision Pro and Ray-Ban Meta smart glasses. Google will soon be testing Android XR on prototype glasses with a small group of users, but there's no word yet on when smart glasses with Android XR support will arrive. In the meantime, we have Samsung's XR headset to look forward to, which will be the first device to ship with Android XR. While we wait to see what else Android XR brings to the table, check out how Xreal's smart glasses are a leap forward for AR.
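The point about familiar tooling is worth making concrete. Google says Android XR supports ARCore, Android Studio, Jetpack Compose, Unity and OpenXR, and that existing mobile and tablet apps from Google Play should work out of the box. The snippet below is an ordinary Jetpack Compose activity, the kind of app that claim covers; it uses only the standard Compose APIs, and any XR-specific spatial UI libraries from the developer preview are deliberately not shown here, since their details are beyond what this article describes.

```kotlin
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Button
import androidx.compose.material3.MaterialTheme
import androidx.compose.material3.Text
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

// A plain phone/tablet Compose activity. Per Google's announcement, apps like
// this are expected to run on Android XR headsets out of the box, presumably
// as a floating window the wearer can place in their space.
class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContent {
            MaterialTheme {
                var taps by remember { mutableStateOf(0) }
                Column(modifier = Modifier.padding(24.dp)) {
                    Text(text = "Hello from a regular Android app")
                    Text(text = "Taps so far: $taps")
                    Button(onClick = { taps++ }) {
                        Text("Tap (or pinch, on a headset)")
                    }
                }
            }
        }
    }
}
```

Nothing in the code targets XR directly; the promise is that the same build, published to Google Play, becomes usable on Project Moohan-class hardware without changes.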
[6]
Android XR Is Google's Final Shot to Overthrow Meta's AR, VR
Samsung's first XR headset, codenamed Project Moohan, is expected to debut next year.

Earlier this year, Meta built the 'Android' of AR/VR devices. But its biggest competitor, Google, went one notch up and beat it at the game by releasing Android XR, integrating AI, AR, and VR for innovative headset and glasses experiences. The tech giant partnered with Samsung for this product. The first extended reality (XR) headset, codenamed Project Moohan, is expected to debut next year. Google expects more hardware devices, including glasses, to adopt Android XR in the future. Google said it wants to offer a variety of stylish and comfortable glasses that people would not only enjoy wearing daily but that would also work seamlessly with other Android devices. This headset will allow users to effortlessly transition between virtual spaces and the real world. Google Gemini offers natural conversations to assist users in interacting with devices, planning, researching, and completing tasks.

In a recent interview, Google DeepMind chief Demis Hassabis shared that the company is exploring new form factors and devices for more intuitive AI interactions. Using cooking as an example, Hassabis pointed out that hands are often occupied, and while seeking advice from AI agent Astra, one would prefer a wearable device that they wouldn't have to hold. Google's track record with AR/VR has been less than favourable. Last year, the tech giant cancelled its 'Project Iris', while Google Glass was effectively discontinued in January 2015. However, with Android XR, Google is attempting a comeback in this category. This time, the tech giant is taking a different approach. Instead of focusing on hardware, the company is developing an operating system similar to what Meta is pursuing with its Horizon OS.

"Played around with the Android XR simulator. Seems like a great foundation for a spatial OS, much more so than Horizon OS, in my opinion. Overall, spatial aspects are quite similar to visionOS, which is a good thing," Dylan McDonald, owner of Sun Apps LLC, said in a post on X. "Android XR worked well in my tests on the Samsung headset, but the interface is essentially an 'Android-ified' version of visionOS. I think most consumers wouldn't even be able to tell them apart. It can also run phone/tablet apps from Google Play," Bloomberg journalist Mark Gurman, who tested the product, said. Android XR is supported by tools like ARCore, Android Studio, Jetpack Compose, Unity, and OpenXR, making it easier for developers to create apps for upcoming devices. "Literally all AndroidXR has to do to win is be more open than visionOS and be less predatory than Meta," a user said on X.

This time, Google has gained an edge with Astra, its smart assistant that processes text, voice, images, and video. This allows it to understand and respond to situations more naturally and in real time. In the latest Project Astra demo video, Google introduced its new glasses, equipped with a camera and an internal display, and always connected to Astra for instant information. Interestingly, these glasses can also recall things one does and says. Google isn't alone in this competition. Meta is already all-in with its Ray-Ban Meta glasses. According to Yann LeCun, Meta's chief AI scientist, smartphones are expected to be replaced by augmented reality glasses in the next 10-15 years. Notably, during his recent trip to India, he used these glasses to take selfies and photos and was also spotted wearing them in various public appearances.
Moreover, at Meta Connect 2024, the social media giant unveiled Orion, which Meta chief Mark Zuckerberg described as "the most advanced AR glasses the world has ever seen". The company also launched the Quest 3S VR headset, priced at $299.99 - almost half the price of its predecessor Quest 3. Meanwhile, Apple hasn't been able to taste much success with its Apple Vision Pro. According to a report, Apple has sold fewer than 5 lakh (500,000) units of the device, with many buyers using it less than Apple had hoped for. On a recent Y Combinator (YC) Podcast, Diana Hu, a group partner at YC, discussed how the hardware in AR/VR glasses must become more lightweight, a factor she feels is hindering their popularity. "There are actually constraints with physics to fit all that hardware into such a small form factor. Fitting enough computing power and optics is just super challenging." "There's still more actual engineering and physics that needs to be discovered, and that's it. I think the algorithms are there, but it's just lots of really hard hardware and optics problems," she added. Many expect OpenAI to announce a hardware device soon, as the company is collaborating with legendary former Apple design chief Jony Ive. Ive's design firm, LoveFrom, which he co-founded after leaving Apple in 2019, is working on the initiative. Currently, it is in its early stages with a small team of about ten people, including former Apple designers Tang Tan and Evans Hankey, who played key roles in developing the iPhone. OpenAI recently launched its advanced voice mode with vision, which appears to be the next logical step if the company intends to enter the AI hardware market. So far, no company has fully succeeded. While startups like Rabbit and Humane AI have emerged, they have been struggling to win over customers.
[7]
Google lays out its vision for an Android XR ecosystem
With VR, augmented reality and mixed reality in play, it's much more than Google Glass 2.0.

Google's latest push into extended reality is taking shape. While the company isn't entirely ready to show off any products just yet, it has laid out a vision for a unified Android XR ecosystem that will span a range of devices, such as virtual reality headsets and mixed reality glasses. This is evidently Alphabet's latest attempt to compete with the likes of Meta and Apple on the extended reality front. The company has dabbled in this arena in the past with the likes of Daydream and other programs that have since been discontinued. Android XR seems much more ambitious, and having some big-name partners on board from the jump indicates that Alphabet is much more serious about extended reality this time around.

Google has been beavering away on XR behind the scenes despite shutting down some of its higher-profile projects in that realm. "Google is not a stranger to this category," Sameer Samat, president of Android Ecosystem at Google, told reporters ahead of the announcement. "We, like many others, have made some attempts here before. I think the vision was correct, but the technology wasn't quite ready." One area where Google thinks that technology has advanced to the point where it's ready to try again with XR is artificial intelligence. Gemini will be deeply integrated into Android XR. By tapping into the power of the chatbot and having a user interface based around voice and natural conversation, Google and its partners are aiming to deliver experiences that aren't exactly possible to pull off using gestures and controllers. "We are fully in what we refer to as the Gemini Era, and the breakthroughs in AI with multi-modal models are giving all of us totally new ways of interacting with computers," Samat said. "We believe a digital assistant integrated with your XR experience is the killer app for the form factor, like what email or texting was for the smartphone." Google believes that smart glasses and headsets are a more natural form factor to explore this tech with, rather than holding up your smartphone to something in the world that you want Gemini to take a look at. To that end, the wide array of XR devices that are popping up, such as VR headsets with passthrough (the ability to see the outside world while wearing one), is another factor in Google's push into that space.

We'll get our first real look at Android XR products next year. The first headset, currently dubbed Project Moohan (which means "infinity" in Korean), will feature "state-of-the-art displays," passthrough and natural multi-modal input, according to Samsung. It's slated to be a lightweight headset that's ergonomically designed to maximize comfort. Renderings of the Moohan prototype suggest the headset will look a little like the Apple Vision Pro, perhaps with a glass visor on the front. Along with the headset, Samsung is working on Google XR glasses, with more details to come soon.

But nailing the hardware won't matter much if you can't do anything interesting with it. As such, Google is now looking to developers to create apps and products for Android XR. The company is offering developers APIs, an emulator and hardware development kits to help them build out XR experiences. On its side of things, Google is promising an "infinite desktop" for those using the platform for productivity. Its core apps are being reimagined for extended reality as well.
Those include Chrome, Photos, Meet, Maps (with an immersive view of landmarks) and Google Play. On top of that, mobile and tablet apps from Google Play are said to work out of the box. On YouTube, it looks like you'll be able to easily transition from augmented reality into a VR experience. And in Google TV, you'll be able to switch from an AR view to a virtual home movie theater when you start a film. A demo video showed a headset wearer using a combination of their voice and a physical keyboard and mouse to navigate a series of Chrome windows. Circle to Search will be among the supported features. After you've used the tool to look up something, you can use a Gemini command to refine the results. It'll be possible to pull 3D image renderings from image search results and manipulate them with gestures.

As for AR glasses -- essentially next-gen Google Glass -- it seems that you'll be able to use those to translate signage and speech, then ask Gemini questions about the details of, say, a restaurant menu. Other use cases include advice on how to position shelves on a wall (and perhaps asking Gemini to help you find a tool you put down somewhere), getting directions to a store and summarizing group chats while you're on the go. Thanks to advances in technology, AR glasses look much like regular spectacles these days, as we've seen from the likes of Meta and Snap. That should help Google avoid the whole "Glass-holes" discourse this time around given that there shouldn't be an obscenely obvious camera attached to the front. But the advancements might give cause for concern when it comes to privacy and letting those caught in the camera's cone of vision know that they're perhaps being filmed. Privacy is an important consideration for Android XR. Google says it's building new privacy controls for Gemini on the platform. More details about those will be revealed next year.

Meanwhile, games could be a major factor in the success of Android XR. They're a focus for Meta's Quest headsets, of course. Google is hoping to make it as easy as possible for developers to port their games to its ecosystem. Not only that, Unity is one of the companies that's supporting Android XR. Developers will be able to create experiences for it using the engine. Unity says it will offer full support for Android XR, including documentation and optimizations to help devs get started. They can do that now in public experimental versions of Unity 6. Resolution Games (Demeo) and Google's own Owlchemy Labs (Job Simulator) are among the studios that plan to bring titles built in Unity to Android XR. The process is said to be straightforward. "This is as simple a port as you're ever going to encounter," Owlchemy Labs CEO Andrew Eiche said in a statement. Meanwhile, Unity has teamed up with Google and film director Doug Liman's studio 30 Ninjas to make a "new and innovative immersive film app that will combine AI and XR to redefine the cinematic experience." Since gaming is set to play a sizable role in Android XR, it stands to reason that physical controllers will still be a part of the ecosystem. Not many people are going to want to play games using their voice. But that's the key: Android XR is shaping up to be a broad ecosystem of devices, not just one. This strategy has paid dividends for Google, given the spectrum of phones, tablets, cars and TVs that variants of Android are available on. It will be hoping to replicate that success with Android XR.
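The glasses scenarios above (translate a menu, then ask Gemini follow-up questions about it) boil down to multimodal prompting: a camera frame plus a text instruction goes to the model, and text comes back. Android XR's built-in Gemini integration isn't documented in these articles, so the sketch below uses the standalone Google AI client SDK for Android (com.google.ai.client.generativeai) as a stand-in to show the shape of such a request; the model name, prompt, and API-key handling are illustrative assumptions, not the platform's actual plumbing.

```kotlin
import android.graphics.Bitmap
import com.google.ai.client.generativeai.GenerativeModel
import com.google.ai.client.generativeai.type.content

// Illustrative stand-in only: this is the standalone Gemini SDK for Android
// apps, not the system-level assistant Android XR ships with. Model name and
// API key handling are assumptions made for the sketch.
class MenuHelper(apiKey: String) {

    private val model = GenerativeModel(
        modelName = "gemini-1.5-flash", // illustrative; any vision-capable model
        apiKey = apiKey
    )

    // Send a camera frame of a menu plus a text instruction, get text back.
    suspend fun translateMenu(menuPhoto: Bitmap): String {
        val response = model.generateContent(
            content {
                image(menuPhoto)
                text("Translate this menu into English and note any dishes containing peanuts.")
            }
        )
        return response.text ?: "No answer returned."
    }
}
```

On actual Android XR glasses, the same kind of request would presumably be handled by the built-in Gemini assistant (with the platform's new privacy controls) rather than by an app shipping its own API key; the sketch is only meant to show why a camera-equipped, voice-first form factor suits this interaction.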
[8]
Samsung's AR Headsets Will Run on AI-Centric 'Android XR'
Samsung's 'Project Moohan' will also facilitate Gemini and Galaxy AI. It will be the first headset running on Google's Android XR.

Google and Samsung are again letting their Wonder Twins powers activate, this time in the form of an all-new VR and AR platform dubbed "Android XR." The companies dropped the details of their plans for upcoming headsets and glasses that will arrive soon. All that new hardware will have a UI that will facilitate Google's Gemini AI model in a way that will offer controls "beyond gestures." If Google and Samsung are following Apple into the VR rabbit hole, they're doing it with a more cautious descent. The first device to use this all-new Android XR isn't actually here yet. Samsung calls it "Project Moohan," based on the Korean word for "infinity." Samsung provided a rendering of its initial headset design. Based on the image, Samsung's initial XR headset has a constrained design without many obvious AR sensors like the ones you see on the Meta Quest 3, Quest 3S, or Apple Vision Pro.

Samsung hinted about its XR plans last month but didn't offer many specifics about its upcoming devices. The company said its headsets and glasses will have internal displays and passthrough capabilities. The big difference between these goggles and glasses compared to others on the market is that they're AI-forward. Samsung said you should be able to control the headset with your voice, akin to the Ray-Ban Meta smart glasses. The device will likely have some kind of eye and hand tracking, but Samsung said you should be able to use "natural conversation" to access the device or any of your apps. Android XR is supposed to facilitate virtual reality, augmented reality, and mixed reality features with a host of typical Android apps like Google Maps or YouTube. Samsung shared a video of the Android XR platform in which a user clicks on a YouTube VR video set in Florence, Italy. This feature is also available in the Meta Quest through the still-glitchy YouTube app.

Just this week, Google announced its Gemini 2.0 model. The Android maker said the whole point of its new model is to facilitate AI agents -- essentially AI that can take over control of your device on your behalf. That's the general concept behind Android XR as well. In a statement, Google's president of Android, Sameer Samat, said his company's AR platform's multimodal AI will "enable natural and intuitive ways" to use these devices. Google and Samsung seem to indicate they, too, have bought into the idea of AR glasses eventually eclipsing the phone as everybody's go-to tech. Meta is currently building out its Orion true AR glasses. Apple is also looking to get into the smart glasses game. Conversely, Google abandoned its hold in the AR glasses space after the end of Google Glass. Now, it's happy to let Samsung take the reins while it builds out its new form of the Android ecosystem.
[9]
Google Announces Android XR
We started Android over a decade ago with a simple idea: transform computing for everyone. Android powers more than just phones -- it's on tablets, watches, TVs, cars and more. Now, we're taking the next step into the future. Advancements in AI are making interacting with computers more natural and conversational. This inflection point enables new extended reality (XR) devices, like headsets and glasses, to understand your intent and the world around you, helping you get things done in entirely new ways. Today, we're introducing Android XR, a new operating system built for this next generation of computing. Created in collaboration with Samsung, Android XR combines years of investment in AI, AR and VR to bring helpful experiences to headsets and glasses. We're working to create a vibrant ecosystem of developers and device makers for Android XR, building on the foundation that brought Android to billions. Today's release is a preview for developers, and by supporting tools like ARCore, Android Studio, Jetpack Compose, Unity, and OpenXR from the beginning, developers can easily start building apps and games for upcoming Android XR devices. For Qualcomm partners like Lynx, Sony and XREAL, we are opening a path for the development of a wide array of Android XR devices to meet the diverse needs of people and businesses. And, we are continuing to collaborate with Magic Leap on XR technology and future products with AR and AI.

Blending technology with everyday life, with help from AI

Android XR will first launch on headsets that transform how you watch, work and explore. The first device, code named Project Moohan and built by Samsung, will be available for purchase next year. With headsets, you can effortlessly switch between being fully immersed in a virtual environment and staying present in the real world. You can fill the space around you with apps and content, and with Gemini, our AI assistant, you can even have conversations about what you're seeing or control your device. Gemini can understand your intent, helping you plan, research topics and guide you through tasks.

We're also reimagining some of your favorite Google apps for headsets. You can watch YouTube and Google TV on a virtual big screen, or relive your cherished memories with Google Photos in 3D. You'll be able to explore the world in new ways with Google Maps, soaring above cities and landmarks in Immersive View. And with Chrome, multiple virtual screens will let you multitask with ease. You can even use Circle to Search to quickly find information on whatever's in front of you, with just a simple gesture. Plus, because it's Android, your favorite mobile and tablet apps from Google Play will work right out of the box, with even more apps, games and immersive content made for XR arriving next year.

Android XR will also support glasses for all-day help in the future. We want there to be lots of choices of stylish, comfortable glasses you'll love to wear every day and that work seamlessly with your other Android devices. Glasses with Android XR will put the power of Gemini one tap away, providing helpful information right when you need it -- like directions, translations or message summaries without reaching for your phone. It's all within your line of sight, or directly in your ear. As we shared yesterday, we'll soon begin real-world testing of prototype glasses running Android XR with a small group of users.
This will help us create helpful products and ensure we're building in a way that respects privacy for you and those around you.

Building the XR ecosystem

Android XR is designed to be an open, unified platform for XR headsets and glasses. For users, this means more choice of devices and access to apps they already know and love. For developers, it's a unified platform with opportunities to build experiences for a wide range of devices using familiar Android tools and frameworks. We're inviting developers, device makers and creators everywhere to join us in shaping this next evolution of computing. If you're a developer, check out the Android Developers Blog to get started. For everyone else, stay tuned for updates on device availability next year and learn more about Android XR on our website. Source: Google Android XR
[10]
Google renews push into mixed reality headgear
Google is ramping up its push into smart glasses and augmented reality headgear, taking on rivals Apple and Meta with help from its sophisticated Gemini artificial intelligence. The internet titan on Thursday unveiled an Android XR operating system created in a collaboration with Samsung, which will use it in a device being built in what is called internally "Project Moohan," according to Google. The software is designed to power augmented and virtual reality experiences enhanced with artificial intelligence, XR vice president Shahram Izadi said in a blog post. "With headsets, you can effortlessly switch between being fully immersed in a virtual environment and staying present in the real world," Izadi said. "You can fill the space around you with apps and content, and with Gemini, our AI assistant, you can even have conversations about what you're seeing or control your device."

Google this week announced the launch of Gemini 2.0, its most advanced artificial intelligence model to date, as the world's tech giants race to take the lead in the fast-developing technology. CEO Sundar Pichai said the new model would mark what the company calls "a new agentic era" in AI development, with AI models designed to understand and make decisions about the world around you. Android XR infused with Gemini promises to put digital assistants into eyewear, tapping into what users are seeing and hearing. An AI "agent," the latest Silicon Valley trend, is a digital helper that is supposed to sense surroundings, make decisions, and take actions to achieve specific goals. "Gemini can understand your intent, helping you plan, research topics and guide you through tasks," Izadi said. "Android XR will first launch on headsets that transform how you watch, work and explore." The Android XR release was a preview for developers so they can start building games and other apps for headgear, ideally fun or useful enough to get people to buy the hardware.

This is not Google's first foray into smart eyewear. Its first offering, Google Glass, debuted in 2013 only to be treated as an unflattering tech status symbol and met with privacy concerns due to camera capabilities. The market has evolved since then, with Meta investing heavily in a Quest virtual reality headgear line priced for mainstream adoption and Apple hitting the market with pricey Vision Pro "spatial reality" gear. Google plans to soon begin testing prototype Android XR-powered glasses with a small group of users. Google will also adapt popular apps such as YouTube, Photos, Maps, and Google TV for immersive experiences using Android XR, according to Izadi. Gemini AI in glasses will enable tasks like directions and language translations, he added. "It's all within your line of sight, or directly in your ear," Izadi said.
[11]
Google wants you to strap Android to your face with Android XR
In brief: A primary selling point of Apple's Vision Pro is a software platform many users are already familiar with. Google is trying to repeat the strategy with Android XR but with a couple of crucial differences - availability on devices from third-party manufacturers and a heavy emphasis on GenAI. Google recently unveiled Android XR, a version of its mobile operating system redesigned for extended reality headsets and glasses. A developer preview is available now, and the first compatible devices are planned for release in 2025. Although third-party developers have only just begun working with Android XR, Google demonstrated how it is using XR to transform some of its in-house apps. YouTube and Google TV are viewable in large virtual screens, Google Photos displays images in 3D, Google Maps receives an immersive view mode, and Chrome can display multiple windows around users. Furthermore, users can interact with Android XR apps through hand-tracked gestures. One example shows someone viewing a picture of a soccer player in Chrome and drawing a circle around their shoes to Google search for similar pairs. Android XR also supports all Google Play phone and tablet apps. In contrast to Apple's visionOS, which doesn't employ Apple Intelligence, some Android XR apps incorporate Google's recently unveiled Gemini 2.0 GenAI model. AI agents can recognize what users look at, converse with users, and access device controls through natural language prompts. Google hasn't announced plans for a flagship headset, but Samsung is expected to release the first Android XR-supported device next year. Lynx, Sony, and XREAL might also soon begin developing headsets running on Qualcomm processors. At least one of them will probably undercut the Apple Vision Pro's $3,500 price tag. Moreover, Google is taking the opportunity to cautiously gauge interest in another attempt at smart glasses with its new XR OS. The company plans to begin testing XR glasses prototypes soon but didn't mention plans for a full release. Demo clips show Android XR offering turn-by-turn navigation through Google Maps, translating a restaurant menu a user looks at, and showing directions for building a shelf. The company is obviously trying to avoid a repeat of Google Glass. The early attempt at smart glasses first appeared over a decade ago but failed to gain traction and raised significant privacy concerns.
[12]
Android XR: The Gemini era comes to headsets and glasses
[13]
A week on, Google's Android XR is stealing the VR/AR spotlight from Meta
Android XR is proving popular with software developers and hardware makers.

Google's Android XR was only announced last week, but the mixed reality platform is already making a splash, securing software and hardware partners right out of the gate and impressively throwing down the gauntlet to Meta's Horizon OS. While Meta's operating system has a clear head start, Google's promise of an open, unified platform (similar to the Android OS that powers smartphones, tablets, TVs, smartwatches, and even cars) offers a bright future for the consumer and enterprise XR market by delivering a true standard for all to benefit from. As Johan Gastrin, CTO of Resolution Games -- the developer behind the impressive digital tabletop RPG Demeo, one of the first games expected to arrive on Android XR through Samsung's Project Moohan headset -- tells Laptop Mag: "Android XR has the potential to speed up standardization in the XR space." Android XR may be the rising tide that raises all boats, granting hardware manufacturers and software developers an even foothold from which to build. In turn, they're free to build customized experiences backed by a dependable library of tools and frameworks tailored to various devices. Gastrin tells Laptop Mag, "Android XR has the potential to bring XR to a wide variety of XR devices, ranging from immersive VR headsets to AR glasses, all using the same ecosystem." It's an enticing promise, and it's one that's already attracted the attention of developers like Resolution Games, not to mention potential hardware partners in Samsung, Sony, Lynx, and XREAL.

Announced on December 12, the Android XR operating system is poised to fuse mixed reality and artificial intelligence to power a new generation of VR headsets and smart glasses -- in direct competition with Horizon OS, which powers Meta's impressive Quest headsets. It's a bold challenge to Meta's dominance in this space, but Google has a wealth of experience and a vast ecosystem in its corner heading into this fight. Resolution Games' Johan Gastrin tells Laptop Mag, "Sharing a common ecosystem across devices from multiple manufacturers is great for both developers and consumers." In speaking of that ecosystem, Gastrin continues: "Android XR also has the benefit of having an existing portfolio of games and applications that will make the platform vibrant from the get-go." Following Android XR's announcement, Resolution Games became one of the first developers in the XR space to announce it would be porting its most popular title to the platform. While the impressive Demeo may be among the first games to join Google's new platform, we wouldn't expect it to be the last. Highlighting the ease with which developers can shift from a Meta-familiar landscape to Android XR, Gastrin tells Laptop Mag, "Both Android XR and Horizon OS adhere to OpenXR. That, plus us using Unity for the majority of our titles, makes the development process very similar." That's good news for gaming hopefuls when it comes to Google's new platform. Aside from Demeo, many of Meta's most popular games are built using the Unity game engine, including Beat Saber, Bonelab, Walkabout Mini Golf, LEGO Bricktales, and even Batman: Arkham Shadow. The latter of those games is unlikely to make the switch due to it being a Meta Quest exclusive title, something that Android XR has yet to claim.
When asked whether Resolution Games would be interested in developing exclusively for the platform, Gastrin tells Laptop Mag, "Right now we are focusing on taking Demeo to as many players as possible on whatever platforms they wish to play." "We will continue to support as many XR platforms as possible, and the support for the OpenXR open standard from all major vendors have greatly improved our ability to do so."

Clearly, Google's Android XR has an excellent opportunity when it comes to delivering great software. However, from what we know so far, Google won't be bringing Android XR to users through its own dedicated hardware -- something Meta can rely on in pairing its Horizon OS platform with its Quest headsets. Still, while Google won't be supplying its own headsets or smart glasses, Android XR isn't short of interest from manufacturers looking to adopt the platform for their own devices.

As part of the Android XR announcement, Google revealed that Lynx, Sony, and AR smart glasses maker XREAL are all interested in making use of the platform to power future devices, one of which may be the XR head-mounted display Sony showcased in January. Meta also has plans to open Horizon OS to third parties, with Asus, Lenovo, and Microsoft all interested in adopting the platform, but little has been heard of this since its announcement in April.

Announced only last week, already lined up to appear on Samsung's future headset, and with many parties interested in bringing the platform to other devices, Android XR is off to a great start. It serves as an excellent expansion of the Android ecosystem and brings AI to the forefront of Google's vision for a mixed-reality future. Whether the Android XR platform outpaces Meta's remains to be seen, but each provides the other with the necessary element to see innovation thrive: competition.
[14]
Google debuts Android XR operating system for VR and AR devices - SiliconANGLE
Google LLC today debuted Android XR, a new operating system for virtual reality and augmented reality devices. The software will initially ship with headsets. Down the road, Google will also enable hardware partners to integrate Android XR into smart glasses. The search giant has already developed several prototype glasses internally as part of an initiative called Project Astra.

"Android XR is designed to be an open, unified platform for XR headsets and glasses," Shahram Izadi, Google's vice president and general manager of XR, wrote in a blog post. "For users, this means more choice of devices and access to apps they already know and love. For developers, it's a unified platform with opportunities to build experiences for a wide range of devices using familiar Android tools and frameworks."

Android XR ships with Gemini, Google's flagship large language model lineup. The integration will allow consumers to look at an object and have Gemini explain it. A user could, for example, ask the LLM for pointers on how to assemble a piece of furniture. Gemini also lends itself to other tasks. An Android XR device can use the AI to fetch information such as weather updates, as well as overlay turn-by-turn navigation instructions on the user's field of view. The latter feature is powered by an integration with Google Maps.

Alongside the mapping service, Android XR can run all the mobile apps in Google's Play Store. It also supports several popular development tools, which means software teams won't have to replace their existing toolchains to build apps for the operating system. Android Studio, Google's flagship desktop application for building mobile apps, is among the supported applications.

The first device that will ship with Android XR is a headset from Samsung Electronics Co. Ltd., which helped develop the operating system. The device is codenamed Project Moohan. It's reportedly similar in design to Apple's Vision Pro, but lighter and more comfortable to wear for extended periods of time. The headset runs on Qualcomm Inc.'s XR2 Gen 2 system-on-chip, which is specifically designed to power AR and VR devices. The chip's central processing unit and graphics processing unit have higher clock speeds than the chipmaker's previous-generation silicon. As a result, the XR2 Gen 2 can support headsets featuring up to a dozen external cameras and internal displays with resolutions of up to 4,300 by 4,300 pixels.

According to Samsung, users can double-tap the side of the headset to switch between VR and AR modes. In the latter configuration, the wearer sees not only rendered content but also the outside world. Samsung's headset draws power from a standalone battery pack that attaches via a USB-C cable. According to Bloomberg, the company could offer multiple packs with different levels of battery life. Samsung plans to launch Project Moohan next year.

Android XR will later become available with headsets from other companies and, down the line, in smart glasses. Google has not yet shared a time frame for when the latter devices will become available. As part of an internal initiative dubbed Project Astra, Google has developed several prototype smart glasses. They use a display technology called microLED that can provide higher resolutions than many current screens. It's also less susceptible to certain types of malfunctions and uses less power.
[15]
Google unveils Android XR for next-gen extended reality devices
Google has introduced Android XR, a new operating system designed for the next generation of computing, combining years of AI, AR, and VR advancements. Developed in collaboration with Samsung, Android XR is aimed at creating a new ecosystem for XR headsets and glasses, bringing immersive experiences to users.

Shahram Izadi, VP & GM of XR at Google, stated that Android XR aims to build a vibrant ecosystem for developers and device manufacturers. The platform is designed to leverage the success of Android, providing developers with tools such as ARCore, Android Studio, Jetpack Compose, Unity, and OpenXR to easily develop apps and games for XR devices. Google is partnering with companies like Lynx, Sony, and XREAL to create a wide range of devices, with further collaborations with Magic Leap to push the boundaries of XR technology.

Android XR will first launch on headsets, transforming the way users work, watch, and explore. The first headset, code-named "Project Moohan" and developed by Samsung, is set to be available for purchase next year. These devices will allow users to seamlessly switch between virtual and real environments, offering an enhanced experience with the help of Gemini, Google's AI assistant. Gemini can guide users, help them plan tasks, and provide real-time information based on what users see, all through simple conversations.

Google is adapting popular apps for the XR platform. Users will be able to watch YouTube and Google TV on a virtual big screen or revisit memories in 3D with Google Photos. Google Maps will offer an Immersive View, allowing users to explore cities and landmarks from new perspectives. Chrome will support multiple virtual screens for efficient multitasking, while Circle to Search will enable users to find information using simple gestures. Android XR will also support mobile and tablet apps from Google Play, with more apps and games optimized for XR coming next year.

Android XR is also paving the way for glasses that will assist users throughout the day. These glasses will seamlessly integrate with Android devices, putting the power of Gemini at users' fingertips. Android XR is designed as an open, unified platform, providing more device options for users and a consistent development experience for creators. Developers can use familiar Android tools and frameworks to build apps and experiences for various XR devices. Google encourages developers, device makers, and creators to contribute to shaping the future of XR.

Google has also launched the Android XR SDK, a comprehensive development kit for building XR apps. The SDK allows developers to create immersive experiences that blend digital and physical worlds. Among its key components is the Jetpack XR SDK, which offers new libraries for creating spatial UI layouts and integrating existing Android apps into XR. Google has also partnered with Unity to integrate its real-time 3D engine with Android XR: Unity developers can now use the Unity OpenXR: Android XR package to bring multi-platform XR experiences to Android XR. Additionally, Chrome on Android XR supports the WebXR standard, allowing developers to create immersive web-based experiences with frameworks like Three.js or A-Frame.

Android XR is built on open standards such as OpenXR 1.1 and includes advanced capabilities like AI-powered hand mesh, detailed depth textures, and light estimation. These features ensure that digital content blends seamlessly with the real world.
The SDK also supports formats like glTF 2.0 for 3D models and OpenEXR for high-dynamic-range environments. To get started with development, visit the Android XR developer site, where tools and resources are available. Google also invites developers to participate in the Android XR Developer Bootcamp in 2025, offering opportunities to collaborate on the future of XR and access pre-release hardware.
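The WebXR route mentioned above is the most concrete piece developers can try today without Android XR hardware, since it relies only on open web standards. As a rough illustration, and not an official Android XR sample, the sketch below uses standard Three.js and WebXR calls to load a glTF 2.0 model (the model path is a placeholder) and present it in an immersive session; in principle the same page should run in any WebXR-capable Chrome build, headset or not.

```typescript
// Minimal WebXR sketch with Three.js: load a glTF 2.0 model and offer an
// immersive session. Only standard Three.js/WebXR APIs are used here;
// nothing is Android XR-specific, and the model path is a placeholder.
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";
import { VRButton } from "three/examples/jsm/webxr/VRButton.js";

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true; // opt the renderer into WebXR sessions
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer)); // adds an "Enter VR" button

const scene = new THREE.Scene();
scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1.0));

const camera = new THREE.PerspectiveCamera(
  70,
  window.innerWidth / window.innerHeight,
  0.1,
  100
);

// Load a glTF 2.0 asset (placeholder path) and place it in front of the viewer.
new GLTFLoader().load("models/sample.glb", (gltf) => {
  gltf.scene.position.set(0, 1.2, -1.5);
  scene.add(gltf.scene);
});

// Use the XR-aware animation loop so rendering stays in sync with the headset.
renderer.setAnimationLoop(() => {
  renderer.render(scene, camera);
});
```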
[16]
Google announces Android XR platform, will launch first on Samsung's Project Moohan device
Google said that it is launching a new Android-based XR platform on Thursday to accommodate AI features. The company said the platform, called Android XR, will support app development on different devices, including headsets and glasses. The company is releasing Android XR's first developer preview on Thursday, which already supports existing tools, including ARCore, Android Studio, Jetpack Compose, Unity, and OpenXR.

Project Moohan headset

The company noted that Android XR will first launch with the Samsung-built Project Moohan headset, which will be available for purchase next year. Samsung, Google, and Qualcomm announced a partnership to produce an XR device early last year. After the announcement, reports of a power struggle between Google and Samsung over control of the project emerged. In July, Business Insider reported that while the headset was supposed to ship earlier this year, the launch was delayed. Google noted that the headset will be able to easily switch between a fully immersive experience and augmented content layered on top of real-world surroundings. Plus, users will be able to control the device with Gemini and ask questions about the apps and content they are looking at.

App ecosystem and Gemini

Google said that because Android XR is based on Android, most mobile and tablet apps on the Play Store will automatically be compatible with it. This means anyone buying an Android XR headset will already have a library full of apps through the Android XR Play Store. This is possibly the company's play to counter Apple's $3,499 Vision Pro, which hasn't taken off as the company would have expected. Vision Pro had a limited number of apps at launch, which has grown over time. However, the cost is a prohibitive factor for more users to adopt the headset. It's not clear how Samsung and Google will price this headset, but the company is positioning itself so that users will have better app access.

Google is redesigning YouTube, Google TV, Chrome, Maps, and Google Photos for an immersive screen. Notably, Google didn't release a YouTube app for Vision Pro and also made developer Christian Selig pull his YouTube-viewing app Juno from the App Store. The company is also adding an Android XR Emulator to Android Studio so developers can visualize their apps in a virtual environment. The emulator has XR controls for using a keyboard and a mouse to emulate navigation in a spatial environment. The company is pushing Gemini for Android XR, too. Apart from screen control and contextual information, it will also support the Circle to Search feature.

Support for other devices

Google said that it hopes Android XR will support glasses with "all-day help" in the future. The company is seeding its prototype glasses to some users as well, but it hasn't specified a date for a consumer launch. The search giant showed a demo where a person asked Gemini to summarize a group chat and asked for recommendations to buy a card for a friend. Another demo showed a person wearing glasses asking Gemini how to hang shelves. Google said companies such as Lynx, Sony, and XREAL, which utilize Qualcomm's XR solutions, will be able to launch more devices with Android XR. The Mountain View-based company also specified that it will continue to work with Magic Leap on XR. However, it is unclear if Magic Leap will utilize Android XR.
[17]
Google steps into "extended reality" once again with Android XR
Citing "years of investment in AI, AR, and VR," Google is stepping into the augmented reality market once more with Android XR. It's an operating system that Google says will power future headsets and glasses that "transform how you watch, work, and explore." The first version you'll see is Project Moohan, a mixed reality headset built by Samsung. It will be available for purchase next year, and not much more is known about it. Developers have access to the new XR version of Android now. "We've been in this space since Google Glass, and we have not stopped," said Juston Payne, director of product at Google for XR in Android XR's launch video. Citing established projects like Google Lens, Live View for Maps, instant camera translation, and, of course, Google's general-purpose Gemini AI, XR promises to offer such overlays in both dedicated headsets and casual glasses. There are few additional details right now beyond a headset rendering, examples in Google's video labeled as "visualization for concept purposes," and Google's list of things that will likely be on-board includes Gemini, Maps, Photos, Translate, Chrome, Circle to Search, and Messages. And existing Android apps, or at least those updated to do so, should make the jump, too.
[18]
Google wants Android XR to power your next VR headset and smart glasses
The Android operating system runs on billions of devices worldwide. Most of them are phones, but many of them are also tablets, smartwatches, televisions, cars, and a bunch of random IoT products. Officially, though, Google only supports Android on phones, tablets, watches, TVs, and cars, but today, the company is expanding the OS to support a new category of devices: extended reality (XR) devices. Google has announced Android XR, a new platform dedicated to VR headsets and AR smart glasses.

What is Android XR?

Android XR is a new version of the Android operating system that was built from the ground up for XR devices like VR headsets and AR smart glasses. It's based on AOSP, the open-source foundation of all Android devices, but it's been heavily customized to support XR experiences.

Google hasn't created a totally new flavor of Android in years; the last time it did so was back in 2017 with Android Automotive, which was well before the rise of generative AI. Unlike other flavors of Android, Android XR was developed with Google Gemini at its core. (Gemini, if you aren't aware, is the name of Google's AI chatbot and large language model family.) In fact, Google says that Android XR is the "first Android platform built for the Gemini AI era."

The Android maker hasn't been alone in developing Android XR, though. It partnered closely with Qualcomm, Samsung, and others to bring Android XR to life. Samsung is developing the first hardware to run Android XR, which is set to debut sometime in 2025. Qualcomm, meanwhile, is creating the chipsets that'll power these devices.

Qualcomm and Samsung are just the first of many companies to work on Android XR hardware, though, as Google envisions Android XR to be the single unifying platform for a range of XR scenarios, ranging from VR headsets for gaming and productivity to smart glasses for lifestyle and healthcare. Companies like Lynx, Sony, and XREAL are already working on their own Android XR devices, for example. Google hasn't made a "full determination" on whether it'll publicly provide the source code for Android XR, though, so it remains to be seen whether any enterprising startup will be able to make hardware for it, at least not without entering into a partnership with Google.

Why is Google building Android XR?

Long-time readers are probably aware that Google isn't new to XR, with the company having started and discontinued the AR-focused Google Glass project and the VR-focused Daydream VR platform. Google believes its original vision for XR was "correct" but that the technology simply wasn't ready at the time. After discontinuing both projects, the company still held on to its XR ambitions, pivoting to phone-based AR initiatives like ARCore.

Recent breakthroughs in AI have convinced Google that it can finally make VR take off. Interacting with AI chatbots in a multimodal manner, i.e. through not only speech but also through vision, is going to be the "killer app" of this era, Google argues, but it's simply too awkward to do so right now through a smartphone.
The company believes that VR headsets and AR smart glasses are a much more natural form factor for these kinds of interactions, which is something we can get behind after seeing the Project Astra demos from earlier this year.

This is why Google believes now is the right time to launch Android XR. The company believes it's in a strong position to launch the platform given its "unique" position in AI. Google has a "full stack" of technology it can take advantage of to bring AI to XR platforms, ranging from cutting-edge AI models in the cloud to on-device AI models to an ecosystem of developers it can reach out to.

What can you do on an Android XR headset?

Getting developers on board with Android XR isn't going to happen if there aren't consumers to sell apps to, and getting consumers to buy Android XR headsets is only going to happen if there are killer apps and experiences ready for them at launch. Google today offered a sneak peek at some of the experiences we can expect from Android XR headsets. This includes demos of how the OS enables a customizable, boundless immersive viewing experience that can be controlled through natural, multimodal AI interactions.

For example, Android XR allows apps like Google Photos, Google TV, and YouTube to be shown in windows floating above objects in the real world. These windows can be moved around, dragged and dropped, and minimized or closed using hand gestures. Every app in Android XR has a header bar and sometimes a bottom bar with various buttons that can either be controlled through hand gestures or controlled conversationally through Gemini. Gemini in Android XR has the ability to interact with and even control your apps, as the platform allows Gemini to "see what you see, hear what you hear, and react to your gestures alongside your voice."

In the following demo, we see specifically how Google Photos is being optimized for Android XR. The app is shown in its familiar tablet UI, but when an "Immersive" button is pressed at the bottom, the photo is shown without any borders. Tapping another button opens a carousel of photos and videos that you can seamlessly move through. In the second demo, we see the immersive UI that Google has built for its Google TV app. Movies and TV shows are shown on large, expansive cards with high-resolution thumbnails. Trailers are shown in big, floating, borderless windows that can be put inside a virtual theater room for an even more immersive experience.

Next, Android XR devices have access to the full catalog of 180° and 360° content available through the YouTube app, as shown in this demo. You can even ask questions about the video and get answers back thanks to YouTube's integration with Gemini. Google Maps and Chrome will also support Android XR, with the former letting you view cities and landmarks in virtual space using the app's Immersive View feature and the latter letting you browse the web on multiple virtual screens. You'll even be able to use Google's Circle to Search feature on Android XR. You can use Circle to Search to select text and images in your view, look them up on Google, and then place 3D objects in your environment.
These are just a few of the many experiences that'll be available for Android XR devices, according to Google. Many more will hopefully be available as third-party developers get their hands on prototype hardware and start tinkering with the new Android XR SDK to create apps. We probably won't see the full extent of what'll be possible until actual hardware hits the shelves, which is thankfully going to happen sometime next year.

What will the first device to run Android XR be?

The first device to run Android XR will be Samsung's VR headset code-named Project Moohan. It'll go on sale sometime next year at an unspecified price. Details are sparse about the headset, but you can read more about it here.

When will smart glasses running Android XR arrive?

Even if Samsung does end up showing off its smart glasses next month, the product won't ship until after the "Moohan" VR headset comes out. That's because Samsung and Google have strategically decided to focus on VR headsets first before releasing smart glasses.

Both companies believe that VR headsets are the "most suitable form factor" to start building out a core XR ecosystem, because they offer a higher level of immersion and higher-resolution displays than smart glasses. They also offer eye, head, and hand recognition and can seamlessly transition between mixed and virtual realities. In contrast, XR smart glasses are a bit more limited in terms of immersion and input options, as they're much smaller and thus can't pack as much hardware. Smart glasses are designed to be small and lightweight enough to be worn every day.

This is why smart glasses running Android XR are being developed with different experiences in mind. Google envisions smart glasses as more of a lifestyle product rather than a gaming or productivity one. You're expected to wear smart glasses out in public just like regular glasses, and you're expected to converse with them while moving about. Just like with VR headsets, AR smart glasses will also support voice controls through Gemini AI, but voice interactions are even more important for smart glasses than they are for VR headsets. For example, you'll be able to ask Gemini to summarize the content of group chats in Google Messages, send messages to contacts, look up information on nearby stores and restaurants through Google Maps, and get turn-by-turn navigation directions to a location. You'll also be able to point your smart glasses at a sign and ask Gemini to translate it, ask Gemini questions about the content of the sign, and get real-time translations during conversations.

Lastly, you can ask Gemini some general questions like you would on your phone. It'll have access to whatever you're seeing for context and can even remember things that it saw previously.

Because smart glasses tend to have significantly smaller batteries and weaker processors than VR headsets, a lot of the computing that goes on behind the scenes to make these features possible actually happens on your phone rather than on the glasses themselves. Android XR on smart glasses takes advantage of what Google calls a "split-compute configuration": the glasses stream sensor data to your smartphone, which does the heavy processing and streams the rendered pixels back to the glasses. A conceptual sketch of this arrangement follows below.
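To make that split-compute idea concrete, here is a purely conceptual sketch rather than Google's actual protocol or API: the glasses forward lightweight sensor readings over a local link and simply display whatever encoded frames the phone sends back after doing the heavy lifting. Every name, message shape, and endpoint below is hypothetical.

```typescript
// Conceptual illustration of a split-compute loop (NOT Google's real protocol):
// the glasses stream sensor data to the paired phone and present the frames
// the phone renders in return. All names and endpoints are hypothetical.
interface SensorFrame {
  timestampMs: number;
  headPose: { yaw: number; pitch: number; roll: number };
}

// Hypothetical local link to the paired phone.
const phoneLink = new WebSocket("ws://paired-phone.local:9000");
phoneLink.binaryType = "arraybuffer";

// Hypothetical display hook on the glasses side.
declare function presentOnDisplay(encodedFrame: ArrayBuffer): void;

// Glasses side: forward sensor readings at a modest rate.
export function sendSensorFrame(frame: SensorFrame): void {
  if (phoneLink.readyState === WebSocket.OPEN) {
    phoneLink.send(JSON.stringify(frame));
  }
}

// The phone does the rendering; the glasses just show what arrives.
phoneLink.onmessage = (event: MessageEvent) => {
  presentOnDisplay(event.data as ArrayBuffer); // e.g. a compressed video frame
};
```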
This split-compute approach allows smart glasses to be built without bulky hardware. Rumors suggest Samsung's XR smart glasses could weigh just 50 grams, which is very close to the highly lauded Ray-Ban Meta smart glasses. Smart glasses running Android XR will be coming soon, and they'll be available in a variety of options. Google will soon begin real-world testing of its Project Astra service on prototype glasses running Android XR, and it is inviting a small number of users to sign up to take part in this testing.

We don't know when Samsung's smart glasses will launch or whether they'll even have an in-lens display, but Google's announcement strongly suggests the initial batch of Android XR smart glasses will have displays, as shown in the demo videos that Google shared with us. There's a future where Android XR will run on display-less glasses, though for now, Google believes that displays are important to the form factor as they allow for richer content and more capabilities to be offered on the output side.

The unveiling of Android XR is a big moment for Google. It represents a return to a vision that many people once derided but some saw the potential in. In hindsight, Google did a lot of things right with Glass; it was simply ahead of its time. Whether Google will succeed this time in bringing XR to the masses will largely depend on how successful it is in convincing developers to create immersive VR games and apps for the new OS.
[19]
Google announces Android XR, launching 2025 on Samsung headset
Besides phones and tablets, Android is available on smartwatches, TVs, and even cars. Google today announced Android XR as the next form factor the operating system is coming to. Google is using the catch-all term of extended reality (XR) to describe virtual (VR), mixed (MR), and augmented reality (AR). Android XR is for all device types, including headsets that offer video or optical see-through, screen-less "AI glasses," and AR glasses with displays.

Going into Android XR, Google believes it has a proven track record of creating platforms. That means more than just making an operating system for itself; it also means catering to OEM partners, cultivating a developer ecosystem, and managing an app store. However, Google has done all this before to a lesser extent with Glass, and to a greater degree with Daydream VR. The phone-based approach to VR, and the later standalone headset, ultimately failed, and Google exited around 2019. It remains to be seen how others adopt Android XR, but Google already has major partners in Samsung and Qualcomm:

"For Qualcomm partners like Lynx, Sony and XREAL, we are opening a path for the development of a wide array of Android XR devices to meet the diverse needs of people and businesses. And, we are continuing to collaborate with Magic Leap on XR technology and future products with AR and AI."

Google thinks this time will be different, and believes that the vision it had with Glass/Daydream was correct, but that the technology wasn't ready. There have of course been advancements in displays, sensors, and processing, but Google believes that Gemini is the primary differentiator and that digital assistants are the "killer app" for XR. Gemini will see and hear what you do. This lets you ask questions about what you're viewing both in the real world and on your device's screen. It will be a new conversational interface to control the OS and apps. In fact, Google says Android XR is the first OS built from the ground up with Gemini. For the Android XR user experience, read our hands-on.

Google and Samsung are starting with the headset, which both consider a good starting point. Samsung has a developer kit called Project Moohan (or "infinity" in Korean) that is lightweight, has an external battery, and is powered by the Snapdragon XR2+ Gen 2.

"With headsets, you can effortlessly switch between being fully immersed in a virtual environment and staying present in the real world. You can fill the space around you with apps and content, and with Gemini, our AI assistant, you can even have conversations about what you're seeing or control your device."

Google imagines Android XR headsets as offering an infinite desktop for productivity. In this scenario, you're at a desk with a physical keyboard and mouse. A few partners already have this dev kit, and more are being distributed to partners starting this week. Meanwhile, first-party apps like Chrome, YouTube, Google TV, Google Photos, and Google Maps are being optimized for Android XR.

However, glasses are the end goal, and frames running Android XR are coming for "directions, translations or message summaries without reaching for your phone," though they are paired like any other wearable. The final realization of this vision is an in-lens display. However, Google does not think that displays are a must, and this opens the door to display-less glasses that have microphones and cameras for input, while Gemini capably handles output.
Google will "soon begin real-world testing of prototype glasses running Android XR with a small group of users." This will help us create helpful products and ensure we're building in a way that respects privacy for you and those around you. Be sure to read our hands-on with Android XR glasses. Google today is releasing the Android XR SDK Developer Preview, which is "built on the existing foundations of Android app development." The Jetpack XR SDK includes: There's also an Android XR Emulator (as part of the latest Android Studio Meerkat preview). Unity will be supporting Android XR, while Chrome on Android XR supports WebXR. There's also support for OpenXR 1.1. An Android XR Developer Bootcamp in 2025, with interested parties able to express their interest here.
[20]
Google Launches Android XR OS for Mixed Reality Headsets, Smart Glasses
Android XR will rival Apple's visionOS, which launched last year.

Google on Thursday announced Android XR as a new operating system designed for extended reality (XR) devices, along with support for its Gemini AI assistant. It is expected to arrive with upcoming mixed reality headsets as well as smart glasses, and Google says that it will offer support for features that rely on artificial intelligence (AI), augmented reality (AR) and virtual reality (VR). Apple released visionOS as its dedicated operating system designed for the Apple Vision Pro in 2023, and it offers support for running apps designed for the headset as well as iPad apps.

The company says that the first developer preview of Android XR, released on Thursday, will enable the development of apps and games for upcoming devices that will arrive with the new operating system. It already includes support for tools used by developers working on Android applications, such as Android Studio, Jetpack Compose, ARCore, OpenXR, and Unity.

The new Android XR operating system will allow users to access Google's Gemini AI assistant, which will offer features designed for XR experiences. This means that users will be able to talk to the assistant and ask it questions about objects and locations within their field of view, or even use the Circle to Search feature that is available on select Android phones to perform a visual lookup with a gesture.

In addition to these AI features, Google says that its in-house applications like YouTube, Google Photos, and Google TV will be redesigned to work on a virtual display, which sounds similar to how Apple added support for watching content on a larger, immersive screen visible while wearing the Apple Vision Pro headset. Meanwhile, Google Maps will offer support for a revamped Immersive View feature, while users can also browse the web using Google Chrome on a much larger virtual screen, using gestures for navigation.

Google has also announced that the first device that will run on Android XR is codenamed Project Moohan. Samsung will launch this XR headset in 2025. It is expected to rival the Apple Vision Pro, which was launched earlier this year in select markets with a $3,499 (roughly Rs. 2.96 lakh) price tag.

Smart glasses (or AR glasses) are said to be the future of XR technology, and could offer most of the features available today without bulky components. Google says it is already preparing for these technologies with Android XR, and will start real-world testing for prototype glasses running the new OS soon. Previews of Android XR on these prototype devices shared by the company showcase some futuristic features, such as the ability to view messages in a small popup at the bottom of the wearer's field of view, or the ability to see turn-by-turn navigation directions while using Google Maps, along with a small circular map shown in the same location. These glasses could also automatically offer to translate text and provide virtual tutorials using AR technology, according to the company.
[21]
Forget Samsung's XR headset -- I'm way more excited about Android XR in smart glasses
Spatial computing isn't going to replace your computer, but it could replace your phone... So the smoke has cleared on Android XR and Project Moohan -- Samsung's Apple Vision Pro competitor. While working through all the news around this new platform and mixed reality headset, I definitely caught the vibe that this is the start of something big. Google is trying to figure out how to put Android on your face, and XR feels like a strong way to unlock virtual reality, augmented reality and mixed reality experiences that will work the same across whatever kind of hardware a company tries to make. Google is trying to create a unified platform regardless of the device, which leans on the power of Gemini AI to pair personal context with the world around you to give you an augmented experience. But if I may be so bold, this starting point may be good, but a little tame. Vision Pro is indeed a breakthrough experience, but I'm a little cold on the headset itself. Instead, I'm more excited to see what happens when Android XR comes to smart glasses -- especially now that a partnership with Xreal has been announced. We're a long ways off from it, but let me explain why. Yes, Google Glass may have been a dud, but Android XR is the result of "years of investment in AI, AR and VR" since this misstep to give you a software platform that is all things to different hardware types in the VR and AR space. Tapping into Google's services like Maps, Google Lens, Circle to Search and more, while infusing it with Gemini, this new OS platform spatializes all your key information and assistance to augment the world around you -- from the full immersion of what Google calls an "episodic product" like a VR headset, to "all-day products" like glasses. Details beyond this are sparse, but what we do see is that Project Moohan seems like it's got the Vision Pro in its sights, in terms of spatial computing capabilities -- immersive TV content, multi-screen productivity and seeing family photos. But the main issue I've always faced with the vision (pardon the pun) of bringing spatial computing to VR headsets is that whole "episodic product" thing. The capabilities are certainly there, and the Vision Pro does have the horsepower to do some big things. But taking price out of the equation for a second, the problem is that spatial computing just hasn't got to a point where it can be faster to do something here than on your trusty laptop. When you put them into glasses, however, that's where I feel this software vision comes to life. Spatial computing focuses you too much on trying to replace a computer. Set your sights on introducing something for the post-smartphone world, and you can see how the voice control, gesture control and wearability of something like a pair of specs could be a very realistic vision for where Android XR goes. I know I'm talking about something far in the future, and chances are a couple of people at Google may be panicking that I'm predicting the smartphone may be made redundant -- one of the company's biggest businesses. And currently, it's clear that Google is looking at these as a device that plays nice with your phone. But that disconnection from a slab in your pocket is a very real possibility. I predicted that an AR revolution was coming to CES 2024 -- the beginning of the parallel lines of development between VR headsets and AR glasses starting to intersect to bring you VR capabilities in something the size of a pair of specs. This, in my mind, is what the real smart glasses will be. 
Did it happen? ...sort of. Xreal Air 2 Ultra did give you hand tracking in a pair of AR specs, but it's still stuck trying to pull developers onto its journey. And of course, Meta's Project Orion has come closest to this intersection out of anyone, but it's a prototype that we won't be seeing come to fruition for a while. I'm quietly confident that Android XR is the platform that will take us through to another era of smart glasses. So now, it's onto the hardware makers to check off my steps to smart glasses. Once any company that makes AR glasses nails those steps, that's the moment we can start to ask that important question: "what comes next after the smartphone?"
[22]
Samsung Teams Up With Google On Android XR
The Android XR platform is a groundbreaking collaboration between tech giants Samsung, Google, and Qualcomm, aimed at redefining the way users interact with both the digital and physical worlds. This innovative platform combines the power of Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) under the umbrella term eXtended Reality (XR). By seamlessly merging these technologies, the Android XR platform seeks to provide users with immersive experiences that transcend the limitations of traditional screens. At the heart of this platform lies innovative multimodal AI technology, which enables natural interactions through various input methods such as voice, gestures, and even gaze tracking. This intuitive approach to user interaction promises to transform the way we engage with digital content. Furthermore, the Android XR platform leverages the extensive Galaxy ecosystem and Google's comprehensive suite of apps to transform everyday activities, including gaming, entertainment, learning, and even health and wellness applications. The first device to showcase the capabilities of the Android XR platform is codenamed "Project Moohan" and is expected to launch in the near future. While specific pricing details have not yet been disclosed, Samsung has emphasized that the hardware will be designed with user comfort and accessibility in mind. The headset will feature a lightweight and ergonomically optimized design, equipped with state-of-the-art displays and advanced Passthrough capabilities. To ensure widespread accessibility, the device will be available in both glasses and headset form factors, catering to a diverse range of user preferences and needs. Project Moohan, which translates to "infinity" in Korean, embodies Samsung's vision of delivering unparalleled immersive experiences to users. The headset will boast advanced displays, natural multi-modal input, and a conversational AI assistant powered by Gemini. This AI assistant will serve as a guide, helping users navigate the vast possibilities offered by the Android XR platform. With Project Moohan, users can expect a seamless and intuitive experience across a wide range of applications. For instance, users can explore the world in a whole new way through Google Maps, immersing themselves in virtual environments that blend real-world imagery with digital enhancements. Live sports events can be enjoyed on YouTube, providing users with a front-row seat from the comfort of their own homes. Planning trips and vacations will also be transformed, as users can virtually visit destinations before making their bookings. For readers intrigued by the potential of XR technology, there are numerous other areas worth exploring. One such area is the role of XR in gaming, where it has the potential to create truly immersive and interactive gaming experiences. XR is also finding applications in education and training, allowing students and professionals to engage with learning materials in new and innovative ways. In industries such as healthcare and architecture, XR is already transforming the way professionals work. Surgeons can use XR to plan and practice complex procedures, while architects can use it to visualize and refine their designs in real-time. As XR technology continues to evolve and mature, its potential to reshape the way we interact with the world around us is truly limitless. The Android XR platform represents a significant step forward in the evolution of XR technology. 
By bringing together the expertise of Samsung, Google, and Qualcomm, this platform promises to deliver unparalleled immersive experiences to users across a wide range of applications. As we eagerly await the launch of Project Moohan and other devices built on the Android XR platform, one thing is clear: the future of XR is bright, and its potential to transform our lives is only just beginning to be realized.
[23]
Google Announces Android XR OS, Teams Up With Samsung for "Project Moohan"
Samsung will be the first to come out with an XR headset, currently in development.

After a lot of anticipation, Google has finally unveiled its entry into the emerging market of virtual and augmented reality with Android XR OS. The new and upcoming OS will feature Gemini with a variety of Google apps. Google says it will help you perform tasks in an entirely new way. Android XR OS is a product of Google's collaboration with Samsung and Qualcomm. It's a platform that aims to bring new experiences to XR using AI. It will allow developers to develop apps and games freely for the platform using existing tools like Unity, OpenXR, ARCore, and Jetpack Compose.

If you're wondering, XR stands for "eXtended reality," which encompasses AR, VR, and MR. Android's XR OS will be capable of AR, VR, and more, and the apps will define these experiences. From smart glasses like the Ray-Ban Meta (hands on) to XR headsets, manufacturers like XREAL, Samsung, and Sony are already on board. Google also mentions that Samsung will be the first to launch a headset with Android XR OS next year. It is currently called Project Moohan.

We have a few details about the headset. For instance, it will be powered by the Snapdragon XR2+ and will come with eye and hand tracking. In their announcement, Samsung explains why they named it Project Moohan: Moohan means "infinity" in Korean, and Project Moohan represents infinite possibilities in an immersive space.

Besides, Google also shared in its blog post a few glimpses of how Android XR OS would look on XR headsets, starting with YouTube videos, Circle to Search, Google Photos, and Google TV, and it all looks pretty great. Google also shared a demo of Android XR glasses features in action. The demo heavily involved Gemini and a nod to Project Astra-like assistant features, which Google showcased at I/O 2024.

You can check out more details about XR OS on Android's latest landing page. This page also confirms that the first set of headsets and glasses with the XR operating system will be available in 2025. What are your thoughts on the Android XR OS launch? Will it be the next big thing in the XR space? Let us know in the comments.
[24]
Here's the Android XR headset that Google and Samsung are releasing in 2025 - and the software that powers it
Project Moohan is among the many hardware products that will be built around Google's new Android XR platform. Google today is introducing Android XR, the software platform by which artificial intelligence (AI), augmented reality (AR), and virtual reality (VR) experiences will converge and help power upcoming wearables. Like Apple's VisionOS, Android XR is centered around spatial computing applications, leveraging smart glasses, mixed reality headsets, and more to overlay useful information and visual cues onto the real-world environment. Also: Forget the Ray-Ban Metas: Samsung's upcoming smart glasses are the wearables I've been waiting for The first device to run on Android XR, code-named Project Moohan, was built by Samsung and will be available for purchase sometime next year. The headset, pictured above, features a curved glass visor similar to Apple's Vision Pro but with a larger face cover to prevent light bleeding and a Meta-esque head strap for comfort. There's also a button on the top right of the visor, potentially mapped to key actions like powering the device and switching between XR and VR modes. "Today's (software) release is a preview for developers, and by supporting tools like ARCore, Android Studio, Jetpack Compose, Unity, and OpenXR from the beginning, developers can easily start building apps and games for upcoming Android XR devices," says Shahram Izadi, VP and GM, XR in a Thursday press release. Some Android XR features will include quickly switching between virtual environments and the real world, anchoring apps and content around you, multitasking with multiple floating Chrome windows, and communicating with Gemini about what you see. Like in competing XR operating systems, entertainment will be a big focus with Android XR, says Izadi, allowing users to watch YouTube and Google TV or view photos on large, virtual screens. Existing Google Play Store apps will be accessible on Android XR, too. With Android XR, Google is clearly leveraging its existing Android infrastructure to build out the platform, and that's great news considering how even the best hardware for mixed reality applications today is held back by software. Just a week ago, Apple finally rolled out ultrawide screen support on its $3,500 Vision Pro headset, a change that has quickly become the device's killer feature. Also: These new smart glasses remind me of Meta Ray-Bans - but have a clever privacy feature With proper and adequate developer support and a lineup of manufacturers just as excited as Google to bring XR to the masses, be prepared to see a slew of new (and better) smart glasses and VR headsets in 2025. We might even catch our first glimpse of one in January.
[25]
I tried out Android XR, Google's latest attempt to take on Meta and Apple
Google Glass. Google Cardboard. Google Daydream. The company has had its fair shot at VR and XR -- there's no doubt about that. Android XR is Google's latest attempt at getting back in the game, and this time, the vision is entirely different.

First off, Gemini AI is at the core of the OS, ensuring the company's AI reaches a wider range of people and use cases. And more than that, as per the name, it's an attempt to align XR devices in the same category where it's had success in smartphones. Google invited me to its Mountain View, California, campus to check out Android XR for myself. The hardware was only on unreleased prototypes, but regardless of the form factor, it was all about Android XR as a software platform. To my surprise, I left very excited for what the future holds for the tech in 2025.

Samsung's Project Moohan VR headset

After putting on the Project Moohan VR headset, built by Samsung, I went through a quick procedure to detect my interpupillary distance (IPD) to dial in the best image clarity within the goggles. Beyond that, there were no additional calibrations needed to pick it up and go. I was immediately struck by the quality of the lenses and the fidelity of the passthrough camera, combined with onscreen elements. My eyes have never seen VR in such pristine and sharp form. I also felt completely safe with the perspective of my surroundings, lending even more to the realism of the experience.

At one moment, I encountered a bird in the Chrome browser that I could click to "View in 3D." All of Google's work with ARCore and AR search is supported natively within Android XR. I was completely taken aback by the sharpness and fidelity of the image hitting my eyes and how that interacted with the depth information of the room. Though I still don't know a lot about this headset, it is clearly meant to be a direct competitor to the Apple Vision Pro and aimed at securing a spot among the best VR headsets.

Walking through the demo room, I initiated Circle to Search with my fingers in the air and drew a circle around a doughnut toy that was partially obscured by a hanging plant. The obstacle didn't faze the process, and I was shown a number of places I could buy the toy. As a big fan of Circle to Search, this extra dimension really impressed me.

A recurring highlight was the multimodality of control within Android XR. Hand controls match what I've experienced on HarmonyOS for clicking, stretching, and moving virtual objects. A physical desk equipped with a Bluetooth keyboard and mouse showed how control could swap between input devices on the fly. Tapping the side of the VR headset would initiate Gemini for voice commands and control. Perhaps the most interesting aspect to me was eye tracking, which had its own calibration process to start. As my eyes landed on objects, they would be highlighted, while a pinch with my relaxed index finger and thumb activated a click. Having never experienced eye-tracking control before, the potential of this feature got me very excited.

Another highlight was the effective spatialization of flat footage, which was demonstrated in a few key ways. In Google Photos, pictures and videos from the library were shown with incredible depth generated by AI and, to my eyes, I had a hard time knowing that it wasn't spatial to begin with.
In a demo of Google Maps, I was taken inside a restaurant for a virtual tour using Street View. The footage was shot years ago without any sort of depth information, but as I stood in the middle of the room, I gained a felt sense of being there thanks to the newly spatialized presentation.

Moving beyond Google Glass

Android XR is designed to adapt the UI and functionality based on the hardware's capabilities. Headsets get the full immersive experience, while smaller devices like glasses get a more focused, streamlined interface. Not only that, but lighter devices can offload processing workloads to a connected smartphone through a split-compute configuration. This sends sensor data to the phone to be processed, then the pixels are streamed back to the glasses for viewing.

To that end, I was given a pair of black frame glasses, reminiscent of the frames shown off at Google I/O 2024. They fit my face comfortably, and I immediately witnessed a sharp, full-color image in the center of my right eye showing an event from the calendar app on a Pixel device in the room. Google Maps streamed written directions to the glasses, while tilting my head downward showed a visual representation of my point on the map. Having been one of the first recipients of Google Glass when it launched, I felt this was really the evolution of that vision.

Next, I fired up Project Astra, a multimodal version of Gemini that was also demonstrated at Google I/O this year. I pressed a button on the side of the monocular frames, Gemini said "hey," and I started asking questions about objects throughout the room. At one point, Gemini told me what drink I could make with a small collection of liquor bottles sitting on the table. Later in the demonstration, I asked Gemini for the name of the book sitting next to the bottles from earlier, and it gave me all the details. I previewed tracks from an album that I was holding in my hands. Gemini summarized a page out of a book that I pulled off the shelf. A lag in response time was notable throughout the experience, though I was reminded that this is a prototype and a work in progress.

Finally, I was given a brief demonstration of glasses with a binocular display, resulting in a sharp, 3D image in the center of my eyes. Driving the high-density display in the glasses was Raxium's MicroLED technology, which features monolithically grown red, green, and blue microLEDs on the same substrate. This technology enables high brightness, efficiency, and resolution at a compact size. A projector system sits in the shoulder of the frames, with a waveguide directing light to the target area in the lenses.

Giving it another shot

All this is very exciting. But let's not forget -- this is by no means Google's first attempt at trying to make a successful extended reality platform. Google first brought Google Glass to market little more than a decade ago, and while it grabbed attention at launch, it ultimately faced public backlash from those concerned with the privacy implications of a wearable camera embedded in the frame. Daydream VR was Google's original attempt at virtual reality with Android smartphones, but Google discontinued that hardware in 2019, three years after it launched.

I tried my best to temper my expectations with that history in mind. But I'd be lying if I said Android XR didn't get me excited. It represents a renewed commitment by the company to create an operating system that can drive headsets, glasses, and other form factors going forward.
With Meta already announcing its own plans to open up its Horizon OS ecosystem, it looks like we're about to get another explosion in XR headsets next year. Can Google pull it off? That's yet to be seen. I'm optimistic based on what I've seen so far.
[26]
Google Goes All-in on Extended Reality Technology with Android XR - Phandroid
We suppose the second time's the charm -- Google recently took the wraps off Android XR, a new approach to mobile computing via extended reality software platforms -- think VR headsets and displays, for example. The difference this time around is that Google promises experiences beyond VR, combining newer AI technology and augmented reality for more helpful user experiences. Created in collaboration with Samsung, as well as other big names in the industry including Qualcomm, Google says that it's aiming to develop a rich ecosystem for device manufacturers and developers with Android XR.

As such, Android XR in its current state serves mainly as a developer preview, and will support several tools including ARCore, Android Studio, Jetpack Compose, Unity, and OpenXR, to name a few. First-party Google apps such as YouTube, Google Photos and Google TV will also be redesigned to better suit Android XR, and we can expect more Android apps to follow suit. For hardware, Google adds that it will be working towards a wide selection of devices from brands like XREAL, Sony, and Lynx. This also includes Samsung's recently-announced Project Moohan, although Samsung hasn't shared too much information regarding the product.
[27]
Google and Samsung reveal Project Moohan mixed-reality headset and Android XR, 'the first platform built entirely for the Gemini era'
After a long wait and lots of behind-the-scenes collaboration, Google and Samsung have finally revealed Google's Android XR platform and the first hardware to run it: Project Moohan, Samsung's mixed-reality headset. If you just looked at the photo of the new Project Moohan headgear and did a double-take, you're not alone -- it does look like a cross between Google's old Daydream VR headset and Apple's high-end Vision Pro goggles.

Google and Samsung revealed the platform and headset as a dev kit today, December 12. The pair also teased AR glasses, though there's no timeline for consumer availability. The Samsung Project Moohan headset is a different story, with Google and Samsung telling us in a presentation last week that "the first products based on this platform will launch in 2025".

It's been a long journey for both Samsung and Google, each of which has had its dalliances, and even long-term relationships, with VR, AR, and mixed-reality headgear. Samsung's commitment, until now, never really matched Google's, which famously launched Google Glass and, almost a decade later, finally killed the project. That failure apparently left Google undaunted. "We like many others have made some attempts [...] and I think the vision was correct but the technology wasn't quite ready," said Sameer Samat, President of Android Ecosystem at Google. "But, importantly, we never stopped working on it."

The tipping point, as Google sees it, has been AI and the arrival of Gemini, and Android XR is its "first platform built entirely for the Gemini era," said Samat. Samsung and Google described Project Moohan as a headset that will feature the full range of XR experiences, from ones that are fully immersive to mixed reality. That makes sense when you consider that 'Moohan' means 'infinity' in Korean. The headset features eye and hand tracking, and will respond to voice queries.

Gemini, Google's generative artificial intelligence platform, will sit at the center of Google's AR system. "Gemini will see what you see and hear what you hear." This stands in contrast to Apple's Vision Pro, which, though it integrates Apple Siri, does not have that level of awareness, at least not yet (Siri with Visual Intelligence on the iPhone 16 gets halfway there).

Samsung was somewhat mum on key Project Moohan specs, refusing to delve into details like price, weight, where the batteries live, and the imaging technology, beyond that it'll be "high resolution". I could see from the photos and videos that it's a mostly gray headset with a single band and supporting foam(?) around the back of the head (there may be a battery pack back there), and a glass front surrounded by a thin chrome bezel. What we do know is that the silicon powering Project Moohan comes from the third party in this partnership: Qualcomm. Samsung didn't specify which chip or chips will be inside Project Moohan, but it stands to reason that we should expect the cutting-edge Snapdragon XR2.

Android XR will provide a visual experience inside Moohan that should be familiar to anyone who's experienced mixed reality inside the Apple Vision Pro or Meta's Quest headsets. In the videos I saw, floating app screens were arrayed around the Project Moohan wearer, who was able to pinch and grab windows to move them around. Google's Samat said Android XR will focus on customizability and immersion, natural multi-modal AI interactions, and "open collaboration with the existing XR communities." Google, though, did not say that Android XR will be open source.
Google is also focused on making sure that its native apps, like Gmail, Maps, Google TV, and YouTube, are ready for Android XR, and showed how YouTube could provide an immersive video viewing experience. Even though the demo videos primarily focused on gesture control, the platform will come with controllers, and Android XR will support controller input on other devices when they arrive. Speaking of other devices, we also got a glimpse of Samsung's AR glasses. In the video they looked similar to, say, the Ray-Ban Meta smart glasses (and maybe smaller than Meta Orion). In the demo, the wearer could speak to the onboard Gemini and ask for directions, and the in-lens displays overlayed Google Maps turn-by-turn directions. There's no timeline for the delivery of these AR glasses. Speaking to us about Android XR and Project Moohan, Kihwan Kim, Samsung's EVP and head of Immersive S/W R&D Group, told us, "This is just the beginning of our journey to create an entire XR ecosystem [...] we want to empower users to enhance everyday life in a whole new and incredibly immersive way. The possibilities are infinite and we are just getting started." Starting today, developers will be able to take home these headsets, the new APIs, and an emulator that will allow them to build and port apps to new Android XR experiences. They'll be following behind a lineup of big-name partners who already have these, and are working on porting their apps and services to the system. These include Major League Baseball, Calm, Adobe, AmazeVR concerts, Naver, and Mirrorscape. Project Moohan and the AR glasses tease are all just another reminder that 2025 could very much be the year of mixed reality for the masses.... maybe.
[28]
Hands On With Android XR and Google's AI-Powered Smart Glasses
Naturally, you can work in a mixed-reality environment with a connected Bluetooth keyboard and mouse, and you can put yourself in an immersive environment if you want to focus, or leave see-through mode turned on to make sure your coworkers aren't taking photos and giggling while you wear a ridiculous headset to get stuff done. It wasn't clear if you'd be able to connect the headset to a laptop to bring your work into mixed reality, a feature available on the Apple Vision Pro. A tap on the side of the headset brings up an app launcher, and this is where you can toggle on Gemini if you want it to persistently stay "on." Once it's on, there's an icon at the top of the virtual space so that you are aware that everything you say and look at is being registered by Gemini. In see-through mode, you can walk up to an object and ask Gemini about it -- a Googler demoing the headset (before I tried it) walked up to someone else wearing an FC Barcelona shirt and asked Gemini to find the "standings of this team." Gemini quickly registered the team name and pulled up search results with league standings and scores from recent matches. You can ask Gemini anything like this and it will answer with visual results displayed in the headset. I asked it to "take me to Peru," and it opened up a 3D version of Google Maps. I was able to move around and center on Lima, and in cities where Maps already has a lot of 3D models, you can explore areas in greater detail. You can keep talking to Gemini in these experiences, so I asked questions such as when would be the best time to visit and got a prompt answer. In another example, I peeked inside a restaurant in New York City to take a virtual tour of the space. Google says it can use AI to stitch together images of a venue's interior and display it so that it feels like you're there. It did a pretty good job, and I asked Gemini if the place takes reservations, without having to specifically say the name, because I was staring at the name of the restaurant. It does take reservations, but Gemini couldn't actually make one for me. (That integration might come later.) Next, I watched a few videos on YouTube, where 2D content looks sharp and colorful. Stereoscopic content was even better; my senses felt surrounded. I watched some hikers walking along a trail and asked Gemini where this all was, and it said, "New Zealand." I wasn't able to verify that, but it looked like the right answer. I watched some more spatialized playback of 2D videos as the virtual player added depth and layering to make them feel 3D. I hopped over to the Google TV app and enabled a "Cinema mode" to launch a virtual theater for watching movies and shows, just like on other VR headsets. Circle to Search, the feature Google debuted earlier this year on Android phones, is also available in Android XR. Just walk up to a physical object near you, press the top button on the headset, and then pinch and draw a circle around the thing you want to know more about. You'll get a Google Search page with results. Project Moohan very much feels like Google and Samsung catching up to the rest of the VR market, though the Gemini integration gives their efforts a unique layer. However, I will admit I was far more excited to try the smart glasses, where Gemini feels like it could be even more helpful. They didn't disappoint. I walked over to another room and there were several pairs of glasses in front of me. Some were sunglasses, others had clear lenses. 
Like the headset, you can get them loaded up with your prescription. Google did not provide a name for the prototype glasses.
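The interaction pattern described in the hands-on above, where Gemini answers questions about whatever is in front of the camera, comes down to pairing a video frame with a spoken or typed question and handing both to a multimodal model. The sketch below is a minimal, hypothetical illustration of that loop in TypeScript; the endpoint name and request shape are assumptions for illustration, not Google's actual Android XR or Gemini interfaces.

```typescript
// Minimal sketch (not Google's actual Android XR API) of the "assistant sees what
// you see" pattern: grab a camera frame, pair it with a question, and query a
// multimodal model. MULTIMODAL_ENDPOINT and the request shape are hypothetical.

const MULTIMODAL_ENDPOINT = "https://example.com/multimodal-query"; // hypothetical placeholder

async function captureFrame(video: HTMLVideoElement): Promise<string> {
  // Draw the current camera frame to a canvas and return it as base64 JPEG data.
  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext("2d")!.drawImage(video, 0, 0);
  return canvas.toDataURL("image/jpeg").split(",")[1];
}

async function askAboutScene(video: HTMLVideoElement, question: string): Promise<string> {
  const imageBase64 = await captureFrame(video);
  // A real implementation would stream audio and video continuously rather than
  // sending one frame per question; this keeps the idea down to a single request.
  const res = await fetch(MULTIMODAL_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question, imageBase64 }),
  });
  const { answer } = await res.json();
  return answer;
}

// Example, mirroring the demo above: ask about a jersey you are looking at.
// askAboutScene(cameraFeed, "What are the standings of this team?").then(console.log);
```

The point of the sketch is only the shape of the exchange: the camera frame grounds the question, so the user never has to name the object they are asking about.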
[29]
Unlock the Infinite Possibilities of XR With Galaxy AI
Imagine being able to step into any world in an instant -- from the bustling streets of New York City to the snowy mountain tops of the Alps. These worlds are no longer merely places to observe from afar. They can now be explored and interacted with simply through a gaze, a gesture or your voice. What once seemed like science fiction has become a reality as eXtended reality (XR) transforms how we engage with the world around us. XR is an umbrella term for technologies that use digital elements to extend or alter reality by merging the physical and digital worlds together. This includes virtual reality (VR), augmented reality (AR), mixed reality (MR) as well as other similar technologies yet to be developed. XR offers infinite possibilities, creating a dynamic spatial canvas where users' sight, sound and motion combine to interact with the outside world. A world that will unlock unprecedented experiences across core areas of life, from working and learning to entertainment, gaming and even health and wellness. We have always placed the user experience at the core of our innovation. It is our responsibility to bring the best possible technology and experience to our users, and we demonstrated this commitment by introducing and democratizing Galaxy AI. Grounded in mobile AI leadership, we truly believe now is the time to unlock the potential of XR. With the power of multimodal technology, our XR enables the most natural and intuitive interactions with the world. Supported by the broader Galaxy ecosystem, this technology will empower and transform your everyday life -- in a way that only we can deliver. We are also making our vision a reality through open collaboration with industry leaders like Google and Qualcomm, culminating in the creation of an entirely new Android XR platform. For many years, we have worked side-by-side with partners, designing, integrating and optimizing technology -- and this next project is one of our most ambitious endeavors yet. By uniting forward-thinking ideas with industry-leading expertise, we can ensure a robust and scalable platform that offers a wide range of content. From the Galaxy ecosystem and Google's suite of apps to partnership with third-party developers, every XR interaction will be incredibly enriched. Available for both headsets and glasses, this new platform leverages the power of Gemini by bringing together a conversational user interface (UI) and a contextual understanding of the world around you. It offers cutting-edge capabilities beyond gestures or a controller, utilizing voice and natural conversation to elevate the experience. With these accurate, personalized interactions, the platform will be your helpful AI assistant. Code named "Project Moohan", the first headset designed for Android XR is poised to bring this experience to life in the near future. The name "Moohan", meaning 'infinity' in Korean, connotes our belief in delivering unparalleled, immersive experiences within an infinite space. Equipped with state-of-the-art displays, Passthrough capabilities and natural multi-modal input, this headset will be your spatial canvas to explore the world through Google Maps, enjoy a sports match on YouTube or plan trips with the help of Gemini. All these experiences come with lightweight, ergonomically optimized hardware designed to ensure maximum comfort during use. "XR has quickly shifted from a distant promise to a tangible reality. 
We believe it has the potential to unlock new and meaningful ways to interact with the world by truly resonating with your everyday lives, transcending physical boundaries," said Won-Joon Choi, EVP and Head of R&D, Mobile eXperience Business. "We are excited to collaborate with Google to reshape the future of XR, taking our first step towards it with Project Moohan." "We are at an inflection point for the XR, where breakthroughs in multimodal AI enable natural and intuitive ways to use technology in your everyday life", said Sameer Samat, President of Android Ecosystem, Google. "We're thrilled to partner with Samsung to build a new ecosystem with Android XR, transforming computing for everyone on next-generation devices like headsets, glasses and beyond."
[30]
Google renews push into mixed reality headgear
SAN FRANCISCO (AFP) - Google is ramping up its push into smart glasses and augmented reality headgear, taking on rivals Apple and Meta with help from its sophisticated Gemini artificial intelligence. The Internet titan on Thursday unveiled an Android XR operating system created in collaboration with Samsung, which will use it in a device being built under the internal name "Project Moohan," according to Google. The software is designed to power augmented and virtual reality experiences enhanced with artificial intelligence, XR vice president Shahram Izadi said in a blog post. "With headsets, you can effortlessly switch between being fully immersed in a virtual environment and staying present in the real world," Izadi said. "You can fill the space around you with apps and content, and with Gemini, our AI assistant, you can even have conversations about what you're seeing or control your device." Google this week announced the launch of Gemini 2.0, its most advanced artificial intelligence model to date, as the world's tech giants race to take the lead in the fast-developing technology. CEO Sundar Pichai said the new model would mark what the company calls "a new agentic era" in AI development, with AI models designed to understand and make decisions about the world around you. Android XR infused with Gemini promises to put digital assistants into eyewear, tapping into what users are seeing and hearing. An AI "agent," the latest Silicon Valley trend, is a digital helper that is supposed to sense surroundings, make decisions, and take actions to achieve specific goals. "Gemini can understand your intent, helping you plan, research topics and guide you through tasks," Izadi said. "Android XR will first launch on headsets that transform how you watch, work and explore." The Android XR release was a preview for developers so they can start building games and other apps for headgear, ideally fun or useful enough to get people to buy the hardware. This is not Google's first foray into smart eyewear. Its first offering, Google Glass, debuted in 2013 only to be treated as an unflattering tech status symbol and met with privacy concerns due to camera capabilities. The market has evolved since then, with Meta investing heavily in a Quest virtual reality headgear line priced for mainstream adoption and Apple hitting the market with pricey Vision Pro "spatial reality" gear.
[31]
Smart Glasses Are Going to Work This Time, Google's Android President Tells CNET
In a secluded room inside Google's New York offices, CNET's Scott Stein browsed a bookshelf filled with titles he'd never read. After picking up one of the books, Absolution by Jeff VanderMeer, he asked whether he needed to read the other installments in the series to appreciate it. A friendly voice, coming from his glasses, told him that even though the book is set in the same universe as the author's other works, it's a standalone story. Several minutes and many interactions later, when he asked about the books he previously browsed, the assistant remembered and recited them all. In a rare one-on-one conversation with CNET, Sameer Samat, Google's president of the Android ecosystem, made it clear that he thinks exchanges like these will increasingly become the norm thanks to the arrival of Android XR, a collaborative effort between Google, Samsung and Qualcomm, unveiled on Thursday. The three tech behemoths, which together form the backbone of the Android smartphone landscape, are hoping to push Android into its next era with a new software platform designed to power mixed reality headsets, smart glasses and everything in between. Samsung is developing the first device -- a headset codenamed Project Moohan -- which will be available next year as a precursor to smart glasses that the company is also working on. It comes after Google announced on Wednesday that it's releasing prototype smart glasses to testers to gather feedback. Google and Samsung also hosted developer demos in New York this week in hopes of convincing app creators to build custom experiences for the new software. The smartphone has defined the last 15 years of our lives, and tech giants have failed to create a device as impactful since. Google, among others, has certainly tried. First there was Google Glass, the augmented-reality eyewear that ultimately flopped because of its high price, privacy concerns and technical limitations. Then came Google Daydream, a virtual reality platform that turned smartphones into VR machines, which the company discontinued in 2019 after slow adoption. The AR/VR market is still struggling five years later, with virtual and augmented reality shipments dropping 28.1% year-over-year in the second quarter of 2024, according to the International Data Corporation. Apple also reportedly cut its Vision Pro orders, according to The Information and TF International Securities analyst Ming-Chi Kuo, which some have interpreted as a signal of weak demand. It's perhaps a sign that even a company credited with inventing the modern smartphone may be struggling to make mixed reality succeed. But Samat is certain that things will be different this time. The virtual assistant we've been waiting for, one that's worth donning a pair of glasses or a visor for because it's that helpful, is finally here, he says. And of course, it's all because of advancements in artificial intelligence, particularly generative AI, which has upended the technology industry since the arrival of OpenAI's ChatGPT in 2022. "We have had attempts here before, but one thing that was clear was that the technology wasn't quite ready," Samat said. "The other thing, though, is we never stopped believing in the vision around all this, and we actually never stopped working on it either."
After trying Google's prototype glasses and Project Moohan, which looks more like a traditional VR headset, it's easy to understand why AI has been the missing piece of the puzzle. While they aren't the first head-worn gadgets with virtual assistants, the experience is vastly different from what you'll get from using Siri on Apple's Vision Pro or the AI assistant in the Meta Quest. And, while Meta's Ray-Bans also have camera-assisted AI functions already, they're not as continuously aware yet as Google's demos were. The version of Gemini in Android XR isn't like the question-and-answer bots of years past. It's not just playing music on command, telling you the weather forecast, launching apps, reciting notifications or managing the temperature on your smart home thermostat. It's actually seeing the world around you and making observations, almost like an actual human. Google teased this approach back in May when it unveiled Project Astra, an earlier prototype of the glasses-based assistant. During the Project Moohan demo, CNET's Scott Stein explored a virtual version of the Spotify Camp Nou soccer arena in Barcelona through Google Maps. All it took was a simple statement like, "Show me the most famous goals made here," to pull up YouTube clips from soccer games played in that very location. When CNET's Lisa Eadicicco wore the prototype glasses running Project Astra, Gemini was able to provide her with cocktail ideas based on the bottles she'd been looking at on the shelf in front of her. This iteration of Gemini is more present and aware than other implementations that have existed so far. Once you activate Gemini on the glasses, it passively listens for requests, making it feel more like you're having an ongoing conversation rather than using the assistant for siloed commands. Tapping the glasses' arm pauses Gemini. "We were playing around with what these models can do using the phone, and the cameras on the phone, as a way of interacting with the world," Samat said. "And it was truly blowing us away, what was possible." There's a chance privacy woes will arise, just like they did with Google Glass, and for good reason. Concerns over the way tech companies handle personal data and the outsized influence these companies have in our lives have reached an inflection point in recent years as regulators crack down on tech giants. New devices like smart glasses and immersive headsets may only serve to deepen those worries in the future. In a broader press briefing, Samat said Google was working through "special privacy controls" for head-worn devices that the company will share more details about next year. "We fully understand and agree that [privacy] needs to be carefully addressed," Samat said to a group of journalists. "And in addition, I would say it's not just for the individual that has the glasses or the device on, but for the folks around as well." Ever since ChatGPT arrived in late 2022, there's been an AI gold rush to infuse more products with chatbots, conversational interfaces and generative capabilities. Smartphones are among the biggest examples; companies like Google, Apple and Samsung have introduced more natural-sounding virtual assistants along with new tools for creating images based on prompts and summarizing text. Google also brought a feature called Circle to Search to Android in 2024, which lets you launch a Google search just by circling something on your device's screen.
It's bringing a real-world version of that to Android XR, too. But these software features haven't meaningfully resonated with consumers yet, nor have they been enough to move the needle in terms of smartphone sales. AI-first gadgets like the Humane AI Pin and Rabbit R1 handheld device, both of which are meant to primarily be interacted with via voice, also fell flat this year after failing to live up to expectations. Glasses and headsets have the distinct advantage of being placed right within your line of sight, which could make them more intuitive and appealing than smartphone-based efforts, Google and Samsung believe. Even though Google, Samsung and Qualcomm announced their partnership back in February 2023, progress in AI caused them to reevaluate their approach in the middle of development, according to Samat. "With glasses, it really is the opportunity to engage with an AI assistant in the real world," he said. "We think that application can be what email and texting was for the phone, for glasses. We think that that's sort of a killer app in many ways." That vision will start with headsets, not glasses, given that these larger, visor-style devices can provide a foundation that glasses can eventually use. "There's perception technology, there's world locking, there are a number of things that we feel like made sense to focus on with the headset and get right," Samat said. If you look at the history of Google's product launches over the past decade, you can see the fragments of Android XR coming together. The glanceable widgets on Wear OS smartwatches (a platform it also created in collaboration with Samsung), Google's history of bringing augmented-reality based features to smartphone apps and the voice-first interactions on smart speakers like the Nest Audio -- it feels like there are pieces of them all in Android XR. Gemini is the thread tying them together. Wrist-worn wearables have an especially important role to play in Android's expansion into mixed reality. While smartwatches are inherently different from smart glasses, they're both optimized for showing bite-sized bits of information in a quick way. They also require computing to be split between the device itself and your phone. That's why the Wear OS and Android XR teams closely collaborate, Samat says. Smartwatches and smart glasses seem poised to work together. It's easy to imagine a future in which your smartwatch enables new types of gesture inputs, or perhaps works in conjunction with your glasses to display health and fitness statistics, the latter of which Samat hinted could be a potential use case in the future. None of those scenarios are a reality yet, and Samat couldn't say when or if they will be. But that's part of why Google is providing the glasses to testers, to get feedback on things like how watches should interact with glasses. "I think there'll be some awesome use cases between them," Samat said. Android XR is designed to span a spectrum of different types of devices, much like the version of Android that runs on phones. So it makes sense that not all apps will look the same. Samat says they'll generally be broken down into three tiers, the first being two-dimensional apps, which are essentially enlarged versions of the apps already available in the Google Play Store. Then there are spatialized apps, such as a media app that distributes certain elements of the app, like the comments section on YouTube, for example, in the space around you for easier viewing. 
Fully immersive apps, on the other hand, would be built specifically with mixed reality in mind. Google Maps in 3D, for instance, which CNET got to try in mixed reality, extends a 3D map landscape out that you can dive into, or flip back into a standard 2D mode. As is the case with phones, you can expect companies to put their own spin on Android XR depending on their device. Samat says device makers will be able to bring their own services and assistants to Android XR, although some elements of the operating system will remain consistent across devices for ease of use. There's a lot at stake for Google, Samsung and Qualcomm when it comes to getting this right. Google has already run into its fair share of snafus with its AI technology, particularly around errors in its AI Overviews tool in search. As devices like the Rabbit R1 and Humane AI Pin have proved, making a first impression is important. People likely won't be willing to spend their time and money on a gadget they find underwhelming, superfluous and expensive. Perhaps the biggest question looming over Google, Samsung and Qualcomm's mixed-reality endeavors is: In a world in which we're glued to our phones, is there room for yet another gadget in our lives? Samat thinks the answer is yes, but it'll be a gradual shift. Smart glasses won't replace phones, primarily because they'll still need to rely on them for some computing power in the near term. But he does think the immediate benefit will become obvious right away, particularly in that smart glasses can make us feel more connected and less distracted. "It should give you some superpowers in places that you didn't have them before," he said. "And you're like, 'Wow, this is like, better than my smartphone.'"
[32]
I Tried Google and Samsung's Next-Gen Android XR Headsets and Glasses, and the Killer App Is AI
I've used many VR and AR headsets and had a lot of experiences. But I've never worn one with an all-seeing, all-listening AI companion by my side until this week when I got my first taste at Google's headquarters in New York City. Android XR, which is available in early form for developers now and will fully launch in 2025, promises a whole OS for headsets and glasses of all types and is a bridge to Android phones. But its killer app, the one Google is clearly banking on, is its AI, Gemini. From what I've seen, it's a heck of a sign of how much headsets and glasses will be changing in the next few years... but I still have a lot of questions about how it'll fit into everyday life. What I remember most clearly afterward, my head buzzing with an hour's worth of demo memories in headset and glasses, is wandering through worlds with the persistence of an AI companion. For instance, I was standing on a 3D map of my own neighborhood, my home below me. I pinched and zoomed, hovering over my roof, until I could see the horizon and some buildings a few streets away. I pointed to them. "What's that building over there?" I asked. "That's the high school," Gemini said, identifying my town's school name. I got closer and asked about the municipal building next door too. We explored my town together, Gemini and I, in a new Samsung mixed-reality headset that feels a lot like Apple's Vision Pro headset. But as I asked Gemini to take me to other places -- beyond Maps to Chrome or YouTube, where it helped me recognize things in videos, or narrated scenes on the fly -- or even pointed out and searched for things in the real world in a constructed living room space at Google's New York headquarters -- I started to lose track of what app I was in. Gemini was always with me, though. And after a few demos, Gemini even told me what I'd done and jogged my memory in case I forgot. A lot of it starts to feel like those dreams of sci-fi assistants, and that's not accidental. Google's President of Android Ecosystem, Sameer Samat, equates multimodal AI to a "Tony Stark" moment: "What these [AI] models can do using the cameras on the phone as a way of interacting with the world, it was truly blowing us away. Wouldn't this be perfect for a pair of glasses?" During a long exclusive conversation with Samat, it became clear that AI has motivated Google to rewrite its future AR/VR plans and re-enter a space it walked away from years ago after ending support for Google Daydream. And yes, Google and Samsung have a lot of AR/VR plans for 2025: Android XR will launch then, and so will Samsung's headset. But Android XR will also work with Android phones and other headsets and glasses ranging from VR to AR to Meta Ray-Ban-like smart glasses. Glasses are very much on Google's roadmap. I also got multiple demos of display-enabled, Gemini-equipped smart glasses from others, each with floating head-up displays. These glasses, part of Google's AI initiative code-named Project Astra, are part of what's coming next. It's a lot to take in. But it's all a massive taste of what will probably be a big shift to AI living on XR very soon. It's fascinating stuff and also a lot to digest. I've said it for years: the big missing piece in VR and AR has been our phones. To this point, iOS and Android haven't been deeply connected with VR and AR headsets and glasses.
But Android XR, a new platform launching in 2025 that was announced for developers Thursday, will open that all up. Starting with Samsung's Vision Pro-like mixed reality headset, Google aims to create a universe of glasses, goggles and headsets to interconnect with Google Play, run multiple 2D apps at once and use Gemini AI. Google's making AI a big part of the reason for Android XR, and its biggest feature. In that sense, it's already different from Meta and Apple, which have slow-played AI in VR and AR so far. Apple Intelligence still hasn't emerged on Vision Pro yet, but is likely to arrive next year. While Meta has generative AI already running on its Ray-Ban smart glasses, the Meta Quest VR headsets don't have a lot of AI tools baked in yet. Android XR is only in its early stages, a preview form for early partners to start getting used to. Google is working with Samsung first as its starting hardware partner, with the mixed reality headset I got to briefly try being the first product next year. Samsung is also making glasses, which we don't know much about yet... in the meantime, Google also has its own in-house smart glasses called Project Astra (I got to try those too). There will be other partners and other products: Xreal, which already has a wide range of display glasses and a new set of AI-ready Xreal One glasses, is one of them. But for the year to come, it'll mostly be about Google and Samsung, with hardware using chipsets made by Qualcomm. Even though Android XR's starting point is a high-end VR headset, the endpoint is a whole range of products to come. "This is not just about one product," says Kihwan Kim, executive VP of Immersive technologies and hardware at Samsung. Kim sees it as the foundation for what will be a range of devices, including glasses. "This is more like the route to making this market," Kim says. Meta's Orion glasses, which I saw earlier in the fall, are years away from being a reality but show what AR glasses could be. No one is there yet, though, and Google, like everyone else, is splitting the difference to get there. "We have this kind of parallel approach," Shahram Izadi, Google's VP and GM for XR, says about the headset/glasses strategy. "One starts with a lot of capabilities, one starts with limited capabilities, but you're locking on form factor. Most people are attacking these two vectors to get to all-day wearable AR glasses." I was one of only a few people to get an early hands-on with Samsung's Android XR headset and only got to wear it for a half hour or so. It's called Project Moohan, and Google didn't allow me to take any photos or videos of my demos or the mixed reality headset. The feel of the hardware is very familiar: It has the fit and feel of a Meta Quest Pro but the video quality of an Apple Vision Pro. The headset's clear lenses and visor-like design perch over my forehead, resting over my eyes without needing a face piece pressing in. The headband design tightens in the back, and it's lightweight, but there's also a tethered battery pack, much like the Vision Pro, which I tuck in my pocket. Google outfitted me with prescription lenses for my demos, which was a huge help since the headset doesn't seem made to fit over glasses. The hardware has both eye tracking and hand tracking, just like Vision Pro, and uses a color camera passthrough of the real world overlaid with what's shown in VR on the headset, creating mixed reality much like Meta's Quest 3 or Vision Pro. 
Project Moohan is what Google and Samsung started with awhile ago, before a rapid rise in generative AI interest and capabilities that, according to Samat, led the team to pivot toward an agent-based Gemini system that would work on both headsets and glasses. But Moohan is the starting point that Google feels can cover enough bases of interaction, Google Play app compatibility, AI and interface that it can fuel ideas in a wave of other, smaller glasses that may not have all these features eventually. Tapping on the side of the headband opens up a grid of Google Play apps, much like how Vision Pro (or my Meta Orion demo) worked. I can pinch open apps with my hand by casting a pointer in space, and app windows can be dragged around by the edges and expanded in size. A top button on the headset can bring me back to the home screen, which includes an immersive 3D landscape that, again, is very Vision Pro. Google's demos were all of Google apps, several of which aren't on other headsets yet, namely, Maps and YouTube. Google Maps starts in 2D but can launch a full 3D view that feels like the Google Earth experience I tried in VR years ago. Landscapes spread around magically, with searchable locations studded throughout. Google's also adding full 3D-scanned locations over time using a technique called Gaussian splatting, which knits 2D photos into realistic (but a bit hazy) walkable rooms. I popped into a scan of Scarpetta, a New York restaurant, and entered the dining rooms. I've seen these types of scans at Meta and via companies like Varjo and Niantic, but it's fun to see them knitted into Maps. YouTube feels like a standard viewer with pop-out panes for comments and metadata, but it can also play immersive 3D, 180- and 360-degree videos that have been on YouTube for years. There's another trick too: Google is using AI to convert 2D YouTube videos into 3D. It didn't look bad, and more impressively, it works with home videos in the Photos app, too, along with 2D-to-3D photo conversions. Apple already converts 2D photos to 3D in Vision Pro, but the video trick is a next-level move for immersive memories. I also dragged my Chrome browser over to a table to demo how swapping to mouse and keyboard from hand tracking would work, and the transition was pretty seamless; the mouse cursor moved all around the room, not just in the browser window. When I lifted my hand off the mouse, hand tracking was instantly back in action. My demo didn't have eye tracking enabled (maybe because of my prescription inserts), but the headset and Android XR are made to adapt to whatever inputs are available: hands, eyes, voice or things like keyboards, mice or connected phones. (The headset does have automatic eye distance adjustment, by the way.) There's no price or release date for Samsung's headset, or even an official name -- Moohan refers to "infinity" in Korean -- and it's available only for developers now. But it feels like a very real product, and it runs off Qualcomm's XR2 Plus Gen 2 chips announced back in January. But again, it's the Gemini AI that feels like the special ingredient right now. My demos were pretty contained in a pre-set space with preconfigured apps, but Gemini seemed like a pretty compelling magic trick. The magic continued on glasses in another room. Samsung's next product will be smart glasses, with more details coming in 2025. But these glasses don't exist yet. 
Instead, Google is currently experimenting with its own in-house glasses, part of an AI initiative called Project Astra, which are being field-tested now to get feedback on how they work and feel in public. The second room I entered had a number of pairs of these glasses, one outfitted with a temporary pair of makeshift prescription inserts for me. The glasses look pretty normal, lightweight and wireless (like Meta's Ray-Bans), with a camera onboard and speakers and mics in the arms, along with a few input buttons. These glasses have a single display in the right lens, projected in via a Micro LED chip on the arm onto etched waveguides on a small square patch on the lens glass. They feel like a modern riff on Google Glass, but with much better tech. The display mainly shows text: directional information or captions of whatever Gemini might be saying to me through the speakers. I wandered around the room, looking at books on the shelf and asking about them (Jeff Vandermeer's Absolution, for instance, which I asked about and whether I needed to read the other books first). I opened up a Yuval Noah Harari book and asked Gemini to summarize what was in front of me. I also had them translate a poster on the wall. Meta's Ray-Bans can already do this too, but Gemini, once invoked, stays active and doesn't need additional prompts. Instead of always re-activating, I keep it on... and pause it by tapping on the side of the glasses when I want an Assistant break. I also got to demo a live translation, where someone else in the room approached me to speak in English and then Spanish. Everything she said to me was auto-captioned in the head-up display, and it all kept being delivered in English even when she changed languages. Another short demo showed where the tech aims to go next: A pair of glasses with dual displays gave me simulated map information, and when I looked down, I saw a 3D map appear to guide my orientation and show me what street I was facing. Looking up and spinning around, I saw a map appear when I was in motion, then fade away when I was still. I also saw a brief video clip to show the potential resolution of the display; the micro-LED color and pixel density looked really good, but the square field of view was pretty small. Google sees it expanding over time, but it was notably smaller than the Meta Orion prototype, Xreal's glasses and even Snap's developer Spectacles. Then again, right now, Google and its hardware partners like Samsung may be taking small steps for how much visual detail is even delivered on these glasses without feeling like an interruption or unsafe to use in public while walking around. Meta sees headsets and glasses as two parallel classes of products like PCs and phones, and Google feels the same way. "You'll probably use the more immersive products akin to laptops. On the glasses side, these are more like smartphones of the future or wearable devices of the future like watches or buds. So you have to support both," Izadi says. Through all of these demos, Gemini's one-tap-away readiness was a constant. That's clearly Google's push here by design. But it's also the most eye-opening, surprising part of everything I experienced. AI, no matter what your concerns about it might be, can be extremely helpful in a headset or glasses where inputs like a keyboard or touchscreen are harder to access. I use Siri a lot more in Vision Pro or with AirPods. Meta's Ray-Bans use voice as a deeper way to control things too. 
However, current VR/AR devices have limits to how aware the AI feels. Gemini, because it can see everything you're seeing in real time, feels like it's a buddy... and maybe not one you'll always want. I found Gemini bubbly and friendly at first (it said "Hi!" and I awkwardly said "hi" back) but then settled into a listening mode where anything I said could be interpreted as instruction -- there's no "hey Gemini" prompt. That makes things helpful but also intrusive. The way to stop it is to pause it or turn it off again, which feels like the reverse of how AI assistants work now: Instead of tapping to invoke, you tap to stop it. No doubt, Gemini will have limits on how much it can run continuously on small glasses, just from a battery perspective. According to Google, in mixed reality VR like Project Moohan, Gemini works as a layer that uses casting to interpret everything it sees. It can even be used while playing games, though there might be a bit of a performance hit. The advantages could be how it can continuously break the fourth wall of mixed reality, in a sense: I could "circle to search" things in Chrome and have responses pop up, or pull 3D objects into my world on command, or jump from app to app as I request a location or a video or ask to play music from an album I see in front of me (which happened during my demo). Samsung's Kim suggests I could get tutorials while playing games, for instance, if Gemini sees what I'm doing in the headset or even with glasses. And, of course, it can remember what I was doing, too, and when. Although, when I asked Gemini to recognize my colleague Lisa Eadicicco in the room with me, it said it couldn't be used to identify people (yet). Google's already laying out wide-sweeping plans for Gemini 2, just announced, to be an agent-like system that works across devices. Adding camera feeds into the AI input mix also means more data to collect and train with. It won't just be on headsets and glasses, and Google isn't the only company pursuing this vision. The implications are vast. "The assistant is coming with you, whether it's your glasses, your headset, your phone, your watch," Izadi says. Would I want Gemini to see everything I'm doing? No, of course not. Microsoft experimented with an always-on Recall AI mode in Windows before delaying it after backlash. How Google will handle that dance between always-helpful and privacy-invasive is unclear, although Google promises that video feeds used for AI recognition are kept private and local. One big thing seems clear, though: With Android XR, all sorts of headsets and glasses will be able to bridge into phones more easily than before. That could allow a whole host of otherwise isolated products to feel more knitted together in a way that Apple and Meta haven't done yet (although the exact steps for how that happens aren't clear for Google, either). Google's Samat points to Samsung being the first partner to co-explore the software, but Qualcomm's existing Snapdragon Spaces software, which bridges phones to glasses already, will also be compatible and part of Android XR. Google's also enabling WebXR and Unity tools to work with Android XR, and existing 2D Google Play apps will all run in Android XR, as long as a developer agrees to opt in to listing them there. Individual hardware makers should be able to customize their own software and tools and still connect to Google Play, but what about putting Google's already widespread services on other devices too? 
Right now, Google isn't offering any specifics there, but having XR Maps and YouTube -- and Gemini -- on Quest and Vision Pro headsets and elsewhere would be helpful. It could also change the way developers envision future VR and AR apps. "While we are looking to bring existing games like Demeo to Android XR, the platform also opens us up to develop entirely new ideas," Tommy Palm, head of Resolution Games, a developer that's made games for lots of existing VR/AR hardware, tells CNET. "Android XR's open nature, developer-friendly approach and unique innovations make it not only viable but allow us to consider new and novel ways to use mixed reality for storytelling. For instance, the natural language interface of ChatBots could be a very potent extension for XR and games." These moves are early, but they're also pointers to what's happening next. Apple and Meta will undoubtedly have more AI services in AR and VR in the years to come, and Apple will likely find ways to let Vision work with iPhones. Or they need to. Google's plans make a lot of sense, and they'll probably let headsets and glasses work as true peripherals to phones and, eventually, with watches too. With three partners in the equation -- Google, Samsung, and Qualcomm -- and other manufacturers too - it could get messy. But it's also the unifying progress that an already fragmented future landscape needs. We'll know more about what's really happening in 2025, which isn't very far away at all.
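The article above notes that Android XR will support WebXR alongside Unity and existing Google Play apps. WebXR is an established web standard, so a headset browser could, in principle, run immersive web content through it. The sketch below is a generic WebXR session request in TypeScript, not Android XR-specific code; whether a particular headset browser grants features like hand tracking is an assumption here.

```typescript
// Generic WebXR sketch (the standard web API, not an Android XR-specific SDK).
// Requests an immersive VR session, sets up a WebGL layer, and logs the viewer
// pose each frame. Hand tracking is requested only as an optional feature.

async function startImmersiveSession(): Promise<void> {
  const xr = (navigator as any).xr; // WebXR Device API, where the browser exposes it
  if (!xr || !(await xr.isSessionSupported("immersive-vr"))) {
    console.log("Immersive VR not supported in this browser.");
    return;
  }

  const session = await xr.requestSession("immersive-vr", {
    optionalFeatures: ["hand-tracking", "local-floor"],
  });

  // WebXR needs a WebGL layer before it will render frames to the headset.
  const canvas = document.createElement("canvas");
  const gl = canvas.getContext("webgl", { xrCompatible: true } as any)!;
  session.updateRenderState({ baseLayer: new (window as any).XRWebGLLayer(session, gl) });

  // "local" is always available for immersive sessions; "local-floor" may not be granted.
  const refSpace = await session.requestReferenceSpace("local");

  const onFrame = (_time: number, frame: any) => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      const { x, y, z } = pose.transform.position;
      console.log(`Viewer position: ${x.toFixed(2)}, ${y.toFixed(2)}, ${z.toFixed(2)}`);
    }
    session.requestAnimationFrame(onFrame); // keep the frame loop alive
  };
  session.requestAnimationFrame(onFrame);
}

// The WebXR spec requires a user gesture to start a session, e.g. a button click:
// document.getElementById("enter-vr")?.addEventListener("click", startImmersiveSession);
```

Because this is the same API that already runs on Quest and other headset browsers, it gives a rough sense of how existing immersive web content could carry over to Android XR devices without a native port.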
[33]
I saw Google's plan to put Android on your face
Google is no stranger to augmented reality. Google Glass crashed and burned with the public more than 10 years ago before being repurposed for enterprise users and eventually discontinued. But things are different now. Apple has the Vision Pro. Meta has the Ray-Ban smart glasses, and their AI features have garnered positive buzz. That's why Google is jumping back into the fray with Android XR. Google wants everyone to know the time is finally right for XR, and it's pointing to Gemini as its north star. Adding Gemini enables multimodal AI and natural language -- things it says will make interactions with your environment richer. In a demo, Google had me prompt Gemini to name the title of a yellow book sitting behind me on a shelf. I'd briefly glanced at it earlier but hadn't taken a photo. Gemini took a second, and then offered up an answer. I whipped around to check -- it was correct.
[34]
Gemini looks right at home in Samsung's latest mixed reality demo with Android XR
Google Gemini is expanding into VR with Samsung's upcoming mixed reality headset, showing off its AI capabilities. Tech commentator Bilawal Sidhu highlighted Gemini's role, powering various use cases with Google Maps on the user-friendly Android XR platform. Google's integration of multimodal AI into everything, like Gemini in Android XR, is making waves in the mixed-reality space with user-friendly commands. Today was a big day for Google Gemini, because it just expanded into the world of VR. Google showcased its new Android XR platform for mixed-reality headsets, using a new Samsung VR headset, codenamed Project Moohan, with Gemini AI running on it. The AI looked right at home on the headset, which is running on Google's newly announced Android XR platform. Tech commentator Bilawal Sidhu shared the first glimpse we've had of Project Moohan during a special live demonstration of Android XR. The demo showcased Gemini's ability to power different use cases, with apps like Google Maps taking center stage alongside the fluid and user-friendly interface of Android XR. Gemini's role in Android XR points to Google's big plans. Google's broad strategy is to integrate multimodal AI into everything. Gemini is the current consumer-facing version of that vision, built to process text, images, and more. This makes it well-suited for mixed-reality applications, where simply summoning Gemini to perform tasks is user-friendly and practical. We just learned that Google is working on a "Hey, Gemini" voice command to summon Gemini, and then we saw it in action in this video. The entire platform closely mirrors the Project Astra demo we saw at Google I/O earlier this year, but this time those concepts are real, with real hardware and software, thanks to Samsung. We only started hearing about Android XR recently, mainly because it's built with Gemini AI at its core, and, well, Gemini hasn't been around too long. Project Moohan is Samsung's long-rumored VR headset. We finally got to see both of these developments merged together at this live demo. Samsung's headset is expected to launch officially in March 2025. The company will likely build on this demo with even more integrations and features, and we've even heard a developer version is on the way. The merging of multimodal AI with virtual reality makes this an exciting time for Android.
[35]
I Tried Google and Samsung's Android XR Headset and Glasses: Gemini AI Can See My Life Now - Video
Android XR, coming in 2025, is arriving along with a Samsung mixed reality headset and glasses after that. I tried a ton of demos, and the always-on AI that can see what I'm seeing is amazing and full of unanswered questions. Google has announced Android XR, Samsung has a mixed reality headset for Google, and there are glasses in the works too. This is a lot to digest and I just got out of the demo of this stuff. Let me talk to you about it. Google's been promising a collaboration with Samsung and Qualcomm for a year, now talking about this future of mixed reality, and we already have Apple in the landscape, we already have Meta in the landscape, and we also have other players as well. Now, Google is back in the landscape. They had been out of VR and AR for the most part and they're back. Android XR is a new platform, and it's going to work with phones and it's going to work with other things, and it's Gemini-powered in that Gemini AI is going to run throughout it, not only on mixed reality headsets, but Google says it'll be on AI glasses, AR glasses and all sorts of other emerging wearables for your face. Now, what I got to try, because right now it's only for developers, is a headset made by Samsung that's a mixed reality headset, and also a pair of notification glasses that Google calls Project Astra. These have been in the works for a little while and they're gonna start being field tested soon. Now, bear with me because I'm gonna be talking a lot, and we don't have any photos or video of my demo because they were not allowed. That's pretty standard issue for early VR and AR in my experience. So, first of all, Project Moohan, which is the name of the mixed reality headset for now that Samsung is going to release next year, looks a lot like a Vision Pro, or it looks like the Meta Quest Pro, if you remember that headset. It's a visor-type headset that fits on your head. It's got lenses over here that can block out light if you want, and it tightens in the back. It has a nice crisp display, it has hand tracking and eye tracking. That's not the exciting part; a lot of that feels very familiar. But when I tried demos in this headset, bringing up 2D apps in a very Android-like interface, I was able to bring up familiar Google apps and navigate using my hands and bring up different panes. But I could also turn on Gemini to be my companion throughout the whole journey. Now, you already have things like Siri working on Apple Vision Pro, and there are some voice AI things on the Meta Quest. But this is something that can also see what you're doing, and why that gets fascinating is I could actually point to something and say, hey, what is that? Or tell me more about this, and move my hand there, and Gemini will tell me. And what I found I was able to do is browse the web. I was able to ask it a question about something or ask about where I live. It pulled up a map. And Google also has immersive apps for this. At the moment, YouTube and Maps were the most notable; Apple doesn't even have an immersive Maps app for the Vision Pro yet. Maps allowed me to throw this big 3D immersive environment up and zoom into my home. But also I was able to point out things in the landscape and ask about them, and Gemini could tell me what those were, and that combination got really interesting. I felt like I was starting to get lost in the apps and navigate around. Gemini can not just recognize what you're doing; it can also be your memory.
When I got done with a whole bunch of demos, I asked Gemini to recap what I'd been doing, and it told me all the different things I'd been doing. And it has tricks I haven't seen before: I played a YouTube video and Gemini improvised captions for the YouTube video. Google has a few other tricks up its sleeve with Android XR. One of them is auto-converting 2D photos to 3D, which is something that Apple also does now with the Vision Pro, but they also do it for videos. I looked at a video clip on the headset that was already pre-converted, Google said, but I also watched a YouTube clip that had been converted. I didn't get to see the live conversion process, but it was pretty wild to see that that capability is going to be there. So that's all the mixed reality stuff, which is going to be on this developer headset that is meant for people to start getting a feel for it and build a kind of starting point for everything else. But then there are the glasses. I put on these glasses that look like Meta's Ray-Bans, kind of chunky but pretty normal and wireless. And there was one display on these glasses, a little area here etched in with waveguides and a micro-LED display projector over here that was able to give me a heads-up display. What was I able to do with that? A whole bunch of stuff. I could activate Gemini with a button here and turn it into kind of an always-on Gemini mode: identify works of art on the wall in a fixed living room set, answer questions about objects. I had questions about books. I was able to again indicate certain things and have it tell me about them, and translation: you could translate things and then also ask for the language back. And what I found fascinating is they also do live interactive translations. So somebody in the room came up and spoke to me; it instantly brought up captions for what they were saying to me in English, and it kept captioning in English even when they started speaking in Spanish. All of this stuff assumes that you're gonna have an always-on Gemini assistant in your life. Now, with conversational responses, that could feel very intrusive, but on the glasses, you're able to pause it by simply tapping on the side and basically suspending it. But it was very different than something like Meta's Ray-Bans, which assume you won't be using it until you ask for it. Here, it was like it would assume you have it on until you want to pause it. And the same thing goes for VR and mixed reality. Am I gonna turn on Gemini and then suddenly have this run all the time? That could really impact the experience of how I use apps. It's fascinating for things like, let's say, I'm playing a game and I want a tutorial or I have questions about something. It kind of breaks the fourth wall of VR and AR, and it may change the way that apps are developed. But let's back up a step. This is all for developers right now, and Android XR is gonna be unveiled in a fuller state next year, according to Google, and Samsung is going to announce and go into more details on what its potentially very expensive mixed reality headset will be. Again, it's expected to be kind of like a Vision Pro, and a lot of people may not necessarily want to buy it, but it's going to be there. The other thing that's really interesting is where all the glasses go, and that could be the year after, starting with these assistive types of glasses. There are already other partners in the works like Xreal; we just saw some glasses made by them.
Google is gonna have a lot of different partners making AI and AR glasses that work now that they're gonna be working with phones. Well, the ball is gonna be in Apple's court. When does Apple introduce generative AI to the Vision Pro? And when will Apple start allowing connections to the iPhone for Vision products? Because right now that doesn't happen. But I think it's really key for what comes next, and it looks like it's going to make the idea of everyday glasses feel a lot more instantaneous. But we don't really know about battery life yet, so that is one wild card here that could be pretty significant. That's a lot to digest, and I'm still digesting it in my head. If you have questions, which I'm sure you do, leave them below and I'll be following up with more stuff as we learn more about this. But I was just glad to get the demo, and I'm asking a lot more questions about what AI is going to be in VR and AR than I ever had before. Thanks for watching.
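The 2D-to-3D photo and video conversion mentioned in the transcript above is, in general, done by estimating a depth map for each frame and then synthesizing a second viewpoint from it. Google hasn't detailed its own method, so the relation below is only the standard stereo geometry that such conversions typically rely on, not a description of Google's pipeline; f and B are a virtual focal length and interocular baseline chosen by the converter.

```latex
% Typical 2D-to-3D conversion: an estimated depth Z(x, y) is turned into a
% horizontal disparity d, and each pixel is shifted by \pm d/2 to synthesize
% a left/right view pair (occluded regions must then be filled in).
d(x, y) = \frac{f \, B}{Z(x, y)}, \qquad
x_{\text{left}} = x - \frac{d(x, y)}{2}, \qquad
x_{\text{right}} = x + \frac{d(x, y)}{2}
```

The quality of the effect therefore hinges almost entirely on how good the per-pixel depth estimate is, which is where the AI part of the conversion comes in.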
[36]
Here's All the Cool Stuff Android XR Wants to Do
Google's big Android XR reveal gave us a first look at both its own pair of XR glasses, as well as Samsung's upcoming headset that you'll be able to buy. For now, none of us are able to put on these headsets and make them a part of our daily lives, so Google has instead provided several video demos that they hope to make a reality. If you missed that bit of news, you'll want to read this story to catch up. For those who did, it's Android XR demo time! Android XR Glasses Demos: The most impressive set of demos is without a doubt those that showcase Google's vision for its Android XR glasses. These look like normal glasses you might wear throughout a day, without the bulk, goofiness, or awkwardness of some other entries in the space, like the old Google Glass from years ago. Google has this idea that you would wear these glasses and then interact with Gemini for help with just about anything you might ask your phone for. In this first video, we see a person ask Gemini for an update on their group chat, which prompts a further conversation about the vegan options at a restaurant they are discussing (with reviews) and a quick-share of the outfit they'll wear that night, all through the glasses they are wearing. They then ask for directions to a card shop and see the results with step-by-step directions projected through the lenses on the screen. In this next video, we see someone ask Gemini for help in hanging shelves in their home. They get style or layout ideas and step-by-step instructions, but Google also wants you to believe that your glasses will help you find misplaced tools because Gemini will remember where they are. That's a bit wild for my tastes. And in this final Google glasses video, we see someone using the glasses to translate a menu through the glasses' lenses and then translate the conversation from the restaurant worker. It all supposedly happens fast enough to reduce awkwardness between the two people, but you can obviously see how handy this might be when traveling to countries whose language you aren't familiar with. Android XR Headset Demos: For Samsung's headset demos, these might be slightly less interesting, and that's probably only because we've seen this idea from Apple already with their Vision Pro product. Google and Samsung first show how an immersive YouTube experience might look as you launch a 3D rendering from the seat of your couch. The next video brings us the view of an office with multiple Chrome windows present and multiple tabs open, followed by a Circle to Search action that brings up a 3D rendering of the pair of soccer boots they searched for. They get an AI overview from search and an interactive view of the cleats. It's certainly a cool visual experience, even if I'm not sure I'll ever be sold on interacting with virtual browser windows for more than a quick search. In this next demo, you get to see what Google TV could look like, with the idea being that an XR headset from Samsung could essentially bring you into a theater. Who needs a $3,000 TV when you can simply put on a headset, eh? Finally, Google Photos will be a big part of Android XR, with immersive photos and videos that almost live within your space. This is all very cool stuff that we don't have specific dates on just yet. I could see both device types being worth owning, assuming all of the partners involved can bring the right amount of applications, games, and usefulness.
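The live-translation demos described above (a translated menu, then a captioned conversation with a restaurant worker) follow a familiar pipeline: continuous speech recognition, translation of each recognized phrase, and display of the result as captions in the lens. The sketch below illustrates that pipeline with the browser's Web Speech API and a hypothetical translateToEnglish() helper; it is a generic sketch under those assumptions, not the Gemini or Android XR implementation.

```typescript
// Generic live-captioning sketch: continuous speech recognition feeding a
// translation step, with results rendered as caption text. Uses the Web Speech
// API where available; the translation endpoint is a hypothetical placeholder.

async function translateToEnglish(text: string): Promise<string> {
  // Hypothetical endpoint; a real build would call an actual translation service.
  const res = await fetch("https://example.com/translate?to=en", { method: "POST", body: text });
  return res.text();
}

function startLiveCaptions(captionEl: HTMLElement): void {
  const SpeechRecognitionImpl =
    (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
  if (!SpeechRecognitionImpl) {
    captionEl.textContent = "Speech recognition not available in this browser.";
    return;
  }

  const recognizer = new SpeechRecognitionImpl();
  recognizer.continuous = true;      // keep listening, like the always-on glasses demo
  recognizer.interimResults = true;  // show partial phrases as they arrive

  recognizer.onresult = async (event: any) => {
    const result = event.results[event.results.length - 1];
    const phrase = result[0].transcript.trim();
    if (result.isFinal) {
      // Translate completed phrases and show them as the caption line.
      captionEl.textContent = await translateToEnglish(phrase);
    } else {
      captionEl.textContent = phrase; // interim text shown untranslated
    }
  };

  recognizer.start();
}
```

One caveat worth noting: the demos described above detect the speaker's language automatically, whereas the Web Speech API used here expects a language to be configured up front, so automatic language switching is an extra step a real system would have to handle.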
[37]
Samsung's game-changing XR advancements turn sci-fi into reality
Samsung has announced something exciting: a first look at its mixed-reality (MR) headset, plus the launch of a new Android XR platform in collaboration with Google and Qualcomm, all set for a 2025 launch. As Samsung defines it, XR is an umbrella term covering technologies that use digital elements to extend or alter our reality by merging the physical and digital worlds. Some of the best VR headsets on the market aim to blend virtual reality with mixed reality, and new developments in smart glasses such as the XREAL One series are expanding what's possible with spatial displays.

The new Android XR platform will merge a full range of experiences from the Galaxy ecosystem and Google's suite of apps with AI, such as exploring the world through Google Maps or watching football games on YouTube. It is also said to be the first platform built entirely for the Gemini era and will be compatible with both VR headsets and AR smart glasses (is that a hint?). According to Samsung, this is one of its 'most ambitious endeavours yet', with cutting-edge capabilities that go beyond gestures or a controller, acting on voice commands and natural conversation. In other words, it will be a helpful AI assistant that you can wear on your face and talk to freely.

This isn't the first time Samsung or Google has dabbled in this space: the Samsung Gear VR from 2015, which required inserting your phone into the headset, was swiftly discontinued; the Samsung HMD Odyssey followed in 2017; and the Google Glass smart glasses never took off either. It sounds like the new headset designed for Android XR (code-named "Project Moohan") has no intention of flopping this time, with state-of-the-art displays and spatialized app formats that can take advantage of virtual spaces.

If I'm honest, I'm not too sold on the teased design of the upcoming MR headset, and I can't help but think it looks too much like the Apple Vision Pro. I can't picture it being worn in public either, but the company says it's lightweight and ergonomically optimised, so that's something. XR has quickly shifted from science fiction to tangible reality, and if we're lucky, we'll be drip-fed more information about Project Moohan ("Moohan" means 'infinity' in Korean) at the Samsung Galaxy Unpacked 2025 event anticipated for February 2025, alongside any Galaxy S25 series announcements.
[38]
Samsung Project Moohan Android XR headset unveiled
Samsung today announced "Unlock the Infinite Possibilities of XR With Galaxy AI," highlighting its collaboration with Google's Android XR operating system for future extended reality (XR) devices. XR, or extended reality, encompasses technologies such as Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), and other immersive technologies yet to be developed. XR merges the physical and digital worlds, enabling dynamic interactions across work, learning, entertainment, gaming, and wellness.

Samsung emphasized its commitment to enhancing user experiences through cutting-edge technology. "We believe now is the time to unlock the potential of XR through Galaxy AI," the company stated. By combining multimodal technology with the broader Galaxy ecosystem, XR offers intuitive, natural interactions that transform everyday life.

Samsung highlighted its open collaboration with Google and Qualcomm in creating the new Android XR platform. The company noted its long-standing partnerships in designing and optimizing technology, calling this initiative "one of our most ambitious endeavors." The platform will support headsets and glasses, providing rich content from the Galaxy ecosystem, Google apps, and third-party developers. With Gemini's conversational UI and contextual awareness, users will experience voice-driven, natural interactions beyond basic gestures or controllers.

Samsung also announced "Project Moohan," its first Android XR headset. The name "Moohan," meaning "infinity" in Korean, symbolizes the limitless immersive experiences envisioned. Equipped with advanced displays, passthrough capabilities, and natural multi-modal input, the lightweight, ergonomically designed headset promises seamless XR experiences through Google Maps, YouTube, and Gemini-powered trip planning.

Speaking about XR, Won-Joon Choi, EVP and Head of R&D, Mobile eXperience Business, said, "XR has rapidly evolved from a distant promise to a tangible reality. We believe it has the potential to unlock new, meaningful ways to interact with the world, resonating with everyday lives and transcending physical boundaries. We are excited to collaborate with Google to reshape the future of XR, starting with Project Moohan."

Commenting on Android XR, Sameer Samat, President of Android Ecosystem at Google, said, "We are at an inflection point for the XR, where breakthroughs in multimodal AI enable natural and intuitive ways to use technology in your everyday life... We're thrilled to partner with Samsung to build a new ecosystem with Android XR, transforming computing for everyone on next-generation devices like headsets, glasses and beyond."
[39]
This is the Moohan Project, the virtual reality glasses from Samsung and Google powered by AI - Softonic
The new mixed reality headset from Google and Samsung is sure to sound familiar to you. Google and Samsung have introduced Google's Android XR platform, and the first hardware to run this new platform will be the mixed reality headset called Project Moohan, created by Samsung. If you have looked at the photo of the new Project Moohan device, it surely reminds you of something: as TechRadar says, it is a mix between Google's Daydream VR headset and Apple's high-end Vision Pro.

Today, December 12, Google and Samsung presented the Android XR platform and the headset in the form of a development kit, although we don't know when it will be available for sale. Samsung and Google described Project Moohan as a device that will offer the full range of XR experiences, from fully immersive to mixed reality. This makes sense considering that Moohan means "infinite" in Korean. The headset features eye and hand tracking and will respond to voice queries, and Gemini, Google's generative artificial intelligence platform, will be at the center of Google's augmented reality system. This contrasts with Apple's Vision Pro, which, although it integrates Apple's Siri, does not have that level of awareness, at least for now.

Samsung did not comment on the key specifications of Project Moohan, declining to go into details such as price, weight, battery location, and imaging technology, other than stating that the display will be high resolution. From the photos and videos seen so far, it appears to be a gray headset with a single band and support foam around the back of the head, and a glass front surrounded by a thin chrome bezel.

What we do know is that the silicon powering Project Moohan comes from the third member of this partnership: Qualcomm. Samsung has not specified which chip or chips Project Moohan will use, but it is logical to expect it to incorporate the advanced Snapdragon XR2. Android XR will provide a visual experience within Moohan that should be familiar to anyone who has tried mixed reality on Apple's Vision Pro or Meta's Quest. Google is also focused on ensuring that its native applications, such as Gmail, Maps, Google TV, and YouTube, are ready for Android XR, and demonstrated how YouTube could offer an immersive video viewing experience.
[40]
Android XR software: Here are the VR games and AR apps coming to the new OS
Google finally unveiled Android XR today, and in the hopes of getting consumers interested in upcoming VR headsets and smart glasses running the OS, the company also shared some information about the software that'll be available on the platform. It also released some tools and documentation to help developers get started on bringing their VR games and apps to Android XR ahead of the launch of Samsung's VR headset next year.

What apps will be available for Android XR?

During our briefing, Google highlighted some of its partners who are working on apps for Android XR. These partners include Adobe, Calm, MLB, Concepts, AmazeVR Concerts, Mirrorscape, Naver, Resolution Games, 30 Ninjas, Spline, Tripp, Owlchemy Labs, and Virtual Desktop. We know that 30 Ninjas is working on an immersive film app, and we also know that Owlchemy Labs and Resolution Games are porting some of their games over to the platform. Many of the other companies already offer XR apps or experiences on other platforms, but it remains to be seen exactly what they'll release for the new Android XR platform.

Fortunately, Google says that all existing Android mobile apps will just work on Android XR. Apps built for large-screen devices, which is the case for many of the best Android apps, will work especially well in mixed reality due to the wider viewing area. Most Android apps will automatically be made available through the Play Store on Android XR, but developers who want to polish their app a bit before it's made available can use a manifest attribute to opt out. We sort of knew this would be the case when we saw news that the Google Play Store was getting ready for Android XR headsets, but what we still don't know is whether you'll be able to download Android apps without the Google Play Store.

(Image: The JetNews sample app is an Android large-screen app adapted for Android XR.)

Although most mobile apps will run just fine on Android XR, developers will want to update their apps to take advantage of the unique experiences that it offers. To assist with this, Google is launching a developer preview of the Android XR SDK today, which contains APIs to help developers spatialize their apps with "rich 3D elements, spatial panels, and spatial audio that bring a natural sense of depth, scale, and tangible realism." It also helps developers add "natural, multimodal interaction capabilities" to their apps, such as support for hand and eye gestures. The Android XR SDK is built on familiar Android app development tools such as Material Design 3 and Jetpack Compose. Apps that use Material Design 3 and Jetpack Compose for adaptive layouts can even opt in to have these components be automatically spatialized.

(Image: Apps optimized for large screens take advantage of sizing capabilities in Android XR.)

The SDK also enables developers to take advantage of some of the most popular open standards for XR devices, including OpenXR 1.1 and WebXR. These standards, respectively, make it easier for developers to support multiple XR platforms and to distribute XR experiences on the web. In the case of OpenXR, Google is also expanding upon it with its own vendor extensions to provide additional functionality like an AI-powered hand mesh feature.
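To make the "adaptive layouts" point concrete, here is a minimal Jetpack Compose sketch of the kind of width-aware screen Google is describing. This is plain, standard Compose code, not Android XR-specific API usage; the composable name and pane contents are hypothetical placeholders.

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.Row
import androidx.compose.material3.Text
import androidx.compose.material3.windowsizeclass.WindowSizeClass
import androidx.compose.material3.windowsizeclass.WindowWidthSizeClass
import androidx.compose.runtime.Composable

// Hypothetical two-pane screen: shows list + detail side by side when the
// window is wide (as on a headset's large virtual display), and collapses
// to a single column on a narrow phone window.
@Composable
fun ReaderScreen(windowSizeClass: WindowSizeClass) {
    if (windowSizeClass.widthSizeClass == WindowWidthSizeClass.Expanded) {
        Row {
            Text("Headlines")    // list pane
            Text("Full article") // detail pane
        }
    } else {
        Column {
            Text("Headlines")
        }
    }
}
```

An app already structured this way meets the large-screen bar the article describes, and according to Google, Material 3 and Compose adaptive layouts like this can then opt in to being automatically spatialized on Android XR, rather than requiring a rewrite.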
(Image: Chrome on Android XR supports WebXR features including depth maps, allowing virtual objects to interact with real-world surfaces.)

Developers who are interested in building or porting their apps to Android XR can get started by visiting its developer site. An Android XR Emulator is available in Android Studio that lets you visualize what your spatialized app will look like on real hardware. If you previously utilized Snapdragon Spaces to build XR applications, you'll want to wait for Qualcomm to release its compatibility plugin for Android XR in the first quarter of 2025, as it'll enable your app to "operate smoothly on both Android XR and devices with Snapdragon Spaces."

If your app requires a lot of computing power, which is often the case for games, then you'll need to test it out on real hardware. Google is inviting developers to sign up for its Android XR Developer Bootcamp next year to get access to prerelease hardware as well as direct support from the company. It's also asking developers who are interested in porting their VR games to check out its getting-started guide for Unity.

Yes, you can play VR games on Android XR

Gaming is and always has been one of the biggest reasons to own a VR headset, so Google couldn't afford to ignore it. That's why Google is partnering with Unity Technologies, the company behind one of the most popular cross-platform game engines, to bring support for the Android XR platform within Unity. This includes new, dedicated documentation on porting games to Android XR as well as optimizations to make games more comfortable and performant on the platform.

Developers who are interested in making games for Android XR can start building them now using the Public Experimental release with any version of Unity 6, including the Personal Edition. Some of the XR tools available in this release include hand and eye tracking, occlusion, foveated rendering, and Composition Layers. With these tools, developers can either create "fully immersive apps" where "players can step into entirely new environments that feel real and engaging," or they can go the simpler route and create "optimized apps" that incorporate some XR elements into existing Android mobile apps.

(Image: Vacation Simulator has been updated to Unity 6 and supports Android XR.)

Porting existing games from platforms that "embrace OpenXR standards" will be "especially easy," says Unity in a press release. Qualcomm says the same thing, telling developers that existing Snapdragon Spaces Unity projects can be "seamlessly" migrated to Unity 6, which can then be used to build for Android XR. In its press release, Unity cites Owlchemy Labs and Resolution Games as examples of game studios that are already working to port their Made with Unity projects to Android XR with "minimal effort." The company also said it was collaborating with 30 Ninjas, an immersive digital and interactive content studio, on a "new and innovative immersive film app that will combine AI and XR to redefine the cinematic experience."

Many existing XR games are built around controller input, which, thankfully, Android XR will support. Google says there will be a "defined controller interface" in Android XR and that the platform will launch with support for a variety of controllers, so games and VR content that need them will work out of the box.

The success of Android XR rides largely on the quantity and quality of apps and games that are available for it.
While it's good to see Google release so many tools and embrace so many open standards, that won't help if the hardware running Android XR isn't affordable or readily accessible. We won't know what Android XR hardware will cost until next year, though, which is when Samsung will release the first device running the new OS: its VR headset, code-named "Project Moohan."
[41]
Meet Project "Moohan," Samsung's Upcoming XR Headset - Phandroid
The continued consumer adoption of modern XR (extended reality) hardware and software in recent years has led many companies and manufacturers to come up with their own vision of how XR should work. No stranger to this particular product category, Samsung recently stated that it has been working with Google and Qualcomm on the development of a new XR product.

READ: Apple's Vision Pro Needs an Android Competitor

Currently referred to as "Project Moohan," Samsung says this will be the first headset designed specifically for Android XR. Named after the Korean term for "infinity," the headset will feature high-quality displays and passthrough functionality, in addition to multi-modal input and support for Google apps such as YouTube, Maps, and even Gemini. Sameer Samat, President of Android Ecosystem at Google, states:

We are at an inflection point for the XR, where breakthroughs in multimodal AI enable natural and intuitive ways to use technology in your everyday life... We're thrilled to partner with Samsung to build a new ecosystem with Android XR, transforming computing for everyone on next-generation devices like headsets, glasses and beyond.

With that said, this isn't Samsung's first foray into VR hardware. Years ago, the Samsung Gear VR used a Galaxy smartphone slotted into the headset as its main computing platform. It has been a while since it was discontinued, though, which makes Samsung's new announcement exciting. Going back to Project Moohan, Samsung didn't reveal too much about the upcoming headset, so details such as hardware specifications, pricing, and availability are unknown at the moment. Won-Joon Choi, EVP and Head of R&D at Samsung's Mobile eXperience Business, comments:

XR has rapidly evolved from a distant promise to a tangible reality. We believe it has the potential to unlock new, meaningful ways to interact with the world, resonating with everyday lives and transcending physical boundaries. We are excited to collaborate with Google to reshape the future of XR, starting with Project Moohan.
[42]
Samsung Unveils Its Moohan XR Headset to Rival the Apple Vision Pro
Samsung's Moohan XR headset will compete with the Vision Pro and Quest 3

Samsung on Friday unveiled its first extended reality (XR) headset, codenamed Project Moohan. The upcoming device was shown in a demonstration when Google showcased its new Android XR operating system, which is designed for mixed reality headsets and augmented reality (AR) glasses. Samsung's Moohan XR headset will offer support for Google's Gemini AI assistant and will arrive with apps that are optimised to run on a large, virtual display. It will launch next year as a competitor to the Apple Vision Pro and Meta Quest 3.

The new headset, dubbed Moohan (meaning infinity in Korean), runs on Android XR, Google's new platform designed to support features that rely on AR, virtual reality (VR), and artificial intelligence (AI). Google says it will be the first device to arrive with Android XR next year, although there's no word from either company on a release timeline or how much it will cost. The South Korean technology conglomerate says the Moohan XR headset will sport state-of-the-art displays and will offer passthrough capabilities, as well as support for multi-modal input. Its closest rival, the Apple Vision Pro, is equipped with Micro-OLED displays with a per-eye resolution of 3,660 × 3,200 pixels.

While we don't know much about the price of the Moohan XR headset or when it will be launched, the presence of Android XR confirms that it will support the new functionality shown off by Google when it unveiled the operating system. This includes the ability to use Circle to Search with gestures, viewing videos and photos on a virtual display using Google TV and Google Photos, and browsing the web using Google Chrome. It will also support features like live translation of text seen in the wearer's point of view and the ability to see immersive views of various locations using Google Maps.

Samsung first teased the arrival of an XR headset at its first Galaxy Unpacked of 2023, when it unveiled the Galaxy S23 series of smartphones. At the time, the company announced that it was partnering with Google and Qualcomm to build its first XR headset. On Thursday, Google announced that it had developed Android XR with Qualcomm and Samsung, but said that it was also working with other companies, including Magic Leap, on upcoming XR products that will feature AI and AR technology.
[43]
Samsung jumps into fray for extended reality devices
(Image: Samsung Electronics' Project Moohan, an extended reality headset / Courtesy of Samsung Electronics)

Will Project Moohan overcome the shortcomings of Apple's Vision Pro?

By Nam Hyun-woo

Samsung Electronics has teamed up with Google and Qualcomm to introduce its latest extended reality (XR) headset, code-named Project Moohan, jumping into the fray with big tech firms over XR devices. Samsung Electronics on Thursday (local time) held a developer event titled "XR Unlocked" in New York and announced that Project Moohan will make its global debut next year. It will be powered by Google's new Android XR, a platform jointly developed by the three companies, enabling users to interact with both physical and virtual realities through multi-modal artificial intelligence (AI) that integrates various sensory experiences. Samsung said Google's Gemini AI chatbot will be embedded in the headset, allowing users to explore new information through natural conversations and physical gestures. XR is an umbrella term referring to a range of virtual reality (VR), augmented reality, and mixed reality technologies. Upon release, the new device is expected to compete with Apple's Vision Pro and Meta's Quest, both of which currently dominate the XR device market.

(Image: Samsung Electronics' Gear VR for Galaxy Note 7 / Courtesy of Samsung Electronics)

This is not Samsung's first attempt at the XR device market. In 2014, Samsung unveiled the Gear VR, developed in collaboration with Oculus (now Meta), and rolled out a series of variations of the headset, but terminated services for related apps in 2020. Since 2017, Samsung has also launched a number of VR headsets under its gaming gear brand Odyssey, but they failed to gain popularity.

Against this backdrop, Project Moohan is expected to face tough competition, as the XR market, whose growth potential remains uncertain, is currently dominated by Quest, while the Vision Pro continues to struggle to secure its position. U.S. tech media AppleInsider reported that Apple sold only around 370,000 headsets in the first three quarters of 2024 and anticipated that only another 50,000 units will be sold by January, while analysts say the device's expensive price tag, limited content, and inconvenience show that the company focused on the product rather than the market. Reports suggest that Apple is preparing to release a more affordable Vision device next year to better compete with Quest. While Samsung has yet to disclose the price or detailed specifications of Moohan, Apple's strategy underscores the need for Samsung to carefully consider pricing and focus on delivering diverse and engaging content.

"XR will open an entirely new dimension by seamlessly transitioning between the real world and virtual environments, enabling interaction with technology without physical limitations," Samsung Electronics Executive Vice President Choi Won-jun said during the event. "The Android XR platform excels in its scalability, which can be applied across various form factors. Through an expanding ecosystem and a broad range of content, the Android XR platform will offer users a richer and more immersive experience," he said.
Google, in collaboration with Samsung and Qualcomm, unveils Android XR, a new platform merging AI with extended reality for wearable devices, powered by the advanced Gemini 2.0 AI model.
Google, in collaboration with Samsung and Qualcomm, has introduced Android XR, a groundbreaking platform that combines artificial intelligence (AI) with extended reality (XR) for wearable devices such as glasses and headsets 1. This innovative system, powered by the advanced Gemini 2.0 AI model, aims to transform how users interact with technology in their daily lives.
At the heart of Android XR lies Gemini 2.0, Google's sophisticated AI model designed to process complex inputs and deliver intuitive, real-time interactions 1. This AI engine enables natural, contextual assistance tailored to users' needs, from navigating unfamiliar streets to translating foreign languages in real-time 2.
Android XR supports two primary categories of wearable devices: lightweight smart glasses for hands-free, heads-up assistance, and immersive headsets such as Samsung's Project Moohan for mixed reality experiences.
The platform opens up a wide range of practical applications, including real-time language translation, turn-by-turn navigation projected into the wearer's view, hands-free step-by-step task guidance, and immersive media experiences through apps such as YouTube, Google TV, and Google Photos.
The development of Android XR is the result of strategic partnerships between Google, Samsung, and Qualcomm.
This collaboration ensures a unified ecosystem that benefits developers, manufacturers, and users alike 1.
Android XR leverages existing AR technologies such as ARCore, geospatial APIs, and tools like Android Studio and Jetpack Compose 5. This allows developers to create immersive and practical applications using familiar Android development environments.
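As a rough illustration of what "ARCore and geospatial APIs" means in practice, here is a minimal Kotlin sketch of configuring an ARCore session with the Geospatial API enabled. This is generic ARCore usage rather than Android XR-specific code, and the function name is a hypothetical placeholder; permission checks and ARCore-availability error handling are omitted.

```kotlin
import android.app.Activity
import com.google.ar.core.Config
import com.google.ar.core.Session

// Minimal sketch: create an ARCore session and enable the Geospatial API,
// which lets an app anchor virtual content to real-world coordinates.
// Assumes ARCore is installed and the CAMERA permission is already granted.
fun createGeospatialSession(activity: Activity): Session {
    val session = Session(activity)
    val config = Config(session).apply {
        geospatialMode = Config.GeospatialMode.ENABLED
        planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
    }
    session.configure(config)
    return session
}
```

How these same capabilities are surfaced through the Android XR SDK itself may differ; the point is that the platform builds on spatial-computing APIs developers already use on Android phones today.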
Popular Google applications, including YouTube, Google Photos, Google Maps, and Chrome, will be reimagined for the XR environment, offering new ways to interact with content 5.
The first Android XR device, Samsung's Project Moohan, is set to launch in 2025 3. This marks Google's significant entry into the wearable tech market, directly competing with Apple's Vision Pro and Meta's Quest 3.
While the immediate demand for XR devices may remain niche, the integration of Google's AI capabilities into Android XR could provide a new avenue for success in the evolving landscape of wearable technology 3.
As Android XR continues to develop, it has the potential to redefine how users experience and interact with technology, making digital tools more intuitive, efficient, and seamlessly integrated into daily life 4.
Reference
[1]
[2]