Apple shook up the AR/VR space with the Vision Pro headset it launched in 2024, and we haven't seen an attempt from any Android brand to match that experience. The lack of a standard development platform was largely to blame, until now. Google recently pulled the wraps off Android XR, an operating system for mixed reality wearables, and Android's future in the space suddenly looks more promising than what the Meta Quest and Ray-Ban Meta glasses offer today.
Android XR sits at the confluence of Gemini AI, a headset-delivered mixed reality experience, and Google’s partnership with big-name brands like Samsung, Qualcomm, and Sony. Developers got a peek at what’s possible using the new OS, and it is more than Apple Vision Pro wine in an Android bottle. We sat through the event to help you understand the key features that could make Android XR the future of AR and VR tech.
7 Hardware-agnostic design and software
It’s more than a Meta Quest on steroids
At its recent developer event, Google detailed Project Moohan, a headset it is co-developing with Samsung. It will likely serve as the reference design for future hardware, and it should be the first headset to reach the market booting Android XR. In due course, we should also see AR-focused hardware like smart glasses running Android XR as the primary OS instead of a custom solution built from the ground up.
Versatility like that is essential, considering we have interacted with computing hardware through physical or virtual buttons for decades and are now shifting to multimodal input featuring touch, voice, eye tracking, and more.
Such hardware-agnostic flexibility is one of the foundational pillars of Android XR's pitch and should help Google's solution stand out from Apple's Vision Pro. In the long term, the latter could struggle if developer interest dwindles, since a single headset in the ecosystem can only reach so many users. On the Android side, developers can target multiple headsets instead of a single product. The operating system is also firmly rooted in existing Android technology rather than being built from scratch, which eases the transition for developers and users alike.
6 Open SDK that supports Unity and Jetpack Compose libraries
Familiarity for developers
To ensure developers can switch to building for Android XR easily, the new OS shares much of its back-end code with the stock Android running on your glass sandwich. Google says the two operating systems share a code base, so developers building for Android are automatically prepared for XR. Even developers focused on XR will use the same Jetpack Compose libraries they rely on for regular apps, with the spatial element layered on through optional spatial APIs, alongside support for open standards like OpenXR and WebXR.
Google isn't leaving Unity devs in the dust, either. They get ported libraries plus dedicated AR and VR development packages for creating XR apps. Thanks to the shared code base, Google can hit the ground running because several existing Android apps are already XR-ready; core apps like Maps and the Chrome browser were showcased during the live demo. An Android XR emulator is also available in Android Studio, the IDE virtually every Android developer already uses to build and test their apps before launch.
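If you're curious what that looks like in practice, here's a minimal sketch based on the Jetpack Compose for XR developer preview: a regular composable gets hosted inside a floating spatial panel when the app runs on a headset. The androidx.xr package paths and the Subspace, SpatialPanel, and SubspaceModifier names are taken from the preview artifacts, so treat the exact signatures as subject to change.

```kotlin
// A minimal sketch based on the Jetpack Compose for XR developer preview.
// The androidx.xr imports and the Subspace/SpatialPanel/SubspaceModifier names
// come from preview artifacts and may change before a stable release.
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.width

// An ordinary Compose screen, exactly as you would write it for a phone app.
@Composable
fun GreetingCard() {
    Text(text = "Hello from Android XR", modifier = Modifier.padding(16.dp))
}

// The same composable, hosted in a floating spatial panel when running on a headset.
@Composable
fun SpatialGreeting() {
    Subspace {
        SpatialPanel(modifier = SubspaceModifier.width(512.dp).height(256.dp)) {
            GreetingCard()
        }
    }
}
```

The point is that the spatial wrapper is additive: the same GreetingCard composable still works unchanged in a phone or tablet app.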
5 Smarts like Gemini to ease interaction
Multimodal is the way
Interacting with apps in mixed reality is a stark departure from what we do on smartphones, TVs, and gaming-focused VR headsets. Judging by the Project Moohan hardware demo, the controls resemble those of Apple's Vision Pro: some form of head and eye tracking, hand tracking for pinch gestures and arm movement, and voice input. The headset ships with Gemini onboard, and it's easy to see why it replaces Google Assistant here.
Gemini does most of the heavy lifting in Android XR, from rearranging your windows to pulling up more information about whatever you're looking at. Interaction with the AI in XR will be largely audiovisual, but expect it to pull off cool tricks like real-time translation, navigation prompts, and overlays for how-to guides. Other interactions will mostly rely on hand and eye movements detected by the hardware to move or resize windows. Don't expect many buttons or physical touchscreens in the XR space.
4 Circle to Search on Android XR
Samsung collaboration all the way
Circle to Search debuted at the Samsung Galaxy S24 launch event earlier this year, born of Google's collaboration with Samsung. Nearly a full calendar year later, Google's partnership with the Korean brand remains strong, and it extends CtS to Android XR. At the company's annual developer conference, we saw the first hints of Gemini Live and CtS-like functionality in Project Astra. However, Circle to Search will operate differently on the new OS.
Circle to Search on XR headsets
In a mixed reality context, this tech will use a simple pinch gesture for activation, so you don't need to draw massive circles in your field of view while wearing the headset. You can supplement your query with a voice prompt. When triggered, CtS will work like it does on your phone, but Gemini can deliver the search results via audio, so you don't need to read a wall of text. We caught a glimpse of this on stage when the Hey Gemini feature was demonstrated.
3 Immersive Google Photos and TV experiences
Familiar apps with one additional dimension
Google seems to be banking heavily on user familiarity with its existing apps, ported to Android XR, to drive the new OS's success. The company confirmed that Samsung's upcoming headset, Moohan, will ship with apps such as Google Photos, YouTube, Google TV, and Chrome. These apps are XR-ready and will gain depth-based elements that elevate the user experience.
Google Photos on Android XR
In Photos, users can enjoy a lifelike reconstruction of panoramic shots, immersing them in the moment. YouTube for XR starts as a screen floating in space in front of you and transforms into a fully immersive view when you watch the VR content already available on the platform. Google TV is similar, but the imagery we've seen suggests immersion will be more theater-like, with a fading black surround framing your screen content. The familiarity and uniqueness of these apps make a great jumping-off point for Android XR devs and users.
Google TV on Android XR headsets
2 The promise of a better experience with glasses
Overlays for the win
While stock Google apps can make XR feel like home to first-time users and early adopters, the key to long-term success lies in imaginative applications. We are still miffed that Google Glass isn't a thing today, but the company showed off a few use cases that give us hope for a bright future with XR. First, Google Maps should be able to overlay navigation information on smart glasses powered by XR, so you won't need to glance at your phone constantly.
Navigation on XR glasses
Moreover, the tech giant imagines that XR-powered wearables will simplify DIY instructional videos, since the glasses share the user's point of view. Instructions, like the placement of your new floating shelf, can be overlaid in software, leaving zero ambiguity about the next step. Combine multimodal interaction with Google's stated goal of shrinking the hardware down to ordinary spectacles, and these apps are cool today while giving developers plenty to think about for even cooler XR experiences.
Tutorials with XR
1 Share your headset with others
Lifelike experiences for all
While most of the features we discussed above point to Android XR's flexibility, one attribute often goes overlooked. Mixed reality is designed as a personal experience tailored to the headset's wearer, and that's how Apple showed off its Vision Pro. Google, however, did something different on stage: one presenter handed the glasses off to another person, who continued interacting with Gemini and the other XR features as normal.
Both presenters used Gemini for multimodal prompting, and the assistant didn't struggle with the switch. It could rely on something like Voice Match on Android, though that wouldn't explain how gesture tracking and controls carried over. If this kind of shareability makes it to consumer products, Android XR headsets would be a joy to pass around, given the steep initial investment they demand. That said, neither presenter at Google's event wore prescription glasses, so the real-world experience may differ if you need dioptre correction with the headset.
Laying the foundation for a promising future
Google's plans for the future of Android XR seem to rest on solid foundations. The company announced that Samsung will be the first to launch a headset, codenamed Moohan. Google has also partnered with leading hardware players such as Qualcomm, with its Snapdragon Spaces platform, and headset makers like Lynx, Sony, and Xreal to continue development.
Today, much of the tech seems borrowed from or inspired by Apple's headset, but Google stands out in that its solution will work with hardware from multiple brands, and developers have more freedom to build apps that run on both XR devices and standard touchscreen Android. Meanwhile, rumors about Samsung's XR glasses are already afoot.