Google DeepMind's Sir Demis Hassabis called glasses a new form factor on stage at I/O 2024, but real tech buffs know that Google actually pioneered smart eyewear over a decade ago. That was one piece of hardware we can safely say was ahead of its time.




With the renewed chatter around generative AI-powered smart glasses, the Google Glass is set to make a comeback, likely with a new design and purpose. And at the I/O stage earlier this week, we even got a sneak peek at Project Astra, which could power Google's all-new AR glasses that we've been eagerly awaiting.


Coming back with a bang

Project Astra had our jaws on the floor

A day after OpenAI showed off GPT-4o's new multimodal capabilities, which eerily reminded me of the movie Her, Google came out with its own alternative. A quick Project Astra demo showed how Gemini could recognize things using the phone's camera to help you with explainers and suggest fixes. Google mentioned that the demo video was recorded in real time, meaning it wasn't tweaked to cut down the perceptible response time or fix errors, as was the case in a previous demo. Considering that, this new hands-on video looked quite impressive; Astra could turn out to be much handier than the current crop of dedicated AI hardware, but that's a story for another day.



Toward the end of the demo, two things happened that intrigued me quite a bit. For one, the AI could remember what it saw in the camera feed to help the user find it later. Privacy concerns could be alleviated by moving that processing on-device, and the fact that AI can do this at all brings it closer to human-like visual intelligence. This feature alone could be quite useful for the elderly and the more forgetful among us, and it could also find nifty applications for office and home users.


What the AI helped find was a pair of glasses lying next to a red apple. Yeah, it was a cheeky reference to Apple's recent eyewear, and those weren't some ordinary glasses. They can also do everything that Project Astra can do on a phone, while keeping your hands free. A pair of smart glasses integrates far more seamlessly into daily life without requiring another dorky device hanging from your t-shirt that projects stuff onto your palm. On top of that, baking AI into our eyeglasses could open up dozens of potential accessibility use cases to help those with visual impairments navigate the world more easily.


The Google Glass 2.0

Make it a reality, Google



The first-gen Google Glass had a tiny display close to your eye to overlay digital elements on the real world, and for its time, the pair looked like nothing less than a tempting sci-fi tool. But the pair we saw at I/O this week looked a lot like regular shades, just with chonkier temple arms to accommodate the hardware. From the split second they were shown on screen, the glasses didn't look much different from Meta's AI glasses developed in partnership with Ray-Ban.

Google was already rumored to be working with Samsung on a pair of AR glasses. In this demo, you can see the glasses overlay contextual, translucent elements on what you're seeing, though we didn't get to see exactly how that works. While Meta's glasses can do many of these things, like answering your questions based on what you see and even recording PoV videos using the onboard camera, a new Google Glass could have much more potential.


Unlike Meta, Google can pair its glasses with Android on a deeper level, allowing it to sync data with a ton of first- and third-party apps and making them infinitely more usable than Meta's offering. Imagine getting directions or scanning QR codes right in front of your eyes without pulling out your phone or even raising your wrist. Supercharged with conversational and multimodal AI that understands text, voice, and visuals, the new Google Glass could do things we can't even imagine today.

So many things have fallen into place since the first Google Glass came out that there hasn't been a better time for Google to relaunch its pair of glasses than now.



Project Astra is still, well, a project

But it's not too far from reality

Google has been teasing what it's been cooking behind the scenes all this while. If anything, this demo comes off as confirmation that a new-gen Google Glass is in the making, and it will be out sooner rather than later.

However, Astra is still a project in its early stages, despite looking flawless in that carefully crafted demo. Our own Taylor Kerns looked at how Project Astra would work in several situations and came away rather optimistic about the future, but a few others weren't as impressed by Google's demonstrations.



The good news is that Google will soon have a pair of glasses to challenge the Apple Vision Pro, and there's a good chance that Google might have the upper hand, despite being a few months late to the party. Unlike Apple's bulky mixed reality headset that limits your movement, Google's AR glasses will replace your existing eyewear without being cumbersome to wear (for the most part), as we've experienced with the Meta Smart Glasses. And since Meta has already set a pricing benchmark, Google will need to price its offering under $500, which is far cheaper than the insanely priced Vision Pro.

While I have little hope for Google given its track record with hardware, the company has all the pieces in place to complete this puzzle. That little glimpse at the Glass took me back and made me want a well-made pair of AR glasses that could reach a wider audience. Just prove me right this one time, Google.