Google glasses are coming back: here’s what to expect

Last December, I wore Google glasses in several forms while they were still under development. Soon you’ll be able to get your hands on the final versions. When exactly, and for how much? We may know more in just a few days.

While Meta has been the biggest tech company aiming to claim a place on your face in the form of glasses, it’s far from the only one. Google is about to enter the race with a whole range of smart glasses, the company’s first return to everyday face-worn technology since Google Glass in 2013.

This time the focus is almost entirely on AI. Gemini will be the key reason and function behind Google’s Android XR glasses, but they’ll be available in a wide range of models: Warby Parker, Gentle Monster, Kering Eyewear and Samsung are all expected to have their own versions. Xreal, a maker of display glasses, will also have an additional plug-in mixed reality device called Project Aura.

This year’s Google I/O developer conference on May 19 is fast approaching, and we should hear a lot more about Google’s smart glasses strategy. But we already know a lot, since Google talked about and demonstrated these glasses last year. Now that it’s 2026, all these glasses should finally be arriving, and if you’ve even half-thought about getting a pair of smart glasses, you’ll want to see what all the fuss is about.


All about Gemini

Google, Samsung and Qualcomm collaborated on Android XR, a new operating system for a range of mixed reality headsets, AI glasses, display-enabled glasses and, eventually, augmented reality glasses. The first product of this collaboration, the Samsung Galaxy XR, arrived last fall.

The Galaxy XR is a VR headset, but also a mixed reality computer, similar to the Apple Vision Pro and the Meta Quest 3. It runs Android apps through its Android XR operating system and also has Gemini AI, which can respond to your voice and run live, seeing everything on your device screen and in the real world via its external cameras.

This built-in Gemini assistant will be the key application of the next wave of smart glasses. Just like Meta’s Ray-Ban and Oakley glasses, which use Meta AI, Google’s glasses will use Gemini, along with related Gemini apps such as Nano Banana and NotebookLM.

Display-enabled glasses will offer contextual details, such as live map data.

Google

The screenless glasses will use built-in microphones and speakers to respond to AI prompts, handle live translation, or play music and take phone calls. A camera can shoot photos and videos, or activate a Gemini Live mode for continuous recording and AI awareness of the world around you.

An additional line of display-enabled glasses, with a color screen in one lens, will display snapshots taken on the glasses, show phone notifications, play videos, or even provide closed captioning or live assistive translation. Some apps will also work on the glasses as extensions of what you do on your phone: Google Maps can show directions and maps displayed on the ground in front of you with a tilt of the head, or Uber can display driver status.

CNET’s Patrick Holland tried out a prototype of the glasses at last year’s Google I/O.

Lexy Savvides

Three (or more) design partners

Warby Parker, Korean fashion eyewear brand Gentle Monster, and European eyewear brand Kering are already official Android XR glasses partners, meaning all three will launch Android XR eyewear lines. Expect lots of designs and fashion styles, much like how Meta’s eyewear partner EssilorLuxottica creates many frame designs under its Oakley and Ray-Ban brands.

Gucci smart glasses are expected via Kering, and there will surely be other surprises. Samsung is probably in the mix, too: beyond being a partner helping make these other glasses (likely providing camera and display components), Samsung is also reportedly set to announce its own Android XR glasses at some point.

Add to that Xreal, a maker of USB-connected display-compatible glasses, which makes its own Android XR minicomputer called Project Aura (more on that below).

Just as Google partnered with many watch brands years ago through Android Wear, more eyewear brands could join in.

Project Aura, made by Xreal and Google, is a set of display glasses that can run Android XR apps like a full mixed reality headset. It’s just one part of what’s coming next year.

Google

Project Aura: a distinct kind of AR glasses experience

The glasses made by Xreal work differently from other smart glasses, acting more like a mini VR headset than an all-day pair of glasses. Project Aura is a specialized set of Xreal glasses with a larger display and additional cameras that plug into a phone-sized processing puck. Wearing them (which I did last year), you can run 3D apps and experiences and even use hand tracking like a VR headset.

Project Aura runs the same apps as the Galaxy XR and uses the same chipset. It’s a sort of scaled-down mixed reality experiment, aiming to serve both as an actual product and as a development tool for future Google AR glasses that could connect directly to phones. But it’s not meant to be worn all day. Instead, like Xreal’s other glasses, it’s a kind of portable “headset for your eyes” display with audio that can extend screens around you while you’re on the go.

The big difference: How well they will work with Google and Android

Google’s big advantage with Android XR should be how these devices work with AI apps you may already use, or with apps on your phone. On Android phones, they should integrate more deeply with the phone’s controls and apps, like a smartwatch does. On iOS, they should still work with Gemini services.

There still haven’t been everyday smart glasses that connect deeply to the phones in our pockets, and Google’s should be the first. Apple could follow next year with its own glasses.

Google has already said that phone notifications should appear as interactive widgets on the glasses, but will more apps create deeper links? And will other AI services be allowed beyond Gemini? For now, Google has said Gemini is the primary AI service for its glasses. And these glasses will work with Wear OS watches, too.

Will you know who’s wearing these glasses, and how comfortable will the AI’s privacy policies make you feel?

Scott Stein/CNET

Will Google solve privacy and social acceptance issues?

Meta has repeatedly run into problems with its handling of users’ personal data, and inappropriate public use of its smart glasses cameras has sparked backlash on social media. Meta’s AI privacy policies are murky, and Meta isn’t a respected company when it comes to social media security or privacy, for very good reasons.

Will Google do better? It’s considered more reputable, but it’s also a company that already builds ads on our personal data and is gobbling up more and more of it, like health and fitness data, for its connected AI services. Google will have to explain how responsible it will be with glasses going forward and overcome public acceptance hurdles. Will the “Glasshole” nickname come back to bite?

Price and release date unknown

We have no idea when these glasses will arrive other than “sometime in 2026.” But expect more news starting at Google I/O on May 19. I’ll be there, and we’ll report on all the AI and smart glasses news as it happens. Then we should know more.
