AT2k Design BBS Message Area
Casually read the BBS message area using an easy to use interface. Messages are categorized exactly like they are on the BBS. You may post new messages or reply to existing messages!


From: VRSS
To: All
Subject: Google XR glasses hands-on: Lightweight but with a limited field of view
Date: May 20, 2025 4:52 PM

Feed: Engadget is a web magazine with obsessive daily coverage of everything new in gadgets and consumer electronics
Feed Link: https://www.engadget.com/
---

Title: Google XR glasses hands-on: Lightweight but with a limited field of view

Date: Tue, 20 May 2025 21:52:00 +0000
Link: https://www.engadget.com/ar-vr/google-xr-glas...

One of the biggest reveals of Google I/O was that the company is officially
back in the mixed reality game with its own prototype XR smart glasses. It's
been years since we've seen anything substantial from the search giant on the
AR/VR/XR front, but with a swath of hardware partners to go with its XR
platform it seems that's finally changing.

Following the keynote, Google showed off a very short demo of the prototype
device we saw onstage. I only got a few minutes with the device so my
impressions are unfortunately very limited, but I was immediately impressed
with how light the glasses were compared with Meta's Orion prototype and
Snap's augmented reality Spectacles. While both of those are quite chunky,
Google's prototype device was lightweight and felt much more like a normal
pair of glasses. The frames were a bit thicker than what I typically wear,
but not by a whole lot.

[Photo: Karissa Bell for Engadget]

At the same time, there are some notable differences between Google's XR
glasses and what we've seen from Meta and Snap. Google's device only has a
display on one side (the right lens; you can see it in the image at the top
of this article), so the visuals are more "glanceable" than fully
immersive. I noted during Google's demo onstage at I/O that the field of view
looked narrow and I can confirm that it feels much more limited than even
Snap's 46-degree field of view. (Google declined to share specifics on how
wide the field of view is on its prototype.)

Instead, the display felt a bit similar to how you might use the front
display of a foldable phone. You can get a quick look at the time and
notifications and small snippets of info from your apps, like what music
you're listening to.

Obviously, Gemini is meant to play a major role in the Android XR ecosystem
and Google walked me through a few demos of the assistant working on the
smart glasses. I could look at a display of books or some art on the wall and
ask Gemini questions about what I was looking at. It felt very similar to
multimodal capabilities we've seen with Project Astra and elsewhere.

There were some bugs, though, even in the carefully orchestrated demo. Gemini
started to tell me about what I was looking at before I had even finished my
question to it, which was followed by an awkward moment where we both paused
and interrupted each other.

One of the more interesting use cases Google was showing was Google Maps in
the glasses. You can get a heads-up view of your next turn, much like Google
Maps' augmented reality walking directions, and look down to see a little section
of map on the floor. However, when I asked Gemini how long it would take to
drive to San Francisco from my location it wasn't able to provide an answer.
(It actually said something like "tool output," and my demo ended very
quickly after.)

[Photo: Engadget]

I also really liked how Google took advantage of the glasses' onboard camera.
When I snapped a photo, a preview of the image immediately popped up on the
display so I could see how it turned out. I really appreciated this:
framing photos from a camera on smart glasses is inherently unintuitive
because the final image can vary so much depending on where the lens is
placed. I've often wished for something like this when taking photos with my
Ray-Ban Meta Smart Glasses, so it was cool to see the feature actually in
action.

I honestly still have a lot of questions about Google's vision for XR and
what eventual Gemini-powered smart glasses will be capable of. As with so
many other mixed reality demos I've seen, it's obviously still very early
days. Google was careful to emphasize that this is prototype hardware meant
to show off what Android XR is capable of, not a device it's planning on
selling anytime soon. So any smart glasses we get from Google or its hardware
partners could look very different.

What my few minutes with Android XR were able to show, though, was how Google
is thinking about bringing AI and mixed reality together. It's not so
different from Meta, which sees smart glasses as key to long-term adoption of
its AI assistant too. But now that Gemini is coming to just about every
Google product that exists, the company has a very solid foundation to
actually accomplish this.

This article originally appeared on Engadget at https://www.engadget.com/ar-vr/google-xr-glasses-hands-on-lightweight-but-with-a-limited-field-of-view-213940554.html?src=rss

---
VRSS v2.1.180528
