Will Google smart glasses replace cell phones?

How often do you pick up your cell phone to get directions, take a video call, or do a Google search? Probably more times than you can count.

Google wants to change that, and it’s not alone. The search giant is part of a growing group of technology companies betting big on smart glasses that transmit information from your cell phone and analyze the environment around you using cameras and microphones.

Google demonstrated its glasses in May and introduced the software behind them a year ago. Last week, it gave CNN and other outlets a more detailed look at the software, offering a better glimpse of what to expect when the glasses launch next year.

It could be a step towards a world where people no longer need to reach for their cell phones every few seconds.

That’s if the glasses are successful. Google Glass, the company’s previous attempt at smart devices, failed about a decade ago because it was unstylish, expensive, limited in functionality and raised a wave of privacy concerns.

Google relies primarily on search, advertising and cloud services for its revenue, so hardware products like smart glasses won’t make or break the nearly $4 trillion tech giant. But the stakes are still high for smart glasses, with tech giants eager to establish their presence in what they see as the next wave of personal computers.

“If you look at how Google and many companies in our industry have grown, it’s always been about expanding with new computing platforms,” said Juston Payne, director of product management for Android XR, the software platform that powers the glasses.

“We see the same thing happening in this space,” he added.

The company already faces stiff competition, as Meta has touted strong sales of its Ray-Ban glasses, saying in an earnings call in October that its newest model “sold out in almost all stores within 48 hours.” Tech companies, including Google, have already tried and failed to make virtual reality and smart devices an everyday product.

Glasses that can see — and change — the world around you

Similar to Meta’s Ray-Ban glasses, Google’s devices let you do a variety of things hands-free, like taking photos, getting directions, answering calls, and learning about objects in your field of vision.

In this demo and previous ones, I asked Gemini questions like, “Do I need to read the other books in this series?” or “Are these peppers hot?” while looking at a bookshelf or items in a supermarket.

Google has made some notable changes since I last tried its prototype about a year ago. After taking a photo of my surroundings, Google instantly transformed the image to look like the North Pole using its Nano Banana AI model. All this with a simple voice command.

I was impressed, but also a little worried about how easily and quickly glasses like these capture and manipulate images.

Google had already traveled this winding path more than a decade ago, when its failed Google Glass sparked controversy for its ability to discreetly take photos and videos.

Payne says the company has learned from what happened with Glass. Like Meta’s Ray-Bans, the prototype glasses have a light that indicates when the camera or AI image-editing model is in use, and Gemini users can delete commands and activity in the app.

“Our belief is that glasses can fail because of a lack of social acceptance. In other words, we need to be fully committed” to privacy, Payne said. Google Glass, however, faced other obstacles.

Better than your phone in some ways, but not a replacement

Instead of looking down at your phone at every turn to find your way around a new city, or looking up translations during a conversation with someone who speaks another language, glasses like Google’s can display and recite information close to your line of sight and in your ear.

The version of Google Maps I tried on the glasses shows an arrow pointing in the right direction near your line of sight, and shows the map when you look down. It’s similar in functionality to how it works on phones, but without the hassle of looking at a screen.

But be prepared for some awkward interactions. For example, I spoke before Gemini was ready to listen and interrupted its responses on several occasions.

Social nuances are just part of the reason smartphones aren’t likely to go away anytime soon, and why Payne acknowledges that the glasses won’t replace phones.

But Google is willing to bet that the glasses will play a role in the next big computing device — so much so that the glasses are even compatible with Android’s biggest rival, the iPhone.

The company plans to sell the glasses in two versions: one with a display and another that only provides audio feedback. Google is partnering with glasses makers Warby Parker and Gentle Monster and has not yet said when they will be launched or how much they will cost.

It is also developing another model with two screens that can display more detailed graphics, but the company has not specified exactly when it will be launched.

It’s not just Google’s future that depends on the success of Android XR and Gemini. Just like Android for smartphones, Android XR is available to other technology companies so they can make headsets and glasses powered by the software. Samsung and Xreal, a company specializing in smart glasses, are among Google’s first partners.

When asked what would happen to Xreal if the AI boom turned out to be just a fad, founder and CEO Chi Xu said he believes “AI is real.”
