Google Glass was the company’s first stab at smart glasses back in 2012/2013, featuring a projector-powered display, a touchpad, a camera for photos and video recordings, bone conduction audio, and voice commands. It made for an innovative but ultimately failed first attempt, eventually finding a small niche for itself in the business world.
It looks like Google isn’t done with consumer-oriented smart glasses yet, though, as it unveiled a prototype of an unnamed pair of glasses at its I/O 2022 developer conference. The new specs resemble the Focals made by startup North, which Google acquired in 2020, with the search giant showing off live translation and real-time transcription.
Our guide: All you need to know about Google hardware
This is definitely a neat idea, but we don’t know whether Google plans to commercialize this product. Either way, the company needs to bring more to the table if it wants a future pair of smart glasses to succeed with a wider audience. Here are some of the features we’d like to see in a potential Google Glass successor.
Sign language interpretation
Google’s own videos actually showcase real-time transcription of speech, showing how hearing-impaired people can benefit from the technology. But how about sign language translation and interpretation?
Related: The Best American Sign Language App for Android
It sounds super high-tech, but Google announced a web game to help people learn sign language back in December 2021. The company also showed off on-device hand-tracking technology back in 2019 (see the screenshots above), laying the groundwork for sign language applications. So we’d like to see a future pair of Google smart glasses translate sign language into written or spoken language.
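For the technically curious, here’s a rough sketch of what that hand-tracking building block looks like on Android today, using Google’s MediaPipe Tasks HandLandmarker. It’s only an illustration: the model file name is the standard downloadable bundle, and actually turning landmarks into signs would need a separate classifier that isn’t shown here (and that Google hasn’t announced for glasses).

```kotlin
import android.content.Context
import com.google.mediapipe.framework.image.MPImage
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.handlandmarker.HandLandmarker
import com.google.mediapipe.tasks.vision.handlandmarker.HandLandmarker.HandLandmarkerOptions

// Sketch: detect hand landmarks in a single camera frame. A real sign
// language interpreter would feed sequences of these landmarks into a
// sign classifier, which is well beyond this snippet.
fun detectHands(context: Context, frame: MPImage): List<String> {
    val options = HandLandmarkerOptions.builder()
        .setBaseOptions(
            // "hand_landmarker.task" is MediaPipe's downloadable model bundle.
            BaseOptions.builder().setModelAssetPath("hand_landmarker.task").build()
        )
        .setRunningMode(RunningMode.IMAGE)
        .setNumHands(2)
        .build()

    val landmarker = HandLandmarker.createFromOptions(context, options)
    val result = landmarker.detect(frame)
    landmarker.close()

    // Each detected hand comes back as 21 normalized (x, y, z) landmarks.
    return result.landmarks().map { hand ->
        hand.joinToString { "(%.2f, %.2f)".format(it.x(), it.y()) }
    }
}
```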
Take control of your smart home gadgets
One of the more exciting smart home advances in recent years is ultra-wideband (UWB) technology, which is available on devices from the likes of Apple, Google, Samsung, and Xiaomi. The latter company even posted a neat demo showing that you can control smart home gadgets by pointing your phone at the relevant gadget.
Imagine looking at the front door to access the camera feed of your smart doorbell.
What if we brought this UWB-enabled functionality to a future pair of Google smart glasses? Imagine looking at your smart door lock to lock or unlock it, or looking at the front door to access the camera feed of your smart doorbell. This could theoretically extend to smart speakers and smart displays too, as glancing at a UWB-equipped smart speaker could start playback or give you visual access to routines and other commands.
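To make the idea concrete, here’s a toy sketch of the “look at it to control it” logic. It assumes the glasses already receive UWB ranging results (a distance plus an azimuth angle relative to where you’re looking) for each gadget; the device names, thresholds, and the ranging source are all made up for illustration.

```kotlin
import kotlin.math.abs

// One UWB ranging reading per smart home gadget, relative to the glasses:
// azimuthDegrees is the horizontal angle away from the wearer's line of sight.
data class UwbReading(val deviceId: String, val distanceMeters: Double, val azimuthDegrees: Double)

// Pick the gadget the wearer appears to be looking at: roughly straight
// ahead (small azimuth) and reasonably close. Returns null if nothing matches.
fun deviceInGaze(
    readings: List<UwbReading>,
    maxAngle: Double = 10.0,
    maxDistance: Double = 6.0,
): String? =
    readings
        .filter { abs(it.azimuthDegrees) <= maxAngle && it.distanceMeters <= maxDistance }
        .minByOrNull { abs(it.azimuthDegrees) }
        ?.deviceId

fun main() {
    val readings = listOf(
        UwbReading("smart-door-lock", 2.1, 3.5),      // nearly straight ahead
        UwbReading("living-room-speaker", 4.0, 55.0), // off to the side
    )
    // Prints "smart-door-lock" — the gadget the glasses would surface controls for.
    println(deviceInGaze(readings))
}
```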
Read more: Everything you need to know about UWB wireless technology
Filters, filters everywhere
While this is not the most useful example of machine learning, it is difficult to argue that filters have not been one of the main driving forces behind AR technology in the last five years. Everyone from Snapchat and Instagram to TikTok has used machine learning to offer fun face filters for use on their platforms.
Dig deeper: What is the difference between AR and VR?
This wouldn’t be the first time we’ve seen AR filters and 3D effects on a pair of smart glasses though, as Snapchat’s Spectacles already offer some of these effects. However, these are usually limited to environmental effects rather than face filters. So a more open approach that lets developers from other, more popular platforms like TikTok and Instagram bring their face filters over would be great.
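For a rough idea of how face filters are anchored on Android today, here’s a minimal sketch using ML Kit’s on-device face detection. The rendering call is hypothetical, and nothing here is a confirmed Google Glass API.

```kotlin
import com.google.android.gms.tasks.Task
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.Face
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions
import com.google.mlkit.vision.face.FaceLandmark

// Sketch: find face landmarks in a camera frame so a filter (sunglasses,
// ears, makeup, etc.) can be drawn at the right spot. Rendering is left out.
fun findFilterAnchors(image: InputImage): Task<List<Face>> {
    val options = FaceDetectorOptions.Builder()
        .setLandmarkMode(FaceDetectorOptions.LANDMARK_MODE_ALL)
        .setContourMode(FaceDetectorOptions.CONTOUR_MODE_ALL)
        .build()
    val detector = FaceDetection.getClient(options)
    return detector.process(image).addOnSuccessListener { faces ->
        for (face in faces) {
            // The nose bridge is a typical anchor point for AR glasses-style filters.
            val nose = face.getLandmark(FaceLandmark.NOSE_BASE)?.position
            println("Anchor filter near $nose") // stand-in for a real renderer call
        }
    }
}
```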
A major upgrade for navigation
This sounds like a no-brainer, but Google Maps Live View is definitely a feature we’d love to see on a future pair of smart glasses. Google introduced this augmented reality feature for Maps in late 2020, overlaying directions and other navigation information on your phone’s camera viewfinder. Just lift your phone up and point it around to get directions.
More about augmented reality: The best augmented reality app for Android
Google Maps Live View also gained more functionality last year, offering virtual street labels, landmark pins, and the ability to see details about certain places (such as reviews or whether they’re busy).
With glasses, the camera follows your vision so you don’t have to stand in the street holding your phone to discover the world around you.
All of this seems like a natural fit for next-generation smart glasses, as the camera would follow your line of sight and you wouldn’t have to stand in the street holding your phone up to explore the world around you. Getting Live View through glasses while driving also seems like a more seamless experience, as opposed to taking your eyes off the road – although we hope any smart glasses implementation isn’t too distracting either.
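As a back-of-the-napkin illustration, here’s the kind of geometry a heads-up navigation overlay has to do: work out the bearing to the next turn and how far off-center to draw the direction arrow. The coordinates and heading below are made up, and the real Live View obviously does far more (visual positioning, street labels, and so on).

```kotlin
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

data class LatLng(val lat: Double, val lng: Double)

// Initial great-circle bearing from `from` to `to`, in degrees clockwise from north.
fun bearingDegrees(from: LatLng, to: LatLng): Double {
    val lat1 = Math.toRadians(from.lat)
    val lat2 = Math.toRadians(to.lat)
    val dLng = Math.toRadians(to.lng - from.lng)
    val y = sin(dLng) * cos(lat2)
    val x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLng)
    return (Math.toDegrees(atan2(y, x)) + 360.0) % 360.0
}

// Offset of the arrow from the center of the display, in degrees
// (negative = draw left of center, positive = draw right of center).
fun arrowOffset(headingDegrees: Double, from: LatLng, to: LatLng): Double =
    (bearingDegrees(from, to) - headingDegrees + 540.0) % 360.0 - 180.0

fun main() {
    val me = LatLng(51.5007, -0.1246)       // hypothetical current position
    val nextTurn = LatLng(51.5014, -0.1419) // hypothetical next waypoint
    // With the wearer facing due west (270°), print where the arrow belongs.
    println("Draw arrow %.1f° from center".format(arrowOffset(270.0, me, nextTurn)))
}
```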
More advanced AR search
Google I/O 2022 saw some advances in Google Lens and AR-powered search, such as multisearch and improved visual search. Both seem like they’d fit right in on a future pair of Google smart glasses.
Scene exploration is another nifty AR search tool, as you can pan your phone’s camera around to search the world in front of you. It sounds pedestrian, but it’s broader than Lens’s existing visual search functionality, taking in multiple objects in a scene instead of just one. You also get handy insights overlaid on the scene and its products. Google gives the example of someone searching for chocolate in a grocery store. Not only do you get ratings for each chocolate bar, but the feature can also pick out preferred keywords like “dark” or “nut-free” chocolate.
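For a sense of the on-device building block behind something like scene exploration, here’s a minimal sketch using ML Kit’s object detection with multiple objects enabled. Pulling in ratings or “dark”/“nut-free” attributes would need a product lookup on top of this, which isn’t shown and is purely our assumption about how such a feature might be assembled.

```kotlin
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.objects.ObjectDetection
import com.google.mlkit.vision.objects.defaults.ObjectDetectorOptions

// Sketch: detect and coarsely classify several objects in one frame, rather
// than the single-subject lookup classic visual search performs.
fun detectSceneObjects(image: InputImage) {
    val options = ObjectDetectorOptions.Builder()
        .setDetectorMode(ObjectDetectorOptions.SINGLE_IMAGE_MODE)
        .enableMultipleObjects()   // consider every object in view, not just one
        .enableClassification()    // coarse labels like "Food" or "Home good"
        .build()

    ObjectDetection.getClient(options).process(image)
        .addOnSuccessListener { objects ->
            for (detected in objects) {
                val label = detected.labels.firstOrNull()?.text ?: "unknown"
                println("Found $label at ${detected.boundingBox}")
            }
        }
}
```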
Other Lens features, such as copying and pasting real-world text, highlighting popular dishes on restaurant menus, and translating foreign languages, also seem like great additions to a new version of Google Glass. So fingers crossed that a future product actually offers all of these features.
Most things your smartwatch does
In addition to health and fitness, smartwatches excel in many other areas due to their wearable nature. These tasks include taking phone calls, offering music controls for your phone, and editing / checking grocery shopping lists. This is in addition to more mundane features like notification mirroring, viewing calendar entries and showing the weather forecast.
We want the next generation of Google smart glasses to reduce the need to pick up our phone for every task
So we’d like to see the next generation of Google smart glasses take a page from Wear OS smartwatches in this regard, reducing the need to pick up the phone for every task. We’d also like to see some light fitness functionality on these glasses, such as step counting for walking and GPS tracking for cycling.
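As an illustration of how little plumbing some of this needs on the phone side, here’s a bare-bones notification mirroring sketch using Android’s standard NotificationListenerService. The sendToGlasses() call is hypothetical, since there’s no public consumer Glass API to plug into, and a real app would also need the user to grant notification access.

```kotlin
import android.app.Notification
import android.service.notification.NotificationListenerService
import android.service.notification.StatusBarNotification

// Sketch: the standard Android hook a companion app could use to mirror phone
// notifications to a wearable display.
class GlassesNotificationMirror : NotificationListenerService() {

    override fun onNotificationPosted(sbn: StatusBarNotification) {
        val extras = sbn.notification.extras
        val title = extras.getCharSequence(Notification.EXTRA_TITLE)?.toString() ?: return
        val text = extras.getCharSequence(Notification.EXTRA_TEXT)?.toString().orEmpty()
        sendToGlasses("$title: $text")
    }

    private fun sendToGlasses(message: String) {
        // Placeholder: a real app would push this over Bluetooth or Wi-Fi to the glasses.
    }
}
```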
Buyer’s Guide: The best smartwatch on the market today
Why is the time right for new smart glasses?
We’d argue that the timing has never been better for a Google Glass successor. For starters, augmented reality has evolved in a big way since those early days of needing dedicated hardware. Google’s own ARCore augmented reality suite only needs a camera and a few sensors, such as a gyroscope, to overlay animals and other objects in front of you.
Natural language processing is another area where the search giant has made impressive progress compared to the pre-Assistant era of the original Google Glass. The Pixel 6 phones’ Tensor chipset even enables offline voice typing, highlighting how far we’ve come in the last decade. This is another area where a future pair of smart glasses could see major improvements.
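To show how accessible on-device speech recognition has become, here’s a small sketch using Android’s standard SpeechRecognizer in its on-device mode (Android 12 and up, with the RECORD_AUDIO permission granted). It’s roughly the kind of building block live transcription on glasses could lean on, not anything Google has confirmed for this product.

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Sketch: start an on-device speech recognition session and hand the best
// transcript to a callback. Error handling and lifecycle are trimmed.
fun startOfflineTranscription(context: Context, onText: (String) -> Unit) {
    val recognizer = SpeechRecognizer.createOnDeviceSpeechRecognizer(context)
    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onResults(results: Bundle) {
            results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()
                ?.let(onText)
        }
        // Remaining callbacks left empty for brevity.
        override fun onReadyForSpeech(params: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(rmsdB: Float) {}
        override fun onBufferReceived(buffer: ByteArray?) {}
        override fun onEndOfSpeech() {}
        override fun onError(error: Int) {}
        override fun onPartialResults(partialResults: Bundle?) {}
        override fun onEvent(eventType: Int, params: Bundle?) {}
    })
    recognizer.startListening(
        Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
            .putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
    )
}
```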
There has also been a huge increase in horsepower since the first Google Glass of 2012/2013, which originally packed a dual-core chipset and 2GB of RAM. Google itself has taken advantage of this power boost, putting the Snapdragon XR1 processor in the Glass Enterprise Edition 2 back in 2019. This extra muscle should come to consumer-level glasses too.
In other words, the technical parts of the puzzle – both hardware and software – seem to be coming together for a Google Glass follow-up. But this product will live and die by how useful it is, so here’s hoping Google adds some of the features on our wishlist above if it’s aiming for a commercial release anytime soon.