Google Testing AR Glasses Integration with Android Auto

by Tekmono Editorial Team
14/04/2025
in News

A heads-up display while driving has always been the dream use for AR glasses, and now it looks like that could soon become a reality.

Looking at a screen for navigation while driving is undoubtedly a hazard, so overlaying guidance on glasses that let you stay focused on the road makes a lot of sense. Now it looks like Google has taken this on board and may be close to rolling out just such a feature in a new Android Auto update.

The update could allow drivers to pair their AR glasses so that directions from Android Auto are displayed directly on their lenses. An Android Authority APK teardown of Android Auto version 14.2.151544 revealed the new feature. Specifically, it was discovered in the Hindi version, where a string offers the option, “to view navigation on smart glasses, start navigation.”

At this point the feature is just a few lines of code, so it appears to be in the very early stages of development. That makes sense when you consider that there aren’t many fully fledged AR glasses on the market, although leaks about a new pair of Meta AI glasses could fit in nicely.

While glasses could be well suited to driving, this could also be an excellent addition to motorcycle helmets with AR display capabilities. How that would work with Android Auto and a visor overlay is less clear, but it’s likely something worth Google working out for what must be a very receptive audience.

While this may be little more than a hidden code tease, the timing is interesting. It follows Google’s recent TED2025 demo of its Android XR glasses, in which Gemini AI proved a powerful feature thanks to its ability to remember what it had seen: the presenter asked where she had left her glasses, and the AI was able to direct her to them.

Apply this to driving and it could mean a genuinely helpful guidance and memory feature, not to mention how useful that could be for recalling where you passed a coffee shop, a fuel stop, or even a convenient place to pull over.

It’s early days, but this could be the start of a very exciting and genuinely useful AR development.

Tekmono is a Linkmedya brand. © 2015.