
Google Maps is getting this new 'immersive' view. Here's why it might be useful

Using advances in AI, Google is introducing new ways to explore and better understand the world in Maps, Search, and Translate.
Written by Stephanie Condon, Senior Writer
[Screenshot of Google Maps immersive view. Image: Stephanie Condon/ZDNET]

If you've ever looked up a restaurant on Google Maps, you may have gotten an idea of the menu and the decor. 

But it can be tricky to anticipate what the experience of eating there will be like. Will it feel too crowded when you arrive? Does the lighting set the right mood? 

These are the sorts of questions Google is trying to answer with its new "immersive view" feature in Google Maps. 

The new feature, announced last year, is rolling out Tuesday starting in London, Los Angeles, New York, San Francisco, and Tokyo. 

In the coming months, it will launch in more cities, including Florence, Venice, Amsterdam, and Dublin.

Also: The best Android phones right now

Using AI, the feature fuses billions of Street View and aerial images to create a rich, "immersive view" of the world. It relies on neural radiance fields (NeRF), an advanced AI technique that builds 3D scenes out of ordinary 2D pictures, giving users a sense of a place's lighting, the texture of its materials, and contextual details like what's in the background.
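Google hasn't published the implementation details behind immersive view, but the core NeRF idea is well documented: a neural network learns a "field" that maps any 3D point to a color and a density, and an image is produced by sampling that field along each camera ray and alpha-compositing the samples. Here is a minimal NumPy sketch of that volume-rendering step, with a hand-written `toy_field` standing in for the trained network; all names and numbers here are illustrative, not Google's code:

```python
import numpy as np

def render_ray(field, origin, direction, t_near=0.0, t_far=4.0, n_samples=64):
    """Classic NeRF-style volume rendering for a single camera ray.

    Samples points along the ray, queries the field for (rgb, density),
    and alpha-composites the samples front to back.
    """
    ts = np.linspace(t_near, t_far, n_samples)
    pts = origin + ts[:, None] * direction            # (N, 3) sample points
    rgb, sigma = field(pts)                           # (N, 3) colors, (N,) densities
    delta = ts[1] - ts[0]                             # spacing between samples
    alpha = 1.0 - np.exp(-sigma * delta)              # per-sample opacity
    # Transmittance: how much light survives to reach each sample.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(axis=0)       # composited RGB

def toy_field(pts):
    """Stand-in for a trained MLP: a soft, bright sphere at the origin."""
    d = np.linalg.norm(pts, axis=-1)
    sigma = np.where(d < 1.0, 5.0, 0.0)               # dense inside the sphere
    rgb = np.stack([np.clip(1.0 - d, 0.0, 1.0)] * 3, axis=-1)
    return rgb, sigma

color = render_ray(toy_field,
                   origin=np.array([0.0, 0.0, -2.0]),
                   direction=np.array([0.0, 0.0, 1.0]))
```

In a real NeRF, `toy_field` is replaced by a trained network queried at each sample point, and this rendering is repeated for every pixel's ray; Google's system additionally fuses the result with Street View and aerial imagery at city scale.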


Google will soon expand Search with Live View in Maps to more places in Europe, including Barcelona, Madrid, and Dublin. Additionally, Indoor Live View is expanding to more than 1,000 new airports, train stations, and malls in cities including London, Paris, Berlin, Madrid, Barcelona, Prague, Frankfurt, Tokyo, Sydney, Melbourne, São Paulo, and Taipei.

Maps is also getting new features for EV drivers. The platform will show you places that have on-site charging stations and will help you find chargers of 150 kilowatts or higher. 

Google is also stepping up the ways you can explore the world with Lens, the AI-powered tool that lets people conduct an image search from their camera or photos. First released in 2017, people already use Lens more than 10 billion times a month, Google said. 

With multisearch in Lens, people can search using text and images at the same time. Just a few months ago, Google launched "multisearch near me," which lets you take a picture of something (like a specific meal) and find out where to get it locally. In the coming months, Google said, it will roll out "multisearch near me" in all languages and countries where Lens is available. 

Multisearch is also expanding to images on the web on mobile. Additionally, Google is bringing a "search your screen" Lens feature to Android. Users will be able to search photos or videos on their screen, regardless of which app or website they're using, without ever leaving that app.

Also: 3 things Google needs to fix for Android to catch up to iOS

Google Translate is also getting an update. Among other things, the tech giant shared on Wednesday that the tool will provide more context for translations. For instance, it will tell you if words or phrases have multiple meanings and help you find the best translation. This update will start rolling out in English, French, German, Japanese, and Spanish in the coming weeks.
