Google Maps is announcing a slew of new features, ranging from weather overlays on the map to AR-powered indoor navigation. There’s a lot to discuss, and the company says it’s on track to deliver “more than 100 AI-powered improvements to Google Maps” this year.
First up is a new directions interface. Today, the directions UI uses tabs for each mode of transportation: one for driving, then public transit, walking, ride-sharing, and biking. In the redesign, everything appears in a single flat list, and a new “options” button lets you set which modes of transport you prefer. You can prioritize route options for driving, walking, trains, buses, motorcycles, bicycles, ride-sharing, bike and scooter shares, and ferries. You can even pick multiple modes so that all of your top picks appear first in the list.
Some route options have a small green leaf next to them, part of Google Maps’ new push to promote cleaner modes of transport. For driving, the Google Maps route screen will soon take fuel efficiency into account, and you’ll see that green leaf next to fuel-efficient routes. In many places the shortest route is also the most economical, so not much changes. But Google Maps factors in things like traffic, the number of starts and stops, and road elevation (a major concern in Google’s California backyard) to come up with a CO2 estimate for each trip. If it finds a route that is more economical but longer, it will tell you about it, and if two routes take the same amount of time, it will use fuel economy as the tiebreaker for the default route.
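The selection logic described above is easy to picture in code. This is just a hypothetical sketch of the idea, not Google’s actual routing model: the `Route` fields and CO2 numbers are made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    minutes: int     # estimated travel time
    co2_grams: int   # estimated emissions from traffic, stops, and elevation

def pick_default(routes):
    # Sort by travel time first, then by emissions. A tie on time falls
    # through to the CO2 comparison, making fuel economy the tiebreaker.
    return min(routes, key=lambda r: (r.minutes, r.co2_grams))

routes = [
    Route("via highway", 22, 1800),
    Route("via surface streets", 22, 1400),  # same time, less fuel
    Route("scenic detour", 30, 1200),        # cleanest, but slower
]
print(pick_default(routes).name)  # → via surface streets
```

The slower “scenic detour” never wins the default slot here; in Google’s scheme it would instead be surfaced as a suggested greener alternative.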
Google says both features will be released sometime this year.
Google Maps is taking over again
Two new layers are coming to Google Maps that put the service in a bit of competition with your favorite weather app: air quality and weather layers. Weather is almost always something to look into before traveling, and soon you’ll be able to get that information right in Google Maps. For those with allergies or those in places where air quality is a regular problem, it’s also helpful to see that information easily.
Google says, “Data from partners such as The Weather Company, AirNow.gov and the Central Pollution Control Board are feeding these layers that will be rolled out on Android and iOS in the coming months. The weather layer will be available globally and the air quality layer will launch in Australia, India and the US, and more countries will follow.”
For now, the presentation of this information is pretty limited. The most obvious way to display weather data would be a radar-style overlay showing rain and clouds moving across the map, with different colors indicating intensity. Instead, Google Maps displays this data as small dots scattered across the map, similar to how POIs are displayed. That makes it hard to tell where rain will start and stop, how long it will linger, whether conditions will improve or worsen over the next few hours, or how bad the weather will be along your drive. A full radar overlay would probably take a big chunk out of the weather-app industry, since overlaying data on a map is a core feature of those apps, but maybe Google is more open to a first-party weather solution now that Apple is investing in the area with its acquisition of Dark Sky.
Indoor AR Navigation
Google Maps’ AR navigation is moving indoors. The feature rolled out to iPhone and Android devices in select cities in 2019, and it uses ARCore-based 3D detection and Google’s wealth of Street View imagery to figure out where you are and which way you’re facing from what the camera sees. Outdoors, the feature amounted to the world’s most complicated compass replacement, but that’s genuinely useful: the compasses in phones (especially Android phones) just aren’t very accurate and are prone to interference, so getting your initial direction of travel right with a compass alone can be a challenge. Beyond the cool 3D directions superimposed on the camera feed, AR navigation is a real help there.
It sounds like AR navigation is really going to shine when it transitions to indoor navigation. Google demonstrated the feature in an airport terminal, where it can do things like determine your location (GPS doesn’t work indoors) and identify which floor you’re on. In the demo, it tells someone where the escalator is and to go down a level to reach their terminal. Google says it plans to roll out this technology to “airports, transit stations and shopping malls,” where it “can help you find the nearest elevator and escalators, your gate, platform, baggage claim, check-in counters, ticket office, restrooms, ATMs, and more.”
Indoor navigation is something Google has been chasing for years, constantly trying to get companies to adopt solutions such as Wi-Fi RTT (Wi-Fi-based positioning), which was built into Android 9. I think the company has realized that any method that requires independent companies to install and maintain some sort of technical infrastructure isn’t going to work. AR navigation feels like a more scalable alternative because Google can do all the work itself. It’s powered by nothing but your camera and a bunch of photos stored by Google Maps (Google calls this VPS, or Visual Positioning System), and it’s basically AI-powered landmark navigation.
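At its core, “landmark navigation” means matching what the camera sees against a database of reference imagery with known positions. Here’s a deliberately toy sketch of that idea: real VPS works on rich 3D visual features at enormous scale, and the feature vectors and locations below are entirely made up.

```python
import math

# Reference "images" with known positions, reduced to toy feature vectors.
# Keys are (latitude, longitude, floor); values are the stored features.
reference_db = {
    (40.6413, -73.7781, "floor 2"): [0.9, 0.1, 0.3],  # e.g. near the escalator
    (40.6415, -73.7779, "floor 1"): [0.2, 0.8, 0.5],  # e.g. baggage claim
}

def locate(camera_features):
    # Nearest-neighbor match in feature space: the best-matching reference
    # image tells us where the user is standing, including which floor.
    def distance(ref):
        return math.dist(reference_db[ref], camera_features)
    return min(reference_db, key=distance)

print(locate([0.85, 0.15, 0.35]))  # matches the "floor 2" reference
```

The payoff of this approach is in the last field of the result: unlike GPS, a visual match can distinguish floors, which is exactly what the airport demo showed.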
VPS data is the same as Street View data, so it can be scaled the same way Street View was: by sending hordes of contractors around the world to photograph everything with special equipment. You could argue that shooting every large indoor public space is too much work, but Google has already proven it can be done with Street View. The company now produces its own Street View backpacks, so sending a contractor on a quick march through your local airport, train station, or shopping mall should be enough to gather VPS data. “Just go photograph the whole world” is completely within Google’s capabilities.
Google calls the feature “Indoor Live View” and says it’s “now live on Android and iOS in a number of malls in Chicago, Long Island, Los Angeles, Newark, San Francisco, San Jose, and Seattle. It will begin rolling out in the coming months in select airports, malls, and transit stations in Tokyo and Zurich, with more cities to come.”
Now if we could get this in an AR version of Google Glass, that would be great.