Google Assistant's big update: All the new AI tricks and features, explained
The next phase of Google Assistant
At Google I/O 2018, Google announced the "next phase of the Google Assistant", which includes improving the voice assistant so that it's "naturally conversational, visually assistive, and helpful in getting things done." Here's a look at what that means.
No more 'Hey Google' or 'OK Google'
Google explained at Google I/O 2018 that language is incredibly complex.
For instance, people can ask about the weather in over 10,000 ways. You might say, "Is it raining cats and dogs today?" or "What's the weather like?" or "Should I wear a coat?" As a result, Google said it's "dramatically" improving Google Assistant's language understanding so you can speak more naturally to your Assistant and it will know what you mean.
As part of this effort, you'll soon be able to have a natural conversation with Assistant without repeating "Hey Google" or "OK Google" for each follow-up request. Assistant will be able to understand when you're talking, versus someone else, and respond. This feature is called Continued Conversation, and you'll be able to turn it on "in the coming weeks."
New voices -- including John Legend's
Do you hate Google Assistant's default voices?
Well, thanks to advancements in artificial intelligence, as well as advancements in WaveNet technology from Google's DeepMind team, Assistant can now get a new voice within weeks rather than months or years. Google said it's been able to rapidly make new voices for the assistant -- while still capturing "subtleties" like "pitch, pace, and all the pauses that convey meaning." So, starting now, six new voices are available. Another one, modeled on musician John Legend's voice, will be available for Assistant later this year.
Multiple Actions: Understanding more complex queries
Good-bye simple commands. Hello, complex ones!
Google announced that another new feature, called Multiple Actions, is already starting to roll out. With it, Google Assistant can finally understand more complex queries. For instance, you'll be able to ask it, "What's the weather like in New York and in Austin?"
Pretty Please mode for kids
Google Assistant is getting a new mode that's powered by Google's Family Link service.
Essentially, to encourage your kids to ask for things nicely, Google said it'll introduce a Pretty Please mode later this year. It enables Assistant to understand polite conversation. So, if your kid wants Assistant to do something, he or she will need to say "please" and then "thank you," at which point Assistant will respond and do what's been asked of it.
This feature is optional. If you want Pretty Please to be activated only for a specific person, Assistant's voice recognition setting must be turned on. That way, Assistant can detect who is talking to it whenever it's doling out a lesson in manners.
Custom Routines (that you can schedule, too)
Earlier this year, Google launched a set of Routines that allow you to do multiple things with a single Assistant command. Now it's adding Custom Routines, so you can create your own. For example, you can create a custom routine for movie night.
You can kick off this custom routine by saying, for instance, "Hey Google, it's time for movie night," and then Assistant might turn on the TV, lock the door, and broadcast "time for movie night!" to everyone in the house. Also, around mid-2018, you'll be able to schedule routines for a specific day or time via the Assistant app or through the Google Clock app for Android.
Smart Displays coming in July 2018
Google Assistant is mostly voice-based. But, going forward, it'll be more visual.
Smart Displays, for instance, are a new category of devices with built-in Google Assistant. They allow you to access Assistant hands-free by voice, but they also allow you to tap and swipe on the screen. That way, you can control your smart home, watch live TV on YouTube TV, make video calls with Google Duo, and more. Google said the first Smart Displays will be available in July 2018.
New, more visual phone experience
Speaking of Assistant becoming visual, Google said it's redesigning the Google Assistant experience on the phone screen so that it will give you a quick snapshot of your day -- "with suggestions based on the time of day, location, and recent interactions with the Assistant."
As part of this effort, Google is partnering with Any.do, Todoist, Starbucks, Doordash, Applebee's, Dunkin' Donuts, and Domino's. The idea is that, with these integrations, you'll see your notes, tasks, and things like food deliveries all very easily -- without being constrained by separate apps or a chat-style interface. This new visual design will be available in late 2018.
Assistant navigation is coming to Google Maps
Google Assistant is coming to navigation in the Google Maps app on iOS and Android sometime during the summer of 2018.
With this functionality, you'll be able to use Assistant in the Maps app to share your ETA while you're driving, all without touching your phone. You can even "send text messages, play music and podcasts, and get information" -- directly with Assistant in the Maps app.
That means you should be able to say, "Hey Google, read me my messages," and then you'll hear a summary of your texts with the option to respond by voice.
Assistant can call and book your appointments
Again, sometime in the summer of 2018, Google will start testing another new feature within Google Assistant. This one will allow you to make restaurant reservations, schedule hair salon appointments, and more. All you have to do is provide the date and time, and Assistant will call the business for you or even book online for you.
This feature is powered by a new technology called Google Duplex. Now that Assistant can understand complex queries and respond more naturally, it can make very natural-sounding phone calls. It will be clear about the intent of the call, too. Even better, once your reservation or appointment is booked, Assistant will add a calendar reminder for it.