Interacting with cars: What's on the horizon?

From button-free functionality to AR windshields, Nuance Communications tells us what to expect from upcoming automotive interfaces.
Written by Stephanie Condon, Senior Writer

For years, Nuance Communications has helped the automotive industry integrate sophisticated voice-based technologies into cars. The company works with nearly every major automaker, and its user-interface software is installed in almost half of the vehicles sold around the world. If you have a relatively new car, there's a good chance you use Nuance's technology for voice dialing, controlling navigation, entering destinations, or playing music.

Now, Nuance is leveraging advances in its voice technology, along with technologies like emotion-detection AI, eye tracking, and augmented reality, to introduce entirely new ways of interacting with your car. Eric Montague, Nuance's senior director of strategy & product marketing, spoke to ZDNet about what's on the horizon.

Here are the highlights of that conversation:

Coming soon: A button-free experience

By the end of 2019, Nuance and its auto partners plan to introduce the concept of a button-free car, Montague said. That means effectively controlling everything by voice -- with precision. You'll be able to say, "open the window," as well as more specifically, "open the window halfway," or "open the window a little more."

Additionally, drivers will be able to interact with their vehicles without a wake-up word. You can "just drive the car, and say, 'I'm hungry,' and the car will listen to you and help you find a place to eat," Montague said.

The precision will make it "much faster than using any other interface," Montague said. "You can just drive, and say [to your navigation system], 'zoom in where we are.' Or, 'zoom in destination,' or, 'show me the nearest gas station,' and just have conversations like a human being."
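
To make that concrete, here is a minimal sketch of how graded commands like these might map onto a control value. The parsing rules and percentages are invented for illustration; this is not Nuance's implementation.

```python
def interpret_window_command(utterance: str, current: float) -> float:
    """Map a window command to an opening level (0.0 closed, 1.0 open)."""
    text = utterance.lower()
    # Check the more specific phrasings before the plain "open".
    if "a little more" in text:
        return min(1.0, current + 0.15)  # relative adjustment
    if "half" in text:
        return 0.5
    if "open" in text:
        return 1.0
    if "close" in text:
        return 0.0
    return current  # unrecognized command: leave the window alone

# The three commands from the article, applied in sequence:
pos = 0.0
for cmd in ["open the window", "open the window halfway",
            "open the window a little more"]:
    pos = interpret_window_command(cmd, pos)
    print(f"{cmd!r} -> window at {pos:.0%}")
```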

Keeping data private

"The system always listens to you, so the technology is embedded to ensure that your private conversational data does not go to the cloud," Montague said. "It stays in the car, because privacy is a huge concern" for automotive brands.

"So we tried to keep as much data as possible inside the car," he continued. "But sometimes, you can't avoid sending private data to the cloud. One particular example is when you say things like, 'What's the weather where I am?' or, 'What's the weather at my destination?' So where you are, and where you're going, that's something you may not want to share with a third party.

For those cases, Nuance has worked with OEMs "to build systems that make use of data in a one-way manner," Montague said, explaining that Nuance uses a one-way hash mechanism.

"So all the data that comes out of the car is separated into different buckets and coded in a way that doesn't make it possible to recombine that data, to reconstruct all the private data of the driver."

Coming soon: Interacting with the outside world with your gaze

Late this year, Montague said, drivers will be able to interact with the outside world using their voice and gaze.

It's, he explained, "the ability when you drive, to look at things outside, points of interest, and ask questions about what you see... You see a restaurant, and you say, 'What are the user ratings for this restaurant?' just by looking at it. The system will look at your eye, and look where your eyes are looking at, and convert that into a point of interest, take your command, and then interpret your command on that point of interest."
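
The gaze-to-POI step can be sketched as simple geometry: compare the driver's gaze bearing against the bearings of nearby points of interest and take the closest angular match. Everything below (coordinates, tolerance, names) is a hypothetical illustration, not Nuance's method.

```python
import math

def bearing(car_xy, poi_xy):
    """Angle (radians) from the car's position to a point of interest."""
    return math.atan2(poi_xy[1] - car_xy[1], poi_xy[0] - car_xy[0])

def poi_in_gaze(car_xy, gaze_angle, pois, tolerance=math.radians(10)):
    """Return the POI whose bearing best matches the gaze direction."""
    best, best_err = None, tolerance
    for name, xy in pois.items():
        diff = bearing(car_xy, xy) - gaze_angle
        err = abs(math.atan2(math.sin(diff), math.cos(diff)))  # wrap to [-pi, pi]
        if err < best_err:
            best, best_err = name, err
    return best

pois = {"Luigi's Trattoria": (12.0, 5.0), "gas station": (-3.0, 8.0)}
print(poi_in_gaze((0.0, 0.0), math.radians(22), pois))  # Luigi's Trattoria
```

Once a point of interest is resolved, the spoken command ("What are the user ratings?") can be interpreted against it, as Montague describes.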

Down the road: Interactive smart windows

At CES, Nuance worked with partner Saint-Gobain to demonstrate the ability to display widgets and interact with a smart windshield. Effectively, the entire windshield serves as a transparent display, Montague explained. It could display information about the music you are playing or about things in the surrounding area. That should hit the market "in the next few years," Montague said.

Keeping drivers safe

Compared to chatting with your smart speaker in your living room, interacting with a smart interface is much riskier and more complex when you're driving. Consequently, smart interfaces in vehicles must make decisions that "are critical for safety," Montague said. He explained, "We have research that shows that interacting with the driver at a critical moment can increase the risk of an accident. So what you really need to do is have software that can internally decide... what to do with the interface" during a potentially high-risk event.

"When we provide the interface with that car, we are told about every single event that the car is aware of,"  Montague said.

He gave an example:

"Let's say I am interacting with the car, to select a music station... If I have, for example, a child in the back of the car, and the child takes off his seatbelt, we get an event from the internal network of the car, into the interface. If one of the sensors detects that there's a car on my left that I can't see -- from blind spot detection sensors -- then I get an event, a message from that network, into the interface. If I'm low on fuel, I get a message from the network... So the amount of events that can change the interaction is 10 to 100 times higher than what you would find on a mobile phone or a smart speaker, and you have to deal with all of that. You have to decide when one of these sensors detects an event, like a seatbelt being taken off, whether that moment, the interface should continue to interact with the driver, or stop and offer other options."

Using emotion detection for safety

Nuance is also working with a Boston-based company called Affectiva that's developing facial emotion detection technology. This technology can help keep drivers safe, Montague said.

"It can look at your eyes, at your face, and then using artificial intelligence, assess that you are tired," he explained. "Then usually the systems play a warning tone. Or there's a screen that pops up and tells you that you should stop, because you look tired."

Nuance, Montague said, is taking this kind of system one step further. "Beyond the analysis of your facial expression, the tone of your voice... we really try to engage the driver at multiple levels, to come up with a real solution to the fact that they are tired," he said. "So what does it mean? Once the technology detects that you are tired, it may first play a tone to warn you that you are tired... And then gradually, it will present different options to the driver. One option could be to engage in conversation with the driver. Another option is to practically offer an option to have a rest. If you're very tired, maybe look for a hotel... Maybe offer a place where you can stop on the way to have coffee."
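
That graduated escalation can be sketched as a simple mapping from a fatigue estimate to increasingly assertive interventions. The score scale and thresholds here are invented for illustration; the actual fusion of facial and vocal cues is far more involved.

```python
def fatigue_response(score: float) -> str:
    """Map a fatigue estimate in [0, 1] to an escalating intervention."""
    if score < 0.3:
        return "no action"
    if score < 0.5:
        return "play a warning tone"
    if score < 0.7:
        return "engage the driver in conversation"
    if score < 0.9:
        return "suggest a coffee stop along the route"
    return "offer to find a nearby hotel"

for s in (0.2, 0.4, 0.6, 0.8, 0.95):
    print(f"fatigue={s:.2f} -> {fatigue_response(s)}")
```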

Ensuring users aren't creeped out

Two and a half years ago, Nuance established a user experience research lab called Drive Labs. "For every function or new product that greets the market, we conduct some research with users, to ensure that the way we bring these products to market feels right to the user," Montague said.

Very often, it comes down to details, he said. "Little details about how you engage, how you set expectations, how you introduce the technology, how it can work inside the car. It makes a difference to the acceptance level of the consumers. So there's no simple answer. It's all about doing the right research with consumers, and we do this on a global scale... The expectations and the behavior of users change depending on the region."
