
Driven to distraction: Why IBM's Watson is getting onboard with self-driving vehicles and impatient passengers

IBM has teamed up with Local Motors for a new autonomous vehicle. Here's how it will handle difficult passengers -- and why you won't be able to buy one.
Written by Jo Best, Contributor

Local Motors CEO and co-founder John B Rogers, Jr with the Olli minibus.

Image: Rich Riggins/Feature Photo Service for IBM

Watson, the AI system first built by IBM to win the popular game show Jeopardy! and later adapted for industries like banking and healthcare, is now going to work in the automotive business.

Last month, IBM announced a partnership with auto-maker Local Motors that will see Watson used onboard a new autonomous public transport vehicle. While the business of driving the vehicle, called Olli, will be the job of Local Motors' technology, Watson will be there to chat to passengers -- the vehicle's 'human face', as it were, taking on the role of a bus conductor or train guard.

As any bus conductor or train guard will tell you, being the face of public transport can mean running up against people who want to take out their frustrations about a delayed journey or a stressful day on someone -- anyone -- whether it's their fault or not.

Watson has been prepped for these situations, and will never be without human back-up for the toughest encounters.

"Passengers tend not to be the most patient people in the world... We've had to design for that a little bit, but for the most part, once people start to deviate from the standard interactions and the types of things people might want, it's not like we have to counter that, we don't have to design a heckling response," Brett Greenstein, IBM's VP for the Internet of Things, told ZDNet.

"As with anyone in services, [with Watson] when people start to get really difficult, it's like: 'these are the things I can help you with, and if I can't, would you like to speak to my manager?' There's still always an override in the vehicle, there are cameras installed in Olli, and there's a service person in a central location who can manage several cars at the same time, and watch for truly out-of-whack situations."

The Watson system will also be able to chat to people, allaying their fears about being driven around a city by a vehicle with no driver, offering information about the local area, or talking to travellers about their destination or journey.

How do you get passengers to start opening up and talking out loud to a machine? It all begins, says Greenstein, with Olli needing to be both visually appealing and chatty.

"It starts with a cute name and a cute design of the vehicle. There's an iPad terminal in the vehicle, and having it be attractive and appealing and cute is good... With the dialogue itself, with the Watson natural language system, we can frame types of interactions, so you could have a conversation back and forth rather than just ask a question, get an answer. It can remember previous questions, for example, and different grammar structures."

IBM's Node-RED programming tool allows Watson to pick out which words matter in a conversation and extract the right meaning from what a passenger is saying, even if they might be saying it in a non-obvious way.

The system will use four Watson APIs -- Speech to Text, Natural Language Classifier, Entity Extraction, and Text to Speech -- to allow travellers on Olli to talk directly to Watson, and get responses back in spoken natural language.
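The four-stage flow described above can be sketched in miniature. This is a hypothetical illustration, not IBM's implementation: each stage is a stand-in function rather than a real Watson API call, and the keyword classifier and capitalisation-based entity extractor are invented for the sketch.

```python
# Hypothetical sketch of the four-stage pipeline: speech-to-text,
# intent classification, entity extraction, text-to-speech.
# All stage functions are stand-ins, not real IBM Watson API calls.

def speech_to_text(audio: bytes) -> str:
    # Stand-in: a real system would send audio to a speech recognition service.
    return audio.decode("utf-8")

def classify_intent(text: str) -> str:
    # Stand-in natural language classifier: naive keyword matching.
    lowered = text.lower()
    if "weather" in lowered:
        return "weather_query"
    if "eat" in lowered or "restaurant" in lowered:
        return "dining_query"
    return "general_query"

def extract_entities(text: str) -> list:
    # Stand-in entity extraction: treat capitalised words as candidate names.
    return [w.strip("?.,!") for w in text.split() if w[:1].isupper()]

def text_to_speech(text: str) -> bytes:
    # Stand-in: a real system would synthesise spoken audio.
    return text.encode("utf-8")

def handle_utterance(audio: bytes) -> bytes:
    # Chain the four stages to turn a spoken question into a spoken reply.
    text = speech_to_text(audio)
    intent = classify_intent(text)
    entities = extract_entities(text)
    reply = f"intent={intent}; entities={entities}"
    return text_to_speech(reply)
```

The point of the structure is that each stage is independently replaceable: the toy classifier could be swapped for a trained model without touching the rest of the chain.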

Watson IoT for Automotive, the system that will be used on Olli, is a version of the core Watson that started life as a gameshow winner and has since been customised for applications in verticals including scientific research, pharmaceuticals, travel, and retail.

The company first started considering taking Watson into the automotive industry after launching its IoT platform last year, and subsequently being approached by Local Motors, which IBM had worked with before, about producing a self-driving vehicle.

Whenever Watson is readied for its next application or industry, the system is fed extra information to train it for its new use. According to Greenstein, the development work needed to get Watson ready for Olli took a small engineering team around three months.

To build up a bank of knowledge giving Watson context about what passengers on Olli might ask, IBM has launched a web tool called 'ask Olli' where locals can enter their own questions online. Typically, people ask about the weather, where to eat, and what places to visit. For its responses, Watson can draw on information sources including live traffic, maps, and weather data, as well as offer Yelp reviews.

IBM has also been training Watson to better understand the variety of accents and speaking styles that passengers may have.

"In order to understand them you need as much context as possible. If somebody walks up to you and starts speaking at random in an accent you're not familiar with, it can take you a few seconds to tune in and figure out what they're talking about, but you don't really have that time in the vehicle. So, we use the IoT data about the vehicle and its sensors, and its environment, so when people ask questions we already have a general sense about what sort of questions they might be asking."

In Olli's case, Watson can take data from the vehicle's sensors about speed, direction, and destination, as well as local weather and traffic, prioritise which elements are important, and then compile an answer.
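The idea of prioritising context before composing an answer can be illustrated with a small sketch. The sensor readings and the intent-to-context mapping below are invented for the example; a real system would pull live telemetry and use a learned relevance model rather than a hand-written table.

```python
# Hypothetical sketch: rank available context sources by relevance to the
# question's intent, then compose an answer from only the relevant facts.

# Invented snapshot of vehicle and environment data.
CONTEXT = {
    "speed_kmh": 18,
    "destination": "the waterfront stop",
    "weather": "light rain",
    "traffic": "moderate",
}

# Assumed mapping of question intents to the context keys that matter most.
RELEVANCE = {
    "arrival_query": ["speed_kmh", "traffic", "destination"],
    "weather_query": ["weather", "destination"],
}

def compose_answer(intent: str) -> str:
    # Fall back to all context if the intent has no relevance entry.
    keys = RELEVANCE.get(intent, list(CONTEXT))
    facts = ", ".join(f"{k}={CONTEXT[k]}" for k in keys)
    return f"Answering a {intent} using: {facts}"
```

Keeping relevance separate from the raw context means new sensors can be added without changing how any existing question type is answered.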

However, given the complexity of tasks that Watson can already carry out, from winning Jeopardy! to sitting parts of the US' general medical licensing board exams, why did IBM decide to make Watson only Olli's greeter, not its driver, especially given how every sufficiently large tech and auto company is working on a self-driving system?

"We have to focus on where our strengths are, and Watson is focused very much on the understanding of unstructured data -- audio, video, image, voice, text -- as well as understanding people -- intention, personality, tone.

"Working with more specific control systems like manufacturing, robotics, self-driving vehicle systems: I'm not saying we couldn't do all those things, but they're specialised and they're focused, so it's better for us to add image processing to multiple systems that might control a robot car, rather than to build a robot car," Greenstein said.

"Control systems are real time, they're operated locally, they behave differently than a cognitive system, where we use a lot of data from the cloud and a lot of live data."

Olli is already being trialled in National Harbor, Maryland and will soon also be piloted in Miami, and then Las Vegas. It's able to carry up to 12 people around the city's roads, though Greenstein added that IBM has had conversations with other public authorities about trialling the vehicles, and could even see them one day being used to transport elderly passengers, who might previously have used golf carts to get around. It may also be used to move visitors between rides at theme parks.

Olli is meant to be ridden around a particular set of streets at no great speed, where passengers won't necessarily need a seatbelt. It's not likely, then, that you'll be able to replace your family car with a consumer version of Olli.

"When we were doing the demos, we'd ride on Olli and bring different groups of people with us, and my favourite question was: 'can I have one for myself?' It's really designed as a city vehicle, it's aimed at cities and communities."
