
What Toyota's $1 billion investment in AI is really about

Self-driving cars are coming, but Toyota will get a lot more for its $1 billion investment
Written by Greg Nichols, Contributing Writer

Earlier this month, and to much fanfare, Toyota announced that it would establish the Toyota Research Institute in Silicon Valley to advance its AI and robotics research. The car maker will inject $1 billion over the next five years, adding to a $50 million investment in AI research it made with MIT and Stanford in September.

For obvious reasons, the headlines have focused on Toyota's leap into autonomous driving, and comparisons with Google's driverless car program abound. To be sure, the largest car maker in the world by sales wants to remain in pole position when it comes to vehicle R&D, and while there's no timetable for a driverless Prius, the company has signaled its ambition to make its cars safer through what might be termed semi-autonomous solutions, such as collision avoidance. Toyota's Lexus division is already at work on an autonomous vehicle.

But the more immediate impact of Toyota's investment won't be found on the road.

"There's a significant impetus for safe driving, but I think manufacturing is the underreported story," says Martial Hebert, Director of the Robotics Institute at Carnegie Mellon University, who offered me some insights during a recent phone call. "I believe we'll see solutions in manufacturing from Toyota's efforts much sooner than we think."

Akio Toyoda, President of the Toyota Motor Corporation, shows an awesome photo of a younger Gill Pratt (working on a Toyota)

Gill Pratt, the well-known roboticist whom Toyota selected to lead its Silicon Valley effort, confirmed as much when he discussed the implications of the investment for Toyota's industrial automation infrastructure, which it calls the Toyota Production System (TPS). Pratt was previously in charge of DARPA's Robotics Challenge.

"As extraordinary as the T.P.S. is," he told the New York Times, "we believe it can be improved still further through the use of more data and more A.I. There may also be advances in robot perception, planning, collaboration, and electromechanical design from [the Toyota Research Institute] that will translate into improvements in manufacturing robotics."

Toyota has also specifically mentioned its interest in developing assistive technologies to help the elderly. The company is already developing a Human Support Robot designed to coexist with family members in the home and improve living conditions and overall quality of life.

Toyota's Human Support Robot

"Both of those fields are consistence with what we see in our own work. There's been an enormous expansion in interest in manufacturing and assistive technologies. What they have in common with each other and with autonomous driving is this notion that at some level robotic systems will be working with people. Autonomous driving is actually the more difficult problem because you have to contend with a completely unpredictable environment."

In other words, developing AI that enables robots to work autonomously around people in structured environments like factories and warehouses is a more approachable problem than trying to get an autonomous car to drive flawlessly in the unstructured environment of the open road. Resting somewhere in the middle of those two extremes is the semi-structured environment of a senior living center.

Though the specific applications seem very different, all three require significant work in machine perception, machine learning, and AI to enable machines to understand their environment, understand human intent, and plan tasks accordingly.

"Understanding human intent is a big one," Hebert confirms, offering a basic example. "Suppose I have a robot in my workspace. It will need to observe my hand moving, understand that it is moving in a forward direction along a trajectory, and therefore understand that, given what it knows about the environment, my hand is going to grab a cup. If it knows that, the robot can make much deeper decisions about how to stay out of my space or how to help me.

"You can clearly see how you can transfer that capability to manufacturing, to assistive technologies, and to autonomous driving."

When you line those three targets up -- factories, senior centers, and the open road -- you start to get a clear idea of where we'll be seeing the fruits of the research carried out by Toyota and by academic institutions like Carnegie Mellon in the coming years.
