
Israeli vision technology will bring gesture control to cars, robots

Computer vision company signals a new era for natural human-machine interaction.
Written by Greg Nichols, Contributing Writer

Gesture control was a big theme at CES 2016, with a new crop of toys and gadgets that users control with a few waves of the hand. But most of those CES devices, like most of the gesture-control gizmos on the market now, rely on gloves, watches, and wristbands to capture user movement and convert it into commands.

That's not ideal. Wearable sensors are easily lost, and devices that rely on them can't readily be operated by multiple new users in high-traffic situations. Imagine a gesture-controlled check-out console at a grocery store: it's impractical (and unhygienic) for every shopper to don a wristband. Wearable gesture-control sensors also work best on the hands, which limits applications. Automakers, for example, would never ship a gesture-control feature that requires a driver to take their hands off the wheel. A nod of the head might be a safer gesture for drivers, and unless you can get drivers to wear flower crowns studded with sensors, that's not going to happen with wearables.

The answer is gesture control technology that doesn't require the user to wear any sensor at all: gesture control that relies on smart computer vision rather than worn motion capture. That's the technology Israel-based eyeSight Technologies has been developing.

The company delivers natural user interactions for a range of consumer electronics, including mobile phones, tablets, PCs, TVs, and virtual reality (VR) headsets, allowing touch-free control with the swipe of a hand or the point of a finger. eyeSight also offers an IoT and smart-home product called singlecue, which lets users control the devices they already own with touch-free finger gestures.

The company's early lead in developing computer vision for consumer products is starting to pay off, with partnerships with companies like Lenovo, Toshiba, and Philips. And just last week, Chinese conglomerate Kuang-Chi announced a $20 million investment in eyeSight to bring embedded computer vision to a range of growing technology sectors, including the internet of things (IoT), robotics, and automotive.

Dorian Barak, the Managing Partner of Indigo Global, a financial adviser on the investment, explains, "Kuang-Chi is one of China's leading tech groups, with strength in a range of technologies including robotics and autonomous motion. Their decision to invest in eyeSight is a strong endorsement of the Israeli company's unique capabilities and vision. As their cornerstone investment here in the startup nation, it sets a high standard for Israel-China collaboration."

Dr. Ruopeng Liu, chairman of Kuang-Chi, called out robotics applications as a particular area of interest. With personal-assistant robots starting to become a reality, we're going to see more and more platforms that wed deep learning for language with computer vision to create natural user interfaces that respond to both speech and body language in ways that feel human.
