Whatever rumours you want to believe regarding the release date of the next iPhone (15 November?), its hardware features (12-megapixel camera!), its display, or even its name (iPhone 5S? iPhone 6?), two things are certain: one, the unveiling of Apple's new device is not far away; and two, it stands to reason that said device will have some barnburner features in order to regain market share.
Thus it's likely that Apple is going to try to out-Samsung Samsung, meaning that it will take whatever features the Galaxy devices have and do them one better in the new iPhone. One of those features is eye-tracking: on the Galaxy devices, you can pause or stop a video playing on the screen simply by looking away. The phone 'knows' when you've stopped watching, assumes you still want to see the rest of the show, and pauses the video until you're ready.
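The pause-on-look-away behaviour described above boils down to a small state machine. The sketch below is a hypothetical illustration in Python, not Samsung's or uMoove's actual code: it assumes a per-frame "viewer looking" signal from some face-detection layer, and adds a grace period so a blink or brief glance doesn't interrupt playback.

```python
class AttentionPauser:
    """Pause playback when the viewer looks away, resume when they return."""

    def __init__(self, grace_frames=15):
        # Consecutive "looking away" frames tolerated before pausing,
        # so a blink or quick glance doesn't stop the video.
        self.grace_frames = grace_frames
        self.away_count = 0
        self.paused = False

    def on_frame(self, viewer_looking):
        """Feed one frame's face-detection result; returns True while paused."""
        if viewer_looking:
            self.away_count = 0
            self.paused = False   # viewer is back: resume
        else:
            self.away_count += 1
            if self.away_count >= self.grace_frames:
                self.paused = True  # sustained look-away: pause
        return self.paused
```

At 30 frames per second, the default grace period of 15 frames means the video only pauses after roughly half a second of sustained inattention.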
If Yitzi Kempinski, CTO and co-founder of uMoove, knows whether that feature — or a more advanced application of real eye-tracking — will be included on the next iPhone, he isn't letting on. Many companies are working with uMoove though, and within months, the first apps and games that use the company's completely software-based eye- and face-tracking technology will be on the market.
uMoove, which has been in business for three years but was in stealth mode until just a few months ago, now offers eye- and face-tracking that, Kempinski said, will work with almost any smart device.
"The Kinect and others that can do some eye-tracking require dedicated equipment, like a camera, infrared, etc. What we have is completely software-based, and will be able to work with any iPhone or iPod, for example. If you install an app that uses our SDK, you will be able to magically use eye-tracking on your two-year-old device," he said.
The most obvious use of eye-tracking remains gaming. "Imagine driving a virtual motorcycle using your gaze and eye movements," he said. "It makes sense because drivers online and off move their heads around."
Eye- and face-tracking are useful for a lot more than games, Kempinski said: "You could use face detection, for example, to replace the functions of a mouse; by looking down you would scroll down on a page, just like you would with a mouse." These functions are already available in the uMoove SDK, he said, and eye- and face-tracking can enhance the overall computing experience as well.
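The gaze-as-mouse idea can be sketched as a simple mapping from vertical gaze angle to a scroll amount. This is an illustrative guess at the approach, not uMoove's implementation; the dead zone, gain, and sign convention are all assumptions, included so that ordinary reading near the centre of the screen doesn't trigger scrolling.

```python
def scroll_delta(gaze_pitch_deg, dead_zone=10.0, pixels_per_degree=4.0):
    """Map a vertical gaze angle (negative = looking down) to scroll pixels.

    Returns a positive value to scroll the page down, negative to scroll up,
    and 0 while the gaze stays inside the central dead zone.
    """
    if abs(gaze_pitch_deg) <= dead_zone:
        return 0  # gaze near the centre of the screen: no scrolling
    # Scroll proportionally to how far past the dead zone the viewer looks.
    excess = abs(gaze_pitch_deg) - dead_zone
    direction = 1 if gaze_pitch_deg < 0 else -1  # look down -> scroll down
    return int(direction * excess * pixels_per_degree)
```

Called once per frame, a function like this would feed small, continuous scroll increments to the page while the user looks toward the bottom or top of the screen.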
"By using these eye and head 'gestures' we're able to create an even more immersive experience than you get with hand gestures," he said. "If you're using hand gestures to control an app, you're by definition excluding touch, since you can't be gesturing and touching with the same hand at the same time. Using eye and face, you can add an additional layer to the computing experience, one that at this time is not in common use."
Far less ambitious functions can eat up gobs of processing power, but Kempinski claimed that uMoove's system doesn't hammer the hardware in the same way. "We've tested our SDK in apps on a Galaxy S3 and other devices. It only takes up two percent of the processor in real-time," he said.
"Eye-tracking has been around in some form for a long time, and all the tech was written for hardware-based systems. We had to develop our system from scratch. Not overusing resources was definitely a big challenge, but it was one we had to overcome if we wanted the system to be useful commercially."
Kempinski said the system takes advantage of most of the sensors on smart devices, from the 2D camera and light sensors to the accelerometer.
"You have to compensate for things like the shaking of the camera while a person is holding the device, because that could distort the eye-tracking," Kempinski said. "You also have to make sure that the tracking works in various levels of light, even low light, and in the mobile environment the light changes when you move just a few inches.
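One plausible way to use the accelerometer for the shake compensation Kempinski describes is to estimate how far the device itself has moved and subtract that from the raw gaze point. The sketch below is a hypothetical illustration, not uMoove's method: the class name, the pixel-space device shift, and the simple low-pass filter are all assumptions.

```python
class ShakeCompensator:
    """Subtract estimated device motion from raw gaze points."""

    def __init__(self, alpha=0.3):
        # Smoothing factor for the accelerometer-derived shift estimate;
        # lower values trust the noisy sensor less.
        self.alpha = alpha
        self.est_dx = 0.0
        self.est_dy = 0.0

    def update(self, raw_gaze, accel_shift):
        """Take a raw gaze point and the device shift (both in pixels)
        for this frame, and return the motion-corrected gaze point."""
        dx, dy = accel_shift
        # Low-pass filter the shift so single noisy samples don't overcorrect.
        self.est_dx = (1 - self.alpha) * self.est_dx + self.alpha * dx
        self.est_dy = (1 - self.alpha) * self.est_dy + self.alpha * dy
        gx, gy = raw_gaze
        return (gx - self.est_dx, gy - self.est_dy)
```

The idea is simply that if the phone lurched five pixels to the right, an apparently five-pixel gaze shift to the left probably came from the hand, not the eyes.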
"While we call it eye-tracking, the truth is there is much more behind it, [including a slew of patents]. The eyes often move involuntarily and you have to figure out whether a particular movement was meant to accomplish a task or is something to ignore." Not to mention that you may want eye-tracking in some contexts, such as moving an object in a game, "but there are times you need to ignore it, like when a player wants to check their score on the screen".
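A common way to separate deliberate fixations from involuntary flicks of the eye is a dwell test: a gaze gesture only counts if the eyes stay in roughly the same spot for a minimum time. The function below is a minimal sketch of that idea under assumed thresholds; it is not drawn from uMoove's SDK.

```python
import math

def detect_dwell(gaze_samples, radius=30.0, min_samples=10):
    """Return True if the last `min_samples` gaze points all stay within
    `radius` pixels of the first of them, i.e. a deliberate fixation
    rather than a saccade or involuntary jitter."""
    if len(gaze_samples) < min_samples:
        return False
    recent = gaze_samples[-min_samples:]
    ox, oy = recent[0]
    return all(math.hypot(x - ox, y - oy) <= radius for x, y in recent)
```

With gaze samples arriving at, say, 30 Hz, requiring 10 stable samples means a gesture fires only after about a third of a second of steady attention, long enough to skip the quick glance at a score counter that Kempinski mentions.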
While eye-tracking may still be considered a novelty in computing terms, not that long ago the same was said about touch.
"Like touch and voice, eye and face provide another layer of the computing experience, and another set of options for users," he said. "Just like touch did not, and probably won't in the near future, replace keyboards and mice, eye and face will be added to the mix to make interfacing with devices more fun and productive. We lived just fine before touch, but it brought another level of interaction to computing, and so will this."