
Do we need a multi-touch gesture alphabet?

Written by Adrian Bridgwater, Contributor

Developers working in the touch screen arena are, it appears to me, wielding some of the greatest wow-factor breakthroughs around at the moment. It was only at Microsoft’s ReMIX 08 rich media conference last September that I first saw the company exhibit its ‘Surface’ table. The developers and web designers crowded around it like flies, so much so that I couldn’t get near it. The first time I did get to play with one was actually at a Curry’s Megastore when I ordered a new fridge last week.

So as this development stream becomes more widespread, my question is this: do we need to agree on a set alphabet for multi-touch gestures and label them all now so that we avoid fragmentation and incompatibilities in this space in the future?

Last week I mentioned Embarcadero’s sneak peek ‘let’s get three press releases out of one story’ programme for releasing its RAD Studio 2010 product. This week the company is sneakily peeking its second preview of this tool, which it claims will enable developers to build touch-based GUI, tablet, touchpad and kiosk applications with a flexible touch-enabled framework. But how many touch gestures has Embarcadero provisioned for? The answer is 30, and these include left, right, up, down, scratch-out and interactive multi-touch gestures like pan, zoom and rotate.
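Embarcadero’s actual framework is a Delphi/C++ affair and its API isn’t reproduced here, but just to make the ‘alphabet’ idea concrete, here is a hypothetical Python sketch of what a shared, named gesture set might look like to a developer. The gesture names are illustrative, drawn from the list above, not from any real specification:

```python
from enum import Enum, auto

class Gesture(Enum):
    """A hypothetical shared 'alphabet' of touch gestures (illustrative names only)."""
    SWIPE_LEFT = auto()
    SWIPE_RIGHT = auto()
    SWIPE_UP = auto()
    SWIPE_DOWN = auto()
    SCRATCH_OUT = auto()
    PAN = auto()
    ZOOM = auto()
    ROTATE = auto()

def dispatch(gesture, handlers):
    """Invoke the handler registered for a recognised gesture, if any."""
    handler = handlers.get(gesture)
    return handler() if handler else None

# Example: an application registers behaviour against the shared names.
handlers = {Gesture.ZOOM: lambda: "zoom in on movie poster"}
dispatch(Gesture.ZOOM, handlers)   # runs the zoom handler
dispatch(Gesture.PAN, handlers)    # no handler registered, returns None
```

The point of agreeing the names up front is exactly this: two applications on two operating systems could register different handlers against the same gesture vocabulary, rather than each inventing its own.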

Who says there are 30 multi-touch gestures, then, or that Embarcadero has been intuitive enough to pick the most logical group? I’m sure they are very sensible choices. But what if we all start using touch tables, iPhones and kiosks that much more interactively and we start to develop a new set of touch behaviours? Who is going to keep track of this and try to provide some nomenclature to classify the way we work with these machines so that developers can programme with optimum efficiency?

Let me give you an example. When I am in the States I use a fantastic DVD rental service called RedBox, where the movies are a dollar a night. The kiosk interface is OK, but it tends to time out and think I’ve left the scene when in fact I am still standing there trying to decide which Robert De Niro masterpiece I am going to watch. What if we came up with a new touch behaviour? Say I move my finger in a circle, meaning I am pondering. That way the system might know it needs to give me more time or more information.
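How hard would that pondering gesture be to recognise? Not very, as it happens. Assuming the kiosk exposes the finger’s path as a sequence of (x, y) samples, a minimal sketch is to sum the signed angle the path sweeps around its own centroid and call it a circle once it passes a full turn. This is my own illustrative Python, not any vendor’s API:

```python
import math

def total_swept_angle(points):
    """Sum of signed angle changes around the path's centroid, in radians."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    swept = 0.0
    for a, b in zip(angles, angles[1:]):
        d = b - a
        # unwrap jumps across the +/- pi boundary so the sum stays continuous
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        swept += d
    return swept

def is_pondering_circle(points, min_turns=1.0):
    """True if the touch path loops at least `min_turns` full circles."""
    return len(points) > 2 and abs(total_swept_angle(points)) >= min_turns * 2 * math.pi

# A finger tracing one full circle qualifies; a straight swipe does not.
circle = [(math.cos(t), math.sin(t)) for t in
          (i * 2 * math.pi / 32 for i in range(33))]
is_pondering_circle(circle)                        # a full loop: pondering
is_pondering_circle([(float(i), 0.0) for i in range(10)])  # a swipe: not pondering
```

A production recogniser would want noise filtering and a time window, but the underlying maths really is that small, which is rather the point: the hard part is not detecting the gesture, it is agreeing on what the gesture means.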

Obviously this is just thinking out loud. But what if it becomes a real issue? After all, there are about 100 touch receptors in each of your fingertips. We have an alphabet on our keyboards. Don’t we need a set of standards for touch, especially if we are to run with good interoperability across multiple operating systems?
