Tucked away in the corner of Google's after party last night was a large screen with lots of people touching, tapping and zooming pictures and maps in and out. Chatting away with the people trying it out was Jeff Han, the man behind Perceptive Pixel and the massively multitouch system that most US TV channels used to analyse results on election night in 2008. It's a pressure-sensitive touch screen that uses infrared (and some very clever software) to sense where and how hard you're pressing.
It's three years since I first came across Perceptive Pixel, after Han's impressive TED demo; since then the company has concentrated on private customers who can pay the "six figures" that a large screen like the one he has here costs. Han would love to talk about the cool things clients are doing, but many of them are military - he hints that a screen is headed for the White House situation room - and it's all hush-hush. He does point out that part of the appeal is that 'even a four-star general' can work the system himself, instead of being put off by the complexity of working through a system with a keyboard, mouse and menus. And he suggests that smaller screens for home and business users are on their way: "we're just about to go commercial". He's been on a hiring spree; the company has a new office in Portland (because that's where he found people with the particular skills he needed) and he's in town setting up another office in Palo Alto.
Has the market settled down? I ask him. "That's not how I'd put it!" he says in surprise; "that makes it sound as if it's all settled, and it's only just starting."
The nearest similar system on the market is Microsoft Surface; he doesn't like the table format himself - "it's always upside down for somebody" - although he's keen on the idea of a personal drafting table. He's not interested in the hospitality business that Microsoft is going after, or in using tags to interact with objects - but he does like using a pen as well as touch on his screen. It's just a standard pen, he says, showing me a biro ("it's nothing special") and writing on the screen with it (without even taking the top off) as he pans around the election map. He shows me one feature the screen doesn't have yet but that he wants to work on: "wouldn't it be nice if the system could tell the difference between when you do this [he scribbles in one direction] and when you do this [he changes his position and scribbles from the other direction]?" So far, 'passive' touch systems that don't need special pens and equipment can't tell which direction you're approaching from, or when you get close to the screen without touching it; if Han can add that, it would give the system another advantage.
The screen was at Google I/O as part of a mini version of O'Reilly's Maker Faire, which runs this weekend in San Mateo. He's not staying for that, but he's no stranger to hardware hacking. "I still have the scar from a soldering iron I got when I was six. We had a traditional house, all bare feet … When I was twelve I was making my own laser. I remember the electric shock I got - ow!"