Did Microsoft beat Apple to dual-sided touch technology?

Written by David Morgenstern, Contributor

A recently disclosed Apple patent describes a number of devices using a two-sided touch panel and, to no one's surprise, the online speculation has begun. With a patent that describes "embodiments" of a technology, the dreams can go anywhere, it seems.

However, in some of these embodiments, Microsoft appears to have been there first!

Unwired View posted the Dual-sided Track Pad patent document and an accompanying piece with imaginative descriptions of iPhones and MacBook tablets using the technology (with illustrations, even).

Here's a bit from the patent's technology summary:

The capacitive array element may be a dual-sided panel that is capable of sensing touch from either side and sending signals indicative of the touches to a host device (e.g. a desktop computer, a laptop computer, a digital music player or a mobile telephone unit). The capacitive array element may be able to sense multiple simultaneous touches and distinguish on which side the touches occur. A connected processor unit, either in the device or in the host system, may be adapted to interpret the signals as a cursor or view movement with six degrees of freedom.
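To make the patent language a bit more concrete, here is a minimal sketch of what "sensing multiple simultaneous touches, distinguishing which side they occur on, and interpreting them as movement with six degrees of freedom" might look like in code. This is my own illustration, not anything from the patent: the class names, the front/back mapping (front drags as x/y translation, back drags as yaw/pitch rotation), and all identifiers are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Touch:
    side: str   # "front" or "back": the dual-sided panel reports which face was touched
    x: float
    y: float

def split_by_side(touches):
    """Group simultaneous touches by the face they occurred on."""
    front = [t for t in touches if t.side == "front"]
    back = [t for t in touches if t.side == "back"]
    return front, back

def interpret(prev, curr):
    """One arbitrary mapping of per-side drags onto a six-degree-of-freedom
    delta: front-side drags translate (dx, dy), back-side drags rotate
    (yaw, pitch). Touches are paired by position in the two frames."""
    dof = {"dx": 0.0, "dy": 0.0, "dz": 0.0, "yaw": 0.0, "pitch": 0.0, "roll": 0.0}
    for p, c in zip(prev, curr):
        if c.side == "front":
            dof["dx"] += c.x - p.x
            dof["dy"] += c.y - p.y
        else:
            dof["yaw"] += c.x - p.x
            dof["pitch"] += c.y - p.y
    return dof
```

The interesting part is simply that the event carries a side flag at all; a conventional trackpad has no equivalent, which is why the patent makes a point of the panel "distinguishing on which side the touches occur."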

In some embodiments, the track pad device includes a display element and the capacitive array element is translucent. The display element and the array element may be configured with respect to each other, possibly including a configuration where the array lies over the display screen so that the display screen is viewable through the array element, in effect forming a touch screen. The device may have a touch screen mode of use activated by the device's being configured with the array element over the display screen, which allows a one-to-one correspondence between the array element and the display screen. The device may interpret and respond to the signals from the array element in a configurable way.

As long as it's configurable! Whew, I was worried there. This patent is tough reading. (And am I the only one who's worried about patents that call expressions of a technology "embodiments"?)

The Unwired View article takes the patent device and imagines a number of "expressions" of its own, including an iPod nano-sized iPhone with a clamshell input pad, a Mac tablet notebook with a virtualized keyboard, and a standard notebook with a clear trackpad that makes a porthole (more of a port-square, really), letting users see a section of the screen when the lid is closed.

The authors seem to think there's great demand for a clamshell form factor about the size of an iPod nano, one that's really, really small and even harder to type on than the current iPhone. Is that what we want? I find the current iPhone touch form factor usable by folks with small and big paws alike.

Some of the "expressions" described in the article sound familiar because they have been sighted at Microsoft's show-and-tell technology fairs.

The first is Windows SideShow, a feature of Windows Vista. It can display information from a notebook even when the system is turned off or sleeping. The information is held in a small RAM cache and shown on a secondary display, or it would be, if a notebook maker would build such a machine.

It would be high irony if Apple shipped a notebook with something like SideShow, albeit done very differently, before Microsoft's cost-conscious technology partners (every other PC maker besides Apple) got around to implementing it.

The two-sided touch concept reminded me of a Microsoft technology demonstration last summer. Created with Mitsubishi Electric Research Laboratories, the technology is called LucidTouch. A video is available on the MERL site.

The idea here is that you can't really see what's under your fingers. By accepting input from both sides of the device and then displaying digital images of the fingers under a map (with active points at the fingertips), users can see more of the map and interact with it in a richer way.

Here's a bit from MERL's technical introduction:

Many direct touch input devices provide only two input states: out-of-range and dragging, the assumption being that the user's finger or stylus provides feedback in order to anticipate the point of interaction. When the hands are behind the display, this visual tracking is not possible. Our pseudo-transparency approach allows users to see their hands as they are attempting to acquire a target from the back of the device, thus solving not only the occlusion problem, but also the lack of tracking feedback. In order to overcome the fat finger problem, simple computer-vision techniques are applied, allowing each finger's touch points to be visualised prior to making contact with the touchpad. As a result, LucidTouch enables fast and intuitive land-on selection, in contrast to the take-off selection techniques other opaque devices employ.
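Two small details in that description can be sketched in code: a touch on the rear of the device is horizontally mirrored when projected onto the front display (your finger moving right, as seen from behind, moves left on screen), and "land-on" selection picks the target nearest the fingertip at the moment of contact rather than waiting for finger lift. This is purely my illustration of those two ideas; the function names and the radius parameter are invented, and the real LucidTouch prototype used camera-based computer vision rather than anything this simple.

```python
def back_to_screen(x, y, width):
    """Project a rear-panel touch onto front-display coordinates.
    The rear x axis runs the opposite way, so it is mirrored."""
    return width - x, y

def land_on_select(targets, touch, radius=10.0):
    """'Land-on' selection: choose the target closest to the projected
    fingertip at the instant of contact (within `radius` pixels),
    in contrast to 'take-off' selection, which commits on finger lift."""
    best, best_dist = None, radius
    for name, (tx, ty) in targets.items():
        dist = ((tx - touch[0]) ** 2 + (ty - touch[1]) ** 2) ** 0.5
        if dist <= best_dist:
            best, best_dist = name, dist
    return best
```

The mirroring step is why the "pseudo-transparency" rendering matters: without it, feedback from a behind-the-display hand would feel reversed.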

It looks interesting. And wacky.

All of this, from the imaginings around the Apple patent to Microsoft's LucidTouch, sounds very complicated to me. Reliability in mobile computing is essential to usability, and these schemes could set a new high score for electro-mechanical points of failure.
