New sensor tech makes almost any surface a touch surface

User interface company Sentons has released a developer smartphone that shows how its pressure-sensitive ultrasound-based touch technology can trigger a range of actions.
Written by Ross Rubin, Contributor

As I wrote in my column about the Surface Neo's dual display, smartphones are encountering diminishing returns when it comes to increasing screen-to-body ratio. Even as it approaches 100%, screen-to-body ratio paints an incomplete picture, since it measures utilization only on the front of a device. Now, technology providers are looking to make better use of the non-screen parts of the phone via sensor-driven, pressure-sensitive technologies. Sentons' SurfaceWave technology, for example, uses ultrasonic waves to precisely measure where, and with how much pressure, a finger or other object presses against a surface. CEO Jess Lee likens the approach to the way a submarine's sonar bounces off undersea objects.
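To make the sonar analogy concrete, here is a deliberately simplified sketch of how a touch along one edge might be located and its pressure inferred. This is not Sentons' actual SurfaceWave algorithm; the transducer layout, wave speed, and the idea of using signal damping as a pressure proxy are all assumptions made purely for illustration.

```kotlin
// Toy illustration of the "sonar" idea; NOT Sentons' SurfaceWave algorithm.
// Assumes two ultrasonic transducers at either end of a 150 mm edge and a
// made-up wave speed in the material; every number here is illustrative.

const val EDGE_LENGTH_MM = 150.0
const val WAVE_SPEED_MM_PER_US = 3.0   // assumed ultrasonic wave speed

data class TouchEstimate(val positionMm: Double, val pressureProxy: Double)

fun estimateTouch(
    arrivalAtLeftUs: Double,    // when the disturbance reaches the left transducer
    arrivalAtRightUs: Double,   // when it reaches the right transducer
    baselineAmplitude: Double,  // signal amplitude with no finger present
    dampedAmplitude: Double     // amplitude while the finger absorbs energy
): TouchEstimate {
    // Time difference of arrival: a touch nearer the left transducer is
    // "heard" there first, so the difference is negative on the left half.
    val deltaUs = arrivalAtLeftUs - arrivalAtRightUs
    val positionMm = (EDGE_LENGTH_MM / 2.0 + deltaUs * WAVE_SPEED_MM_PER_US / 2.0)
        .coerceIn(0.0, EDGE_LENGTH_MM)

    // A harder press absorbs more ultrasonic energy, so treat the relative
    // drop in amplitude as a crude stand-in for pressure.
    val pressureProxy = (1.0 - dampedAmplitude / baselineAmplitude).coerceIn(0.0, 1.0)

    return TouchEstimate(positionMm, pressureProxy)
}
```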

We have already seen simple examples of this kind of functionality in the side-squeeze features of HTC and Google Pixel phones, as well as the non-mechanical Home buttons on the pre-Face ID iPhone 7 and 8. ASUS uses Sentons' technology in conjunction with haptic feedback for the AirTriggers feature in its lavishly accessorized ROG gaming phone.

Beyond pressure sensitivity, these alternatives to capacitive sensing offer a few advantages. They work with gloved hands and other implements such as paintbrushes, and even work underwater. They can also work in conjunction with capacitive touchscreens. Apple's Force Touch, which did not use Sentons' technology, was an example of such a combination, but Apple abandoned the input method.

Such a rejection may seem like bad news for pressure sensitivity. Force Touch faced obstacles as a late addition to the iOS interface, though. For example, force-touching an icon in the launcher would bring up a context menu, whereas long-pressing that icon would bring up the familiar quivering as a signal to rearrange or delete the app. But in other operating environments that lack Force Touch, such as Android and Windows' touch UI, a long press brings up the context menu. As the activity that requires more effort, Force Touch should arguably have triggered the rarer behavior of icon quivering while a long press brought up the context menu, but that would have required undesirable relearning. (Of course, Apple could avoid this messy scenario if it simply provided a more efficient way to rearrange and maintain icon layouts on the home screen.)
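Stated in code, the tension boils down to which action each gesture should map to. Here is a hypothetical sketch of the two policies discussed above; the thresholds, action names, and functions are assumptions for illustration, not any platform's real API.

```kotlin
// Hypothetical sketch of the two gesture policies discussed above.
// Thresholds and action names are illustrative, not a real platform API.

enum class IconAction { OPEN, CONTEXT_MENU, REARRANGE }

const val LONG_PRESS_MS = 500L     // assumed long-press threshold
const val FORCE_THRESHOLD = 0.6    // assumed normalized pressure threshold

// The policy Apple shipped: a force touch opens the context menu,
// while a plain long press enters the rearrange ("quivering") mode.
fun shippedPolicy(durationMs: Long, pressure: Double): IconAction = when {
    pressure >= FORCE_THRESHOLD -> IconAction.CONTEXT_MENU
    durationMs >= LONG_PRESS_MS -> IconAction.REARRANGE
    else -> IconAction.OPEN
}

// The mapping that would match other platforms: a long press shows the
// context menu, and the higher-effort force press triggers the rarer
// rearrange mode, at the cost of relearning for existing iOS users.
fun consistentPolicy(durationMs: Long, pressure: Double): IconAction = when {
    pressure >= FORCE_THRESHOLD -> IconAction.REARRANGE
    durationMs >= LONG_PRESS_MS -> IconAction.CONTEXT_MENU
    else -> IconAction.OPEN
}
```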

So, there's far more to explore around such functionality on a smartphone. Sentons this week released a reference phone that offers developers a range of touch-driven gestures for experimentation. For example, in addition to the virtual shoulder buttons implemented by ASUS, the reference phone can zoom the camera in and out as one runs a finger along the border of the display. This approach doesn't obstruct the image and could be particularly useful now that smartphones are sporting wider zoom ranges than ever.
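As a rough illustration of how such a border gesture could drive the camera, here is a hypothetical mapping from the finger's position along the edge to a zoom factor. The function, edge length, and zoom range are assumptions for the sketch, not Sentons' or ASUS' actual interface.

```kotlin
// Hypothetical mapping from a finger position along the display border to a
// camera zoom factor; names and ranges are illustrative assumptions only.

import kotlin.math.pow

const val EDGE_TRAVEL_MM = 120.0   // assumed usable length of the touch-sensitive border
const val MIN_ZOOM = 1.0           // wide end
const val MAX_ZOOM = 10.0          // assumed telephoto end

// Map the finger's offset along the edge (0..EDGE_TRAVEL_MM) to a zoom factor.
// A logarithmic curve keeps equal finger travel feeling like equal zoom steps.
fun zoomForEdgePosition(offsetMm: Double): Double {
    val t = (offsetMm / EDGE_TRAVEL_MM).coerceIn(0.0, 1.0)
    return MIN_ZOOM * (MAX_ZOOM / MIN_ZOOM).pow(t)
}
```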

But there's also potential for the technology beyond the smartphone. Smartwatches and other wearables provide an opportunity given that they may have limited displays or lack displays altogether. And on the other end of the device size spectrum, car manufacturers such as BMW have shown off the potential for touch-enabling surfaces in vehicles, including steering wheels and sections of the seating. Cars present a prime opportunity since drivers' eyes should be focused on the road rather than on screens. The touchscreen revolution that kicked off with the iPhone may soon extend to a far wider range of surfaces.
