A new ultra-high-resolution three-dimensional imaging system could help forensics specialists identify weapons, doctors identify cancers and manufacturers inspect products.
Researchers at the Massachusetts Institute of Technology say they created a portable imaging system that can achieve resolutions previously possible only with bulky, expensive lab equipment.
The system, called GelSight, promises to allow detailed inspection of items too large to fit under a microscope. The device is built around a slab of transparent synthetic rubber, one side of which is coated with metal-flecked paint. When the slab is pressed against the surface of an object, the paint-coated side deforms; multiple cameras mounted on the opposite side photograph the results, and a computer algorithm analyzes the images.
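The article doesn't spell out the reconstruction algorithm, but one standard way to recover surface shape from several images of the same painted surface is photometric stereo: with the surface lit from known directions, per-pixel brightness can be solved for surface orientation. The sketch below is purely illustrative, not MIT's actual code, and the function name and Lambertian-shading assumption are my own.

```python
import numpy as np

def estimate_normals(images, light_dirs):
    """Recover per-pixel surface normals from grayscale images taken
    under known, distant light directions (Lambertian assumption).

    images:     array of shape (k, h, w) -- k images of the surface
    light_dirs: array of shape (k, 3)    -- unit light-direction vectors
    Returns:    array of shape (h, w, 3) of unit surface normals.
    """
    k, h, w = images.shape
    intensities = images.reshape(k, -1)                      # (k, h*w)
    # Least-squares solve light_dirs @ G = intensities,
    # where each column of G is albedo * normal for one pixel.
    G, *_ = np.linalg.lstsq(light_dirs, intensities, rcond=None)
    G = G.T.reshape(h, w, 3)
    norms = np.linalg.norm(G, axis=2, keepdims=True)
    return G / np.clip(norms, 1e-8, None)                    # unit normals

# Toy check: a flat patch facing straight up, seen under three tilted lights.
lights = np.array([[ 0.5, 0.0, 0.866],
                   [-0.5, 0.0, 0.866],
                   [ 0.0, 0.5, 0.866]])
true_normal = np.array([0.0, 0.0, 1.0])
# Lambertian shading with unit albedo: intensity = light . normal
imgs = np.array([np.full((4, 4), l @ true_normal) for l in lights])
normals = estimate_normals(imgs, lights)
```

Integrating the recovered normal field then yields a height map, which is one route to the kind of micrometer-scale 3-D model described below.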
Sounds low-tech, but it's highly accurate. An earlier version of the system from 2009 was sensitive enough to detect the raised ink patterns on a $20 bill; the latest iteration can register physical features less than a micrometer in depth and about two micrometers across.
But here's an interesting nuance: because the system uses multiple cameras, it can produce 3-D models of an object, which aids analysis.
Producing this kind of micrometer-scale imaging usually requires a confocal microscope or a white-light interferometer, both of which are large, lab-bound instruments that take minutes to hours to produce a 3-D image.
To boot, such devices must be mounted on a vibration-isolation table. But researchers Edward Adelson and Micah Kimo Johnson built a portable prototype sensor to handle the same task.
The researchers say they're already in discussions with one major aerospace company and several manufacturers of industrial equipment about using the technology; all want to monitor the integrity of their products.
There's also a law enforcement application: the system offers a cheap, efficient way to identify the impressions that particular guns leave on the casings of spent shells. It could also help with more detailed fingerprinting and even with distinguishing cancerous growths from moles.
Not bad for a project that began as a way to create tactile sensors for robots.
This post was originally published on Smartplanet.com