Researchers have launched a trial for prosthetic limbs which have been designed to reach for objects automatically in a bid to make them act more like natural, organic limbs.
A team of biomedical engineers from the UK's Newcastle University, funded by the Engineering and Physical Sciences Research Council (EPSRC), said this month that a bionic hand has been developed which "sees" objects in its environment.
Fitted with a camera, the prosthesis has been designed to take an image of the object in front of it, assess its shape and size, and then automatically trigger the appropriate set of hand movements within milliseconds, mimicking the unthinking way we naturally use organic limbs to grasp objects.
The UK sees roughly 600 new upper-limb amputees a year, while the United States is home to around 500,000. Prosthetic limbs are currently controlled by devices which capture and translate myoelectric signals -- electrical activity of the muscles recorded from the skin surface of the stump -- but learning to control these kinds of prosthetics takes time and a lot of practice, and the resulting movements are not as fluid or natural as those of the original limb.
According to the team, the new technology is now ready for trial and will be offered to patients at the Newcastle Freeman Hospital through the UK's National Health Service (NHS).
"Using computer vision, we have developed a bionic hand which can respond automatically -- in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction," Dr. Kianoush Nazarpour, Senior Lecturer in Biomedical Engineering at Newcastle University said. "Responsiveness has been one of the main barriers to artificial limbs."
"For many amputees, the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison," Nazarpour added. "Now, for the first time in a century, we have developed an 'intuitive' hand that can react without thinking."
By using neural networks and artificial intelligence, Ph.D. student Ghazal Ghazaei and Nazarpour, co-authors of the study, taught a computer system to recognize the 'grip' needed for different objects, such as a rock or a stick.
To take into account the environment, the team took many images of the same object at different angles and heights in various lighting conditions.
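Capturing the same object under systematically varied conditions amounts to enumerating a grid of viewpoints. A minimal Python sketch of such a capture plan -- the specific angles, heights, and lighting setups are illustrative assumptions, not details from the study:

```python
from itertools import product

# Hypothetical capture conditions; the Newcastle team's actual
# angles, heights, and lighting setups are not given in the article.
ANGLES = [0, 45, 90, 135]          # degrees around the object
HEIGHTS = ["low", "eye-level"]     # camera height
LIGHTING = ["dim", "bright"]       # lighting condition

def capture_plan():
    """Enumerate every angle/height/lighting combination to photograph."""
    return list(product(ANGLES, HEIGHTS, LIGHTING))
```

Photographing each object at every combination (here, 4 x 2 x 2 = 16 views) gives the classifier examples of how appearance shifts with viewpoint and lighting.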
"The computer isn't just matching an image, it's learning to recognize objects and group them according to the grasp type the hand has to perform to successfully pick it up," Ghazaei said. "It is this which enables it to accurately assess and pick up an object which it has never seen before -- a huge step forward in the development of bionic limbs."
The prosthetic limb isn't going to cost the earth, either. A 99p ($1.28) camera is fitted to the prosthesis, and once an object is detected and scanned, the hand performs one of four movements: a palm-wrist neutral grasp, used to pick up a cup; a palm-wrist pronated grasp, the movement needed to pick up a TV remote; a "tripod" grasp, which brings two fingers and the thumb together; and a pinch, which uses the thumb and first finger.
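The selection step above can be modelled as a small dispatch: the camera's classifier returns a grasp label, and the hand executes the matching preset. A hedged Python sketch, assuming hypothetical label names rather than the team's actual interface:

```python
from enum import Enum

class Grasp(Enum):
    """The four preset movements described in the article."""
    PALM_NEUTRAL = "palm-wrist neutral"    # e.g. picking up a cup
    PALM_PRONATED = "palm-wrist pronated"  # e.g. grasping a TV remote
    TRIPOD = "tripod"                      # two fingers and thumb together
    PINCH = "pinch"                        # thumb and first finger

def select_grasp(predicted_label: str) -> Grasp:
    """Map a classifier's predicted label to one of the four presets."""
    try:
        return Grasp[predicted_label.upper()]
    except KeyError:
        return Grasp.PALM_NEUTRAL  # fall back to the most general grasp
```

Restricting the output to four presets is part of what makes the millisecond response feasible: the classifier only has to pick among a handful of movements, not plan a grip from scratch.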
The team hopes to eventually spin out the work into another prosthesis project to create a hand that can also sense pressure and temperature before sending this information back to the brain.
"It's a stepping stone towards our ultimate goal," Nazarpour says. "But importantly, it's cheap and it can be implemented soon because it doesn't require new prosthetics -- we can just adapt the ones we have."
Responsiveness is one of the most important components of a successful prosthesis, not only in helping amputees adapt to their new situation but also in providing functionality suitable for daily life.
Other companies, such as the makers of the Myo armband, are also developing new ways to improve the connection between organic and artificial by creating technology that reads electromyographic (EMG) signals generated by skeletal muscles to prompt the correct movements in response to the wearer's wishes.