Most movable prosthetic limbs
require some kind of deliberate control from the user: mechanisms receive signals from the wearer, which in turn make the movement possible.
But what if the hand itself could see and needed no further assistance?
Biomedical researchers from Newcastle University in the UK have an answer. They've designed just such a prosthetic, with an AI-powered camera
attached to the knuckles that enables it to "see". Using neural networks, the hand is trained to recognize about 500 objects. When one of them, say a mug, is placed in front of it, the camera takes a picture of the object and the hand reacts with a suitable “grasp type”, enabling it to grab what's nearby within milliseconds. For scale, that's around 10 times faster than the existing limbs on the market!
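To make that pipeline concrete, here is a minimal sketch of the final step, choosing a grasp from a classifier's output. The object labels, grasp-type names, and confidence threshold below are illustrative assumptions, not the Newcastle team's actual implementation.

```python
# Hypothetical mapping from a recognised object class to a grasp type.
# These labels and grasp categories are stand-ins for illustration only.
GRASP_FOR_OBJECT = {
    "mug": "palmar_wrist_neutral",   # wrap the fingers around the body/handle
    "biscuit": "pinch",              # thumb-and-finger pinch for flat items
    "pen": "tripod",                 # three-finger grip for thin objects
}

def choose_grasp(label: str, confidence: float, threshold: float = 0.8) -> str:
    """Pick a grasp type from a classifier's top label and its confidence.

    Falls back to 'no_action' when the classifier is unsure or the object
    is unknown, so the hand does nothing rather than mis-grasp.
    """
    if confidence < threshold or label not in GRASP_FOR_OBJECT:
        return "no_action"
    return GRASP_FOR_OBJECT[label]
```

In this sketch, `choose_grasp("mug", 0.95)` returns `"palmar_wrist_neutral"`, while a low-confidence or unrecognised object yields `"no_action"`; a fail-safe default like this matters in a system that might otherwise close the hand on the wrong thing.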
“Using computer vision, we have developed a bionic hand which can respond automatically — in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction,” said Dr. Kianoush Nazarpour, a biomedical lecturer at Newcastle University, in a press statement. “The beauty of this system is that it’s much more flexible and the hand is able to pick up novel objects — which is crucial since in everyday life people effortlessly pick up a variety of objects that they have never seen before.”
The prototype, like any other, still has its share of imperfections, of course. Its object recognition is not flawless, which could be dangerous, especially when the user wants to hold sharp objects like knives or to pick up a pet. Still, the researchers are optimistic and are looking forward to building a better version that will (hopefully) behave more like an organic limb.
“It’s a stepping stone towards our ultimate goal,” said Dr. Nazarpour. “But importantly, it’s cheap and it can be implemented soon because it doesn’t require new prosthetics — we can just adapt the ones we have.”