A breakthrough integration of neuroengineering and robotics for a smart artificial hand

Healthcare is certainly not immune to technological change: advanced technologies are swiftly transforming everything from the way caregivers look after patients, replacing pen-and-paper consultations with remote monitoring devices, to the way diseases are treated. Robotics, for its part, is changing people's lives profoundly and will continue along that path in the years ahead. Among the many areas where robotics is advancing, one of the most pressing is assistive technology for people with disabilities.

Technology is empowering people to take charge of their health, which is why people with disabilities are often early adopters of newly conceived tools. Until recently, wearable robotic devices that enhance the wearer's abilities were considered largely the stuff of science fiction. On September 11, 2019, researchers at École Polytechnique Fédérale de Lausanne (EPFL) announced a new approach to controlling robotic hands with AI. They merged two concepts, one from neuroengineering and one from robotics, and added automation for a stronger grasp. Combining these two fields for robotic hand control is a first in the history of neuroprosthetics. The neuroengineering concept uses AI to decipher intended finger movements from muscular activity, allowing individual finger control of the prosthetic hand, something never achieved before; the robotics concept, by contrast, makes the hand cling to objects and maintain contact with them for robust grasping.
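To make this division of labor concrete, here is a minimal sketch of how such a shared-control scheme might blend a user's decoded intent with an automated grasp reflex. The blend weight, slip flag, and per-finger flexion values are illustrative assumptions, not the EPFL implementation.

```python
# Shared-control sketch: blend the user's decoded finger command with an
# automated grasp reflex. All parameters here are illustrative assumptions.

def shared_control(user_command, reflex_command, slip_detected, blend=0.8):
    """Return per-finger flexion targets in [0, 1].

    Normally the user's intent dominates; when a slip is detected,
    authority shifts toward the reflex controller so the hand tightens
    its grip before the user could react.
    """
    weight = 0.2 if slip_detected else blend
    return [weight * u + (1.0 - weight) * r
            for u, r in zip(user_command, reflex_command)]

# Example: the user wants a light grasp, but the object starts to slip.
user_intent = [0.4, 0.4, 0.3, 0.3, 0.5]   # per-finger flexion decoded from muscle activity
reflex = [0.9, 0.9, 0.8, 0.8, 0.9]        # tighter grip requested by the pressure sensors
print(shared_control(user_intent, reflex, slip_detected=True))
```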

A researcher at EPFL's Learning Algorithms and Systems Laboratory stated, “When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react.” The researcher added, “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
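As a back-of-the-envelope illustration of that reflex, the sketch below compares two hypothetical fingertip pressure readings and tightens the grip when it sees a sudden drop, the signature of a slipping object. The threshold, increment, sensor period, and helper functions are all assumptions made for the example, not details from the EPFL system.

```python
import time

SLIP_THRESHOLD = 0.15   # assumed drop in normalized fingertip pressure that signals a slip
GRIP_STEP = 0.05        # assumed per-cycle tightening increment

def reflex_step(previous, current, tighten_grip):
    """One cycle of the slip reflex: a rapid pressure drop on any fingertip
    is treated as a slip signature, and the grip is tightened in response."""
    if any(p - c > SLIP_THRESHOLD for p, c in zip(previous, current)):
        tighten_grip(GRIP_STEP)
        return True
    return False

# Stub demo: pressure falls sharply between two 10 ms samples, so the
# reflex fires well inside the 400-millisecond budget quoted above.
grip = [0.5] * 5                       # current per-finger flexion
def tighten(step):
    for i in range(len(grip)):
        grip[i] = min(1.0, grip[i] + step)

before = [0.8, 0.8, 0.7, 0.7, 0.8]     # fingertip pressures at time t
time.sleep(0.01)                       # illustrative 10 ms sensor period
after = [0.6, 0.8, 0.7, 0.7, 0.8]      # index fingertip drops: object slipping
print(reflex_step(before, after, tighten), grip)
```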

The algorithm the scientists engineered for the amputee's robotic hand first learns to decode user intention and then translates that intention into finger movements of the prosthetic hand. To train the AI-based algorithm, the amputee performs a series of hand movements. Sensors detect the amputee's muscular activity, and the algorithm learns which patterns of muscular activity correspond to which hand movements. Once the algorithm has learned the user's finger movements, that information is used to control the individual fingers of the prosthetic hand. The robotic hand owes its short reaction time to the row of pressure sensors along its fingers, which let it react and stabilize a slipping object well before the brain can perceive the slip.
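The training procedure described above amounts to a supervised-learning loop: muscular-activity patterns recorded during prompted movements are paired with the intended finger positions, and a model is fit to map one to the other. The sketch below uses synthetic data and ridge regression as an illustrative stand-in; the researchers' actual features and model are not specified here.

```python
# Decoder-training sketch: fit a regression model that maps windows of
# muscle-activity features to per-finger positions. Ridge regression and
# the data shapes are illustrative assumptions, not the paper's method.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Stand-in training data: 8 muscle-activity channels -> 5 finger positions,
# recorded over 1000 time windows of prompted hand movements.
emg_features = rng.normal(size=(1000, 8))      # windowed muscle-activity features
finger_targets = rng.uniform(size=(1000, 5))   # prompted per-finger positions

decoder = Ridge(alpha=1.0).fit(emg_features, finger_targets)

# At run time, each new window of muscle activity is decoded into
# per-finger commands for the prosthetic hand.
new_window = rng.normal(size=(1, 8))
finger_command = decoder.predict(new_window)[0]
print(finger_command)   # five values, one per finger
```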

In the paper's abstract, the researchers conclude: “Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices.”