Technological innovation has a way of leaving us in awe. We have worked hard to design robots that can take over much of our work, but what happens when robots and humans work together? Researchers from EPFL (Ecole Polytechnique Federale de Lausanne) have developed an innovative prosthetic hand for amputees that shares control between human and robot. With it, the wearer controls the fingers, while the robotic hand takes over the manipulation of objects.
Here the scientists have combined two concepts from two different domains. The first deciphers the wearer's intended finger movements from their muscular activity and translates them into individual finger control. The second comes from robotics and handles object manipulation, such as grasping an object and maintaining contact with it for a stable grip.
According to the head of EPFL's Learning Algorithms and Systems Laboratory, when an object held in the hand starts to slip, a person has only a couple of seconds to react, whereas the robotic hand can respond within 400 milliseconds. Pressure sensors along the fingers detect the slip and stabilize the object before the brain has even perceived that it is slipping.
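To make the idea concrete, here is a minimal sketch of such a slip-detection reflex, assuming a simple control loop that polls finger pressure sensors and tightens the grip when pressure drops sharply. The sensor interface, threshold, and loop rate are illustrative assumptions, not EPFL's actual system.

```python
import time

# Illustrative values, not from the EPFL prototype:
SLIP_DROP_THRESHOLD = 0.15   # fractional pressure drop that signals a slip
POLL_INTERVAL_S = 0.01       # 10 ms loop -> response well inside 400 ms

def detect_slip(prev_pressure, curr_pressure):
    """Return True if finger pressure fell fast enough to indicate slipping."""
    if prev_pressure <= 0:
        return False
    drop = (prev_pressure - curr_pressure) / prev_pressure
    return drop > SLIP_DROP_THRESHOLD

def control_loop(read_pressure, tighten_grip):
    """Poll a pressure sensor and stabilize the grasp on a detected slip.

    read_pressure and tighten_grip are hypothetical hardware callbacks.
    """
    prev = read_pressure()
    while True:
        curr = read_pressure()
        if detect_slip(prev, curr):
            tighten_grip()   # react before the wearer can perceive the slip
        prev = curr
        time.sleep(POLL_INTERVAL_S)
```

The key design point is that the reflex runs on the robot's own fast sensing loop, independent of the slower human reaction path.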
Artificial intelligence translates and augments the user's intended movement, especially when muscular activity alone is not sufficient to complete a task such as gripping a bottle. If the user's grasp weakens mid-task, they have just milliseconds to readjust their grip before the object falls. This is where AI comes into the picture: it automatically interprets the user's muscular signals, detects when they weaken, and ensures the robotic hand does not let the object drop, even before the human brain perceives that the bottle is about to slip.
The scientists also use a machine learning algorithm that extracts meaningful signals from the muscles' activity and converts them into movements.
Before using the artificial arm, the user trains the model through a series of hand movements. Sensors placed on the wearer's residual limb collect data on muscular activity, and the algorithm learns to decipher the user's intention in real time by studying the patterns in this data. Once the system is trained, users can control the individual fingers of the artificial hand more accurately, giving them more dexterity than a conventional artificial hand would.
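The training step described above can be sketched as a standard supervised-learning problem: windowed muscle signals are reduced to simple amplitude features and mapped to finger positions with a linear decoder. Everything here is a hedged toy version on synthetic data; the channel count, features, and decoder are assumptions, not the published method.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms_features(emg_windows):
    """Root-mean-square amplitude per channel for each signal window."""
    return np.sqrt(np.mean(emg_windows ** 2, axis=-1))

# Synthetic calibration session (all sizes are illustrative):
# 200 windows, 8 muscle-signal channels, 50 samples per window.
emg = rng.normal(size=(200, 8, 50))
X = rms_features(emg)                # (200, 8) feature matrix
true_W = rng.normal(size=(8, 5))     # hidden mapping used to fake targets
y = X @ true_W                       # (200, 5) target finger positions

# Fit a ridge-regression decoder: W = (X^T X + lam*I)^-1 X^T y
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(8), X.T @ y)

def decode_fingers(window):
    """Map one new signal window (8 channels x 50 samples) to 5 finger estimates."""
    return rms_features(window[None]) @ W

pred = decode_fingers(rng.normal(size=(8, 50)))
```

In a real system the targets would come from the guided hand movements the user performs during calibration, and the trained decoder would then run on the live sensor stream.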
The artificial arm was successfully tested on three amputees and seven able-bodied subjects, and it is yet to be commercialized. We wish the team the best in bringing it to market and helping amputees carry out their tasks efficiently. What are your views on the artificial arm? We would love to hear them.