Prosthetics are an essential part of life for people with amputated limbs. They make life easier while also helping individuals carry out regular day-to-day tasks without needing constant assistance. Recently, a researcher has developed an intelligent prosthetic arm that can normalize the lives of disabled people further.
The newly introduced arm moves naturally and autonomously, guided by the wearer’s limb movements and body posture. The prosthetic could be a significant contribution to the prosthetic arm market, as it would relieve wearers of having to constantly monitor the arm and check whether it is positioned correctly.
The researcher initially considered obtaining body movement data through a brain-computer interface. However, that option is unreliable and would require electrodes to be implanted in the user’s brain. Instead, the system was trained through machine learning on motion data captured from the body, and it is this movement data on which the arm’s movements are based.
To accomplish this, a motion capture suit was built from 3D-printed parts. The suit gathers motion data from across the body: head, legs, and arms. It measures joint movements with magnetic encoders fitted to rotating parts, and it tracks head and limb position with the help of a headband carrying MPU-6050 inertial measurement units, read by Teensy LC boards.
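As a rough illustration of the suit's joint sensing (not the researcher's actual firmware), a magnetic rotary encoder reports a raw count that can be converted to a joint angle and packed into a pose frame. The 12-bit resolution assumed below is typical of hobby magnetic encoders but is not stated in the article:

```python
def encoder_to_degrees(raw_count: int, counts_per_rev: int = 4096) -> float:
    """Convert a raw magnetic-encoder count to an angle in degrees.

    Assumes a 12-bit encoder (4096 counts per revolution); the suit's
    actual encoder resolution is an assumption here. The modulo handles
    counts that wrap past one full revolution.
    """
    return (raw_count % counts_per_rev) * 360.0 / counts_per_rev

def pose_vector(encoder_counts: list[int]) -> list[float]:
    """Pack one frame of raw joint readings into a flat pose vector (degrees)."""
    return [encoder_to_degrees(c) for c in encoder_counts]
```

One such pose vector per sample, alongside the IMU readings, would form the movement data that the learning stage consumes.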
The data is then fed into a machine learning model running on the suit’s Raspberry Pi Zero using AOgmaNeo, a lightweight C++ software library built to run on low-power devices such as microcontrollers. AOgmaNeo is a machine learning system that learns to predict its own incoming data streams. Any one stream can be removed, and after training the software fills in the missing stream with its learned prediction. In the present study, the team removed the right-arm data and used the software’s predicted output to move the prosthetic arm. They believe the technique could be applied to any limb.
The team added that once the model is trained, the Raspberry Pi Zero can be put into playback mode. It then mirrors the actions of the wearer’s other arm, acting on the positions and movements of the rest of the body as learned in training, and in this way intelligently controls the backpack-mounted prosthetic arm.
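In playback mode, each predicted joint angle ultimately has to become an actuator command. A minimal sketch, assuming standard hobby-servo conventions (a 1000-2000 microsecond pulse over a 180-degree span, details not given in the article):

```python
def angle_to_pulse_us(angle_deg: float,
                      min_us: int = 1000, max_us: int = 2000,
                      span_deg: float = 180.0) -> int:
    """Map a predicted joint angle (0..span_deg) to a servo pulse width
    in microseconds. Angles outside the range are clamped so a bad
    prediction cannot drive the actuator past its limits."""
    angle = max(0.0, min(span_deg, angle_deg))
    return round(min_us + (max_us - min_us) * angle / span_deg)
```

A playback loop would call this once per joint per frame, feeding the pulse widths to the arm's servo controller.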
The researcher demonstrated this by walking on the spot. The prosthetic arm swung autonomously, opposite to the intact arm, as the wearer raised each leg to step forward, and it hung still when the wearer stood still.
Much remains to be done before the arm is ready for commercial use, but its autonomous abilities already mark a significant step forward for the prosthetic limbs sector.