New implant adds a sense of touch to a robotic arm

A team of researchers has unveiled a robotic arm that is controlled through brain implants and sends touch signals back to the user.

The researchers explained that when people pick up an object, vision comes first, and then other senses take over. Proprioception tells us where parts of our body are even when they are out of sight. At the same time, the sense of touch tells us when we have made contact with the object, and the feeling of pressure indicates how tightly we are gripping it. Vision becomes secondary in this process.

Users of robotic manipulators, by contrast, have to tell whether they have grasped an object just by looking at it. This requires extensive training and full attention, so adding other senses would have obvious benefits.

Scientists' first attempts to provide touch and pressure feedback involved transferring the sensations to an area of the skin. Such systems required extensive training before users could translate those sensations into information about the pressure exerted by the manipulator's fingers. Since then, however, researchers have mapped the areas of the brain that process signals from the sensory nerve cells of the hand. For the new study, the team implanted two electrode arrays in the part of the brain that processes information from the skin. Activating the electrodes produced the sensation that something was touching the palm and fingers.

A study participant, paralyzed below the neck, had been controlling a robotic arm for about two years using implants in the motor-control area of his brain. He could use the arm even though he felt nothing through it. In the new experiments, the research team alternated trials in which the arm provided tactile feedback with trials in which that system was disabled. Most of the trials involved grasping objects of various shapes, moving them, and then releasing them.

Across the individual tests, a consistent pattern emerged: having a sense of touch dramatically improved performance. The average time to complete a pick-move-release sequence decreased in every case, and in about half of the cases the difference was statistically significant. In the time it took the participant to complete nine tasks with the sensory system turned off, he completed more than a dozen with it active.

Although every phase of the task improved, the biggest gain came in grasping the object. The time between the participant touching the object with the hand and lifting it off the table fell by two-thirds when sensory feedback was turned on. With the system off, the participant spent more time positioning the arm to secure a stable grip before moving on.
