SYDNEY: Technology has come one step closer to helping disabled people perform more natural movements using a robotic limb, with the development of a new type of ‘brain–machine interface’ by U.S. scientists.
A brain–machine interface (BMI) is a direct communication pathway between the brain and an external device, such as a robotic arm. This new approach, which used computer algorithms to decode electrical impulses in the brains of rhesus monkeys, is the first of its kind that could enable users to plan and perform a series of sequential movements naturally.
“This is different from BMIs used thus far, which require users to plan and execute each element of the sequential movement one at a time. In such BMIs, for example, the user cannot plan the second letter of a word before typing the first letter,” explained Ziv Williams, professor of neurosurgery at Massachusetts General Hospital in Boston, USA, and co-author of the study, published in Nature Neuroscience.
“Development of this new BMI implies that it may be possible, in principle, for patients to plan and perform sequential movements as they would do naturally, for example typing the full planned series of letters in a word,” he said.
New functional brain structure revealed
Williams and colleagues recorded the electrical impulses from the brains of rhesus monkeys trained to remember a sequence of two locations on a computer screen and, after a short pause, move the cursor to those locations.
They found that the two movements could be decoded, using computer algorithms, from separate, small groups of neurones in the premotor cortex – a part of the brain involved in planning and executing limb movements.
“Our results reveal a new functional structure within the premotor cortex that allowed for accurate and concurrent decoding of two planned motor targets across multiple spatial locations,” the authors wrote in the paper.
“Only a small number of neurones was sufficient to accurately predict the location of both targets, making the decoding of such information highly robust.”
The two distinct subpopulations of neurones allowed the two planned targets of the movement to be held simultaneously, without degradation, in working memory – the brain system that provides temporary storage and real-time processing of the information necessary to perform complex tasks.
Exploiting these mechanisms, the team then developed a BMI that could not only predict both of the intended movements simultaneously, but also drive the movements in real time alongside the monkey’s motor response.
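The decoding idea can be illustrated with a toy sketch. This is not the authors' actual pipeline – all numbers, tuning curves, and the nearest-centroid classifier here are illustrative assumptions – but it shows the principle the paper describes: two small, separate groups of neurones each carry information about one planned target, so both targets can be read out at the same time from the same recording.

```python
import numpy as np
from numpy.random import default_rng

rng = default_rng(0)

N_NEURONS = 20   # neurons per subpopulation (illustrative)
N_TARGETS = 4    # possible target locations on the screen (illustrative)
N_TRIALS = 200

# Each subpopulation's mean firing pattern shifts with the identity of
# "its" planned target; the other target does not affect it (the paper's
# key point: the two targets are held in separate groups of neurones).
tuning_a = rng.normal(0, 1, (N_TARGETS, N_NEURONS))
tuning_b = rng.normal(0, 1, (N_TARGETS, N_NEURONS))

def simulate(targets_a, targets_b):
    """Noisy firing rates for the two subpopulations on each trial."""
    noise_a = rng.normal(0, 0.5, (len(targets_a), N_NEURONS))
    noise_b = rng.normal(0, 0.5, (len(targets_b), N_NEURONS))
    return tuning_a[targets_a] + noise_a, tuning_b[targets_b] + noise_b

def fit_centroids(rates, targets):
    """Mean firing pattern per target: a nearest-centroid decoder."""
    return np.stack([rates[targets == t].mean(axis=0)
                     for t in range(N_TARGETS)])

def decode(rates, centroids):
    """Assign each trial to the target with the nearest centroid."""
    dists = np.linalg.norm(rates[:, None, :] - centroids[None], axis=2)
    return dists.argmin(axis=1)

# Training trials with random pairs of planned targets
tr_a = rng.integers(0, N_TARGETS, N_TRIALS)
tr_b = rng.integers(0, N_TARGETS, N_TRIALS)
ra, rb = simulate(tr_a, tr_b)
ca, cb = fit_centroids(ra, tr_a), fit_centroids(rb, tr_b)

# Held-out trials: both planned targets are decoded concurrently
te_a = rng.integers(0, N_TARGETS, 100)
te_b = rng.integers(0, N_TARGETS, 100)
xa, xb = simulate(te_a, te_b)
acc_a = (decode(xa, ca) == te_a).mean()
acc_b = (decode(xb, cb) == te_b).mean()
print(f"target 1 accuracy: {acc_a:.2f}, target 2 accuracy: {acc_b:.2f}")
```

Because the two targets live in separate subpopulations, decoding one places no ceiling on decoding the other – which is what lets a BMI of this kind plan a whole sequence rather than one movement at a time.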
Direct recordings of brain activity
According to Geoffrey Goodhill, a computational neuroscientist at The University of Queensland, Australia, the ability to use direct measurements of brain activity to drive movements of a prosthetic limb is a critical aid to people who are paralysed.
“This paper takes an important step beyond previous work by showing that it’s possible to decode a sequence of intended movements from direct recordings of neural activity, not just a single movement,” said Goodhill, who was not involved in the study.
“Overall it’s a great example of how experiments on animals, combined with sophisticated mathematical modelling, can yield real insight into a problem of significant medical relevance,” he said.
“Plenty more work to be done”
Williams said the new design could also lead to the development of BMIs that can analyse intended movements before executing them – thereby enabling a robotic limb to perform the movements more effectively.
“For example, it may also be able to type the word faster or remove spelling mistakes before typing,” he said.
Williams added that the next step would be to further examine how the new BMI can be used to perform more fluid and accurate sequential movements using a robotic limb, and to test the design on human patients.
According to Goodhill, however, “there is still plenty more work to be done”.
“The decoder only correctly predicted the intended movement 50–80% of the time. This was much better than chance, but certainly not good enough to, for instance, type fluently on a keyboard,” he said.