Emerging Technologies #11
Monkeys Control Avatar’s Arms Through Brain-Machine Interface

Table of Contents
Section 1 – Executive Summary
Section 2 – Related News & Articles
Article 1. Monkeys Use Minds to Control Avatar Arms
Article 2. Monkeys Control Coordinated Arms Using Brain-Machine Interface
Article 3. Monkeys Control Avatar’s Arms Through Brain-Machine Interface
Article 4. Brain-machine interface lets monkeys control two virtual arms

Section 1 – Executive Summary

Monkeys Control Avatar’s Arms Through Brain-Machine Interface

In a study led by Duke researchers, monkeys have learned to control the movement of both arms on an avatar using just their brain activity. To enable the monkeys to control two virtual arms, researchers recorded nearly 500 neurons from multiple areas in both cerebral hemispheres of the animals' brains, the largest number of neurons recorded and reported to date. The study suggests that very large neuronal ensembles—not single neurons—define the underlying physiological unit of normal motor functions. Small neuronal samples of the cortex may be insufficient to control complex motor behaviors using a brain-machine interface.

Millions of people worldwide suffer from sensory and motor deficits caused by spinal cord injuries. Researchers are working to develop tools to help restore their mobility and sense of touch by connecting their brains with assistive devices.
The brain-machine interface approach, pioneered at the Duke University Center for Neuroengineering in the early 2000s, holds promise for reaching this goal. However, until now brain-machine interfaces could only control a single prosthetic limb. Nicolelis may have shown that the monkeys can learn to use these avatar arms to complete one simple task, but it's not clear that the same type of training will work for the more complex activities that humans need to perform, Contreras-Vidal cautions. "This is a first step."

Section 2 – Related News & Articles

Article 1. Monkeys Use Minds to Control Avatar Arms
Source: http://news.sciencemag.org/brain-behavior/2013/11/monkeys-use-minds-control-avatar-arms (06/11/2013)
Video URL: http://www.youtube.com/watch?v=ReNLwUcuMlY

Most of us don’t think twice when we extend our arms to hug a friend or push a shopping cart—our limbs work together seamlessly to follow our mental commands. For researchers designing brain-controlled prosthetic limbs for people, however, this coordinated arm movement is a daunting technical challenge. A new study showing that monkeys can move two virtual limbs with only their brain activity is a major step toward achieving that goal, scientists say.

The brain controls movement by sending electrical signals to our muscles through nerve cells. When limb-connecting nerve cells are damaged or a limb is amputated, the brain is still able to produce those motion-inducing signals, but the limb can't receive them or simply doesn’t exist. In recent years, scientists have worked to create devices called brain-machine interfaces (BMIs) that can pick up these interrupted electrical signals and control the movements of a computer cursor or a real or virtual prosthetic.

The female monkey, called monkey C, first learned how to get the juice by moving joysticks with her real arms and hands—as she manipulated the joysticks, the right and left avatar arms did what she wished.
After practicing this during regular 20- to 40-minute sessions over the course of a year, she was strapped into a padded chair so that she couldn’t move her own arms or hands, and trained to control the avatar arms just by thinking. After weeks of practice, she was able to complete the task more than 75% of the time, the scientists report today in Science Translational Medicine.

Because a paralyzed person or amputee can't necessarily practice a task using joysticks, the next step was to determine whether observation alone could teach the BMI. Monkey M, a male, wasn't allowed to use the joysticks or move his arms at any point in the experiment—he simply observed the task being performed. It took longer for him to learn, but monkey M also learned to control the virtual arms using only his thoughts.

Both animals’ performances improved over time, and the researchers noticed that their neuronal firing patterns changed as this happened, suggesting that their brains were adapting to the BMI devices. This could be because the monkeys came to consider the virtual arms as part of their own bodies, Nicolelis suggests. “The animals literally incorporate the avatar as if the avatar was them.”

The basic technology that Nicolelis and colleagues used to extract instructions for movement from the mishmash of monkey brain signals isn't new, says Jose Contreras-Vidal, a biomedical engineer at the University of Houston in Texas. The real advance of the study, he says, is that the team was able to figure out which neurons they needed to record to control two arms working together. Although one might assume that it would be possible to simply combine neural activity from two arms acting independently, the study shows that cells act differently when they are coordinating the movements of two limbs than they do when separately instructing one limb or the other, he says.
This is the first study to extract and use that complex information to coordinate arm movements in real time, Contreras-Vidal says.

Although he agrees that the new study is strong, Andrew Schwartz, a neurobiologist at the University of Pittsburgh in Pennsylvania, thinks scientists can do better. Even though the monkeys’ task was quite simple, they had only about a 45% success rate overall, he notes. “I’m looking forward to higher performance and success rates and more realistic natural movements.”

Nicolelis may have shown that the monkeys can learn to use these avatar arms to complete one simple task, but it's not clear that the same type of training will work for the more complex activities that humans need to perform, Contreras-Vidal cautions. "This is a first step."

Article 2. Monkeys Control Coordinated Arms Using Brain-Machine Interface
Source: http://singularityhub.com/2013/11/14/monkeys-control-coordinated-arms-using-brain-machine-interface/ (11/11/2013)
http://3278as3udzze1hdk0f2th5nf18c1.wpengine.netdna-cdn.com/wp-content/uploads/2013/11/11.6.13_STM_Ifft.pdf

The term “brain-machine interface” seems to invoke fear and discomfort in the minds of many, and gleeful enthusiasm in the minds of technophiles. Both of these reactions often overlook one of the major reasons for developing such interfaces: to enable amputees and those who suffer from paralysis to regain self-sufficiency by mentally controlling a mobile robot or robotic getup.

Prosthetic limbs controlled by wearers’ thoughts are already available, though not commercially. They’re great for people who lack use of a single limb, usually just the part below the knee or elbow. But what about quadriplegics, for example?

Duke University researchers Miguel Nicolelis and Peter Ifft managed to create a two-handed brain-machine interface using monkeys in a study recently published in Science Translational Medicine. Nicolelis has previously done similar work with a single arm.
Attempts to empower two robotic limbs using a brain-machine interface have met with very limited success. “Bimanual movements in our daily activities — from typing on a keyboard to opening a can — are critically important. Future brain-machine interfaces aimed at restoring mobility in humans will have to incorporate multiple limbs to greatly benefit severely paralyzed patients,” Nicolelis said in a news release.

To observe how the brain produces coordinated movement, Nicolelis and Ifft trained monkeys to use the hands of a realistic on-screen avatar, first with a joystick and, eventually, with just their minds. (An algorithm cleaned up the signals, looking for just the movement commands.) Using surgically implanted arrays, the researchers achieved a two-handed BMI by recording the activity of a wider selection of neurons than had been used in previous efforts — nearly 500 from multiple areas in both sides of the animals’ brains. They then sent the movement impulses on to power the avatar arms and captured them for subsequent analysis.

Article 3. Monkeys Control Avatar’s Arms Through Brain-Machine Interface
Source: https://www.medgadget.com/2013/11/monkeys-control-avatar-arms-through-brain-machine-interface.html (13/11/2013)

Following unfortunate accidents that leave perfectly healthy people severely paralyzed, many hear the diagnosis of “you will never walk again.” Though realistically this still holds true for many, the latest findings coming out of research labs point to the fact that practical therapies that may restore movement are no longer only the stuff of science fiction.

Researchers at Duke University have reported in the journal Science Translational Medicine that they were able to train monkeys to control two virtual limbs through a brain-computer interface (BCI).
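Article 2 notes that an algorithm "cleaned up the signals, looking for just the movement commands." As a very rough sketch of what such a decoding step can look like, the snippet below fits a simple linear readout from neural firing rates to arm velocities by least squares. Everything here is illustrative and synthetic (the data, the dimensions, the noise level, and the linear model itself are assumptions for the sketch); the decoder actually used in the study was more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 500 recorded neurons, 2000 time bins, and
# 4 decoded outputs (x/y velocity for each of the two avatar arms).
n_neurons, n_bins, n_outputs = 500, 2000, 4

# Simulate ground-truth arm velocities and neurons whose firing rates
# are noisy linear functions of them (a stand-in for real recordings).
true_weights = rng.normal(size=(n_neurons, n_outputs))
velocity = rng.normal(size=(n_bins, n_outputs))
rates = velocity @ true_weights.T + 0.5 * rng.normal(size=(n_bins, n_neurons))

# "Training" phase (analogous to the joystick sessions): fit a linear
# readout mapping neural activity to velocity by least squares.
readout, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# "Brain-control" phase: decode arm velocities from activity alone.
decoded = rates @ readout

# On this synthetic data, decoded and true velocities correlate strongly.
corr = np.corrcoef(decoded[:, 0], velocity[:, 0])[0, 1]
print(f"correlation: {corr:.2f}")
```

The point of the sketch is only the pipeline shape: record many units, fit a readout during overt movement, then reuse that readout when the limbs no longer move.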
The rhesus monkeys initially used joysticks to become comfortable moving the avatar’s arms, but later the brain-computer interfaces implanted in their brains were activated to allow the monkeys to drive the avatar using only their minds. Two years ago the same team was able to train monkeys to control one arm, but the complexity of controlling two arms required the development of a new algorithm for reading and filtering the signals. Moreover, the monkey brains themselves showed great adaptation to the training with the BCI, building new neural pathways to help improve how the monkeys moved the virtual arms.

As the authors of the study note in the abstract, “These findings should help in the design of more sophisticated BMIs capable of enabling bimanual motor control in human patients.”

Article 4. Brain-machine interface lets monkeys control two virtual arms
Source: http://medicalxpress.com/news/2013-11-brain-machine-interface-monkeys-virtual-arms.html (06/11/2013)

In a study led by Duke researchers, monkeys have learned to control the movement of both arms on an avatar using just their brain activity. The findings, published Nov. 6, 2013, in the journal Science Translational Medicine, advance efforts to develop bilateral movement in brain-controlled prosthetic devices for severely paralyzed patients. To enable the monkeys to control two virtual arms, researchers recorded nearly 500 neurons from multiple areas in both cerebral hemispheres of the animals' brains, the largest number of neurons recorded and reported to date.

Millions of people worldwide suffer from sensory and motor deficits caused by spinal cord injuries. Researchers are working to develop tools to help restore their mobility and sense of touch by connecting their brains with assistive devices. The brain-machine interface approach, pioneered at the Duke University Center for Neuroengineering in the early 2000s, holds promise for reaching this goal.
However, until now brain-machine interfaces could only control a single prosthetic limb.

"Bimanual movements in our daily activities—from typing on a keyboard to opening a can—are critically important," said senior author Miguel Nicolelis, M.D., Ph.D., professor of neurobiology at Duke University School of Medicine. "Future brain-machine interfaces aimed at restoring mobility in humans will have to incorporate multiple limbs to greatly benefit severely paralyzed patients."

The study suggests that very large neuronal ensembles—not single neurons—define the underlying physiological unit of normal motor functions. Small neuronal samples of the cortex may be insufficient to control complex motor behaviors using a brain-machine interface.

"When we looked at the properties of individual neurons, or of whole populations of cortical cells, we noticed that simply summing up the neuronal activity correlated to movements of the right and left arms did not allow us to predict what the same individual neurons or neuronal populations would do when both arms were engaged together in a bimanual task," Nicolelis said. "This finding points to an emergent brain property—a non-linear summation—for when both hands are engaged at once."
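Nicolelis's closing point, that bimanual activity cannot be predicted by summing the two unimanual patterns, can be made concrete with a toy example. The numbers below are invented purely for illustration (they are not the study's data): a hypothetical neuron responds to each arm alone, but its response is suppressed when both arms move together, so a linear sum over-predicts the bimanual rate.

```python
def firing_rate(left_active: bool, right_active: bool) -> float:
    """Hypothetical firing rate (spikes/s) of one cortical neuron."""
    rate = 5.0                       # baseline activity
    if left_active:
        rate += 12.0                 # response to left-arm movement alone
    if right_active:
        rate += 9.0                  # response to right-arm movement alone
    if left_active and right_active:
        rate -= 8.0                  # suppression during bimanual movement
    return rate

# Linear prediction: add the two unimanual responses (above baseline)
# back onto the baseline.
linear_prediction = (firing_rate(True, False)
                     + firing_rate(False, True)
                     - firing_rate(False, False))
observed = firing_rate(True, True)

print(linear_prediction, observed)   # 26.0 18.0: the sum over-predicts
```

A decoder fit only on single-arm trials would therefore misread this neuron during two-arm movement, which is why, as the study reports, bimanual control had to be decoded from activity recorded during bimanual behavior.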