The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On Fri, 7 Mar, 12:04 AM UTC
4 Sources
[1]
UCSF researchers enable paralyzed man to control robotic arm with brain signals
University of California - San Francisco, Mar 7 2025

Researchers at UC San Francisco have enabled a man who is paralyzed to control a robotic arm through a device that relays signals from his brain to a computer. He was able to grasp, move and drop objects just by imagining himself performing the actions. The device, known as a brain-computer interface (BCI), worked for a record 7 months without needing to be adjusted. Until now, such devices have only worked for a day or two.

The BCI relies on an AI model that can adjust to the small changes that take place in the brain as a person repeats a movement - or in this case, an imagined movement - and learns to do it in a more refined way.

"This blending of learning between humans and AI is the next phase for these brain-computer interfaces. It's what we need to achieve sophisticated, lifelike function," said Karunesh Ganguly, MD, PhD, neurologist, professor of neurology and member of the UCSF Weill Institute for Neurosciences.

The study, which was funded by the National Institutes of Health, appears March 6 in Cell. The key was the discovery of how activity shifts in the brain day to day as a study participant repeatedly imagined making specific movements. Once the AI was programmed to account for those shifts, it worked for months at a time.

Location, location, location

Ganguly studied how patterns of brain activity in animals represent specific movements and saw that these representations changed day-to-day as the animal learned. He suspected the same thing was happening in humans, and that was why their BCIs so quickly lost the ability to recognize these patterns. Ganguly and neurology researcher Nikhilesh Natraj, PhD, worked with a study participant who had been paralyzed by a stroke years earlier. He could not speak or move. He had tiny sensors implanted on the surface of his brain that could pick up brain activity when he imagined moving.
To see whether his brain patterns changed over time, Ganguly asked the participant to imagine moving different parts of his body, like his hands, feet or head. Although he couldn't actually move, the participant's brain could still produce the signals for a movement when he imagined himself doing it. The BCI recorded the brain's representations of these movements through the sensors on his brain. Ganguly's team found that the shape of representations in the brain stayed the same, but their locations shifted slightly from day to day.

From virtual to reality

Ganguly then asked the participant to imagine himself making simple movements with his fingers, hands or thumbs over the course of two weeks, while the sensors recorded his brain activity to train the AI. Then, the participant tried to control a robotic arm and hand. But the movements still weren't very precise. So, Ganguly had the participant practice on a virtual robot arm that gave him feedback on the accuracy of his visualizations. Eventually, he got the virtual arm to do what he wanted it to do.

Once the participant began practicing with the real robot arm, it only took a few practice sessions for him to transfer his skills to the real world. He could make the robotic arm pick up blocks, turn them and move them to new locations. He was even able to open a cabinet, take out a cup and hold it up to a water dispenser. Months later, the participant was still able to control the robotic arm after a 15-minute "tune-up" to adjust for how his movement representations had drifted since he had begun using the device.

Ganguly is now refining the AI models to make the robotic arm move faster and more smoothly, and planning to test the BCI in a home environment. For people with paralysis, the ability to feed themselves or get a drink of water would be life changing. Ganguly thinks this is within reach. "I'm very confident that we've learned how to build the system now, and that we can make this work," he said.
University of California - San Francisco Journal reference: Natraj, N., et al. (2025). Sampling representational plasticity of simple imagined movements across days enables long-term neuroprosthetic control. Cell. doi.org/10.1016/j.cell.2025.02.001.
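The core finding above, that the shape of the movement representations stays fixed while their location drifts from day to day, suggests why a short realignment step can restore a decoder. Below is a minimal toy sketch of that idea in Python. It is not the study's actual pipeline: the 3-D latent space, the rotation-plus-offset drift model, and the Procrustes alignment step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical low-dimensional neural features for 4 imagined movements
# on a reference day: 4 class centroids in a 3-D latent space.
day0_centroids = rng.normal(size=(4, 3))

# On a later day the *shape* of the representation is preserved but its
# location/orientation has drifted: model drift as a rotation plus offset.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
offset = np.array([0.5, -0.2, 0.1])
dayN_centroids = day0_centroids @ R.T + offset

def procrustes_align(source, target):
    """Return a function mapping source points onto target via a rigid
    (rotation + translation) transform, estimated by the Kabsch/SVD method."""
    mu_s, mu_t = source.mean(0), target.mean(0)
    # SVD of the cross-covariance between centered point sets.
    U, _, Vt = np.linalg.svd((source - mu_s).T @ (target - mu_t))
    # Reflection correction so the result is a proper rotation.
    D = np.eye(source.shape[1])
    D[-1, -1] = np.sign(np.linalg.det(U @ Vt))
    rot = U @ D @ Vt
    return lambda x: (x - mu_s) @ rot + mu_t

# A brief calibration recovers the transform; the drifted centroids land
# back on the day-0 ones, so a decoder fit on day 0 keeps working.
align = procrustes_align(dayN_centroids, day0_centroids)
realigned = align(dayN_centroids)
```

Because the alignment only has to estimate a rigid transform, a short daily calibration, analogous to the 15-minute "tune-up" described above, can in principle supply enough data to refit it.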
[2]
Paralyzed man moves robotic arm with his thoughts
[3]
Paralyzed man moves robot arm for record 7 months with new brain chip
Most previously available BCIs lasted at most a day or two before performance degraded; this one operated for a full seven months without major recalibration. The biggest advancement comes from the AI model this BCI is built around: it adapts to natural shifts in brain activity over time, allowing the participant to refine his imagined movements.

"This blending of learning between humans and AI is the next phase for these brain-computer interfaces," said neurologist Karunesh Ganguly. "It's what we need to achieve sophisticated, lifelike function."

The study participant, who suffered paralysis from a stroke, was implanted with small sensors on the surface of the brain. When the patient pictured moving his limbs or head, these sensors captured the brain's activity. Over time, researchers found that while the brain's movement patterns remained consistent in shape, their exact locations shifted slightly from day to day. This explains why previous BCIs failed so quickly.

To solve this problem, the research team developed an AI model that adjusted for those day-to-day changes. For two weeks, the subject visualized simple movements while the AI learned from his brain signals. When he first attempted to control a robotic arm, his movements were imprecise. To improve accuracy, he practiced using a virtual robotic arm that provided real-time feedback.
[4]
AI-Powered Brain Implant Lets Paralyzed Man Control Robotic Arm - Neuroscience News
Summary: A new brain-computer interface (BCI) has enabled a paralyzed man to control a robotic arm by simply imagining movements. Unlike previous BCIs, which lasted only a few days, this AI-enhanced device worked reliably for seven months. The AI model adapts to natural shifts in brain activity, maintaining accuracy over time. After training with a virtual arm, the participant successfully grasped, moved, and manipulated real-world objects. The technology represents a major step toward restoring movement for people with paralysis. Researchers are now refining the system for smoother operation and testing its use in home settings.

Authors: Other authors of this study include Sarah Seko and Adelyn Tu-Chan of UCSF and Reza Abiri of the University of Rhode Island.
Funding: This work was supported by the National Institutes of Health (1 DP2 HD087955) and the UCSF Weill Institute for Neurosciences.
Author: Robin Marks
Source: UCSF
Contact: Robin Marks - UCSF
Original Research: Open access. "Sampling representational plasticity of simple imagined movements across days enables long-term neuroprosthetic control" by Karunesh Ganguly et al. Cell

Abstract

The nervous system needs to balance the stability of neural representations with plasticity. It is unclear what the representational stability of simple well-rehearsed actions is, particularly in humans, and their adaptability to new contexts. Using an electrocorticography brain-computer interface (BCI) in tetraplegic participants, we found that the low-dimensional manifold and relative representational distances for a repertoire of simple imagined movements were remarkably stable. The manifold's absolute location, however, demonstrated constrained day-to-day drift.
Strikingly, neural statistics, especially variance, could be flexibly regulated to increase representational distances during BCI control without somatotopic changes. Discernability strengthened with practice and was BCI-specific, demonstrating contextual specificity. Sampling representational plasticity and drift across days subsequently uncovered a meta-representational structure with generalizable decision boundaries for the repertoire; this allowed long-term neuroprosthetic control of a robotic arm and hand for reaching and grasping. Our study offers insights into mesoscale representational statistics that also enable long-term complex neuroprosthetic control.
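The abstract's central trick, sampling how representations drift across days and then fitting decision boundaries that hold up on unseen days, can be pictured with a toy simulation. This is only an illustrative sketch, not the study's decoder: the 2-D latent space, the Gaussian drift model, and the nearest-centroid classifier are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 3 imagined movements, each a cluster in a 2-D
# latent space; each day the whole representation drifts by a shared offset.
base = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])

def simulate_day(rng, drift_scale=1.0, n_per_class=30):
    offset = rng.normal(scale=drift_scale, size=2)  # day-specific drift
    X = np.concatenate([c + offset + rng.normal(scale=0.3, size=(n_per_class, 2))
                        for c in base])
    y = np.repeat(np.arange(3), n_per_class)
    return X, y

def fit_centroids(X, y):
    return np.stack([X[y == k].mean(0) for k in range(3)])

def predict(centroids, X):
    # Nearest-centroid decision rule.
    d = np.linalg.norm(X[:, None, :] - centroids[None], axis=2)
    return d.argmin(1)

# Decoder fit on a single day vs. one fit on samples pooled across many days.
one_day = simulate_day(rng)
pooled = [simulate_day(rng) for _ in range(10)]
Xp = np.concatenate([X for X, _ in pooled])
yp = np.concatenate([y for _, y in pooled])

c_single = fit_centroids(*one_day)
c_pooled = fit_centroids(Xp, yp)

# Evaluate both decoders on a fresh, unseen day.
Xt, yt = simulate_day(rng)
acc_single = (predict(c_single, Xt) == yt).mean()
acc_pooled = (predict(c_pooled, Xt) == yt).mean()
```

In this toy model, the pooled decoder's centroids average out the day-specific offsets, so its decision boundaries tend to generalize to a fresh day better than boundaries fit to any single day, loosely mirroring how sampling drift across days enabled long-term control.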
Researchers at UC San Francisco have developed a brain-computer interface that allows a paralyzed man to control a robotic arm using his thoughts. The device, powered by AI, maintained functionality for an unprecedented 7 months without adjustment.
In a significant advancement in neurotechnology, researchers at the University of California, San Francisco (UCSF) have developed a brain-computer interface (BCI) that has enabled a paralyzed man to control a robotic arm using only his thoughts for an unprecedented seven months without recalibration [1][2]. This breakthrough, published in the journal Cell on March 6, 2025, represents a major step forward in restoring movement capabilities for individuals with paralysis.
The key to this remarkable achievement lies in the integration of artificial intelligence (AI) into the BCI system. The AI model is designed to adapt to the subtle changes that occur in the brain as a person repeatedly imagines performing specific movements [3]. Dr. Karunesh Ganguly, a neurologist and professor at UCSF, explains, "This blending of learning between humans and AI is the next phase for these brain-computer interfaces. It's what we need to achieve sophisticated, lifelike function" [1].
The research team, led by Dr. Ganguly and neurology researcher Dr. Nikhilesh Natraj, made a crucial discovery about brain activity patterns. They found that while the shape of movement representations in the brain remained consistent, their locations shifted slightly from day to day [4]. This understanding of brain plasticity was instrumental in developing an AI model that could maintain accuracy over extended periods.
The study participant, who had been paralyzed by a stroke years earlier, underwent a two-week training period where he imagined making simple movements while sensors implanted on his brain surface recorded his neural activity [2]. Initially, attempts to control the robotic arm were imprecise. To refine control, the researchers implemented an innovative approach: the participant first practiced on a virtual robotic arm that gave him real-time feedback on the accuracy of his imagined movements, until he could reliably make the virtual arm do what he wanted.
The paralyzed man demonstrated remarkable control over the robotic arm, performing tasks such as picking up blocks, turning them and moving them to new locations, and even opening a cabinet, taking out a cup and holding it up to a water dispenser. Months later, he could still control the arm after just a 15-minute "tune-up" to adjust for how his movement representations had drifted [1].
Dr. Ganguly and his team are now focused on refining the AI models to achieve faster and smoother robotic arm movements. They are also planning to test the BCI system in home environments, bringing this life-changing technology closer to practical, everyday use for people with paralysis [4].
This breakthrough has far-reaching implications for individuals with paralysis. The ability to perform simple tasks like feeding oneself or getting a drink of water independently could significantly improve quality of life. Dr. Ganguly expresses confidence in the future of this technology, stating, "I'm very confident that we've learned how to build the system now, and that we can make this work" [1].
As research continues, this AI-powered BCI technology holds the promise of restoring a degree of independence and mobility to those affected by paralysis, marking a new era in the intersection of neuroscience, artificial intelligence, and assistive technology.
Reference

[1] UCSF researchers enable paralyzed man to control robotic arm with brain signals. University of California - San Francisco.
[2] Paralyzed man moves robotic arm with his thoughts.
[3] Paralyzed man moves robot arm for record 7 months with new brain chip.
[4] AI-Powered Brain Implant Lets Paralyzed Man Control Robotic Arm. Neuroscience News.
A 69-year-old man with paralysis successfully controlled a virtual drone through complex obstacle courses using only his thoughts, thanks to a brain-computer interface that interprets neural signals associated with finger movements.
6 Sources
Researchers develop a brain-computer interface that can translate thoughts into audible speech almost instantly, potentially revolutionizing communication for people with severe paralysis.
18 Sources
Neuralink, Elon Musk's brain-computer interface company, reports success with its second human patient. The individual, who had quadriplegia, can now play video games using only their thoughts.
3 Sources
A new AI-powered treatment called adaptive deep brain stimulation is showing significant improvements in managing Parkinson's disease symptoms, reducing medication dependence, and enhancing patients' quality of life.
2 Sources
Elon Musk's Neuralink faces a challenge in its first human trial as electrode threads retract from the patient's brain, possibly due to an air pocket. This setback highlights the complexities in brain-computer interface technology.
2 Sources