Curated by THEOUTPOST
On Thu, 12 Sept, 12:06 AM UTC
3 Sources
[1]
Neuroscientists map brain-wide decision-making process in mice
Sainsbury Wellcome Centre, Sep 12 2024

Neuroscientists have revealed how sensory input is transformed into motor action across multiple brain regions in mice. The research, conducted at the Sainsbury Wellcome Centre at UCL, shows that decision-making is a global process across the brain that is coordinated by learning. The findings could aid artificial intelligence research by providing insights into how to design more distributed neural networks.

"This work unifies concepts previously described for individual brain areas into a coherent view that maps onto brain-wide neural dynamics. We now have a complete picture of what is happening in the brain as sensory input is transformed through a decision process into an action," explained Professor Tom Mrsic-Flogel, Director of the Sainsbury Wellcome Centre at UCL and corresponding author on the paper.

The study, published today in Nature, outlines how the researchers used Neuropixels probes, a state-of-the-art technology enabling simultaneous recordings across hundreds of neurons in multiple brain regions, to study mice taking part in a decision-making task. The task, developed by Dr Ivana Orsolic at SWC, allowed the team to distinguish between sensory processing and motor control. The researchers also revealed the contribution of learning by studying animals trained in the task and comparing them to naïve animals.

"We often make decisions based on ambiguous evidence. For example, when it starts to rain, you have to decide how frequent the raindrops need to be before you open your umbrella. We studied this same ambiguous evidence integration in mice to understand how the brain processes perceptual decisions," explained Dr Michael Lohse, Sir Henry Wellcome Postdoctoral Fellow at SWC and joint first author on the paper.

Mice were trained to stand still while they watched a visual pattern moving on a screen.
To receive a reward, the mice had to lick a spout when they detected a sustained increase in the speed of movement of the visual pattern. The task was designed so that the speed of the movement was never constant; instead, it continuously fluctuated. The timing of the increase in the average speed also changed from trial to trial, so the mice could not simply remember when the sustained increase occurred. Thus, the mice had to pay constant attention to the stimulus and integrate information to work out whether the increase in speed had happened.

"By training the mice to stand still, the data analysis we could perform was much cleaner, and the task allowed us to look at how neurons track random fluctuations in speed before the mice made an action. In trained mice, we found that there is no single brain region that integrates sensory evidence or orchestrates the process. Instead, we found that neurons that are sparsely but broadly distributed across the brain link sensory evidence and action initiation," explained Dr Andrei Khilkevich, Senior Research Fellow in the Mrsic-Flogel lab and joint first author on the paper.

The researchers recorded from each mouse multiple times and collected data from over 15,000 cells across 52 brain regions in 15 trained mice. To look at learning, the team also compared the results to recordings from naïve mice.

"We found that when mice don't know what the visual stimulus means, they represent the information only in the visual system and a few midbrain regions. After they have learned the task, cells integrate the evidence all over the brain," explained Dr Lohse.

In this study, the team only looked at naïve animals and those that had fully learned the task, but in future work they hope to uncover how the learning process occurs by tracking neurons over time to see how they change as mice begin to understand the task.
The researchers are also looking to explore whether specific areas in the brain act as causal hubs in establishing these links between sensations and actions. Additional questions raised by the study include how the brain incorporates an expectation of when the speed of the visual pattern will increase, such that animals only react to the stimulus when the information is relevant. The team plan to study these questions further using the dataset they have collected.

This study was funded by Wellcome awards (217211/Z/19/Z and 224121/Z/21/Z) and by the Sainsbury Wellcome Centre's Core Grant from the Gatsby Charitable Foundation (GAT3755) and Wellcome (219627/Z/19/Z).

Source: Sainsbury Wellcome Centre

Journal reference: Khilkevich, A., et al. (2024). Brain-wide dynamics linking sensation to action during decision-making. Nature. doi.org/10.1038/s41586-024-07908-w.
[2]
Neuroscientists map how the brain transforms sensation into action
Neuroscientists have revealed how sensory input is transformed into motor action across multiple brain regions in mice. The research, conducted at the Sainsbury Wellcome Centre at UCL, shows that decision-making is a global process across the brain that is coordinated by learning. The findings could aid artificial intelligence research by providing insights into how to design more distributed neural networks.

"This work unifies concepts previously described for individual brain areas into a coherent view that maps onto brain-wide neural dynamics. We now have a complete picture of what is happening in the brain as sensory input is transformed through a decision process into an action," explained Professor Tom Mrsic-Flogel, Director of the Sainsbury Wellcome Centre at UCL and corresponding author on the paper.

The study, published in Nature, outlines how the researchers used Neuropixels probes, a state-of-the-art technology enabling simultaneous recordings across hundreds of neurons in multiple brain regions, to study mice taking part in a decision-making task. The task, developed by Dr. Ivana Orsolic at SWC, allowed the team to distinguish between sensory processing and motor control. The researchers also revealed the contribution of learning by studying animals trained in the task and comparing them to naïve animals.

"We often make decisions based on ambiguous evidence. For example, when it starts to rain, you have to decide how frequent the raindrops need to be before you open your umbrella. We studied this same ambiguous evidence integration in mice to understand how the brain processes perceptual decisions," explained Dr. Michael Lohse, Sir Henry Wellcome Postdoctoral Fellow at SWC and joint first author on the paper.

Mice were trained to stand still while they watched a visual pattern moving on a screen. To receive a reward, the mice had to lick a spout when they detected a sustained increase in the speed of movement of the visual pattern.
The task was designed so that the speed of the movement was never constant; instead, it continuously fluctuated. The timing of the increase in the average speed also changed from trial to trial, so the mice could not simply remember when the sustained increase occurred. Thus, the mice had to pay constant attention to the stimulus and integrate information to work out whether the increase in speed had happened.

"By training the mice to stand still, the data analysis we could perform was much cleaner, and the task allowed us to look at how neurons track random fluctuations in speed before the mice made an action," explained Dr. Andrei Khilkevich, Senior Research Fellow in the Mrsic-Flogel lab and joint first author on the paper. "In trained mice, we found that there is no single brain region that integrates sensory evidence or orchestrates the process. Instead, we found that neurons that are sparsely but broadly distributed across the brain link sensory evidence and action initiation."

The researchers recorded from each mouse multiple times and collected data from over 15,000 cells across 52 brain regions in 15 trained mice. To look at learning, the team also compared the results to recordings from naïve mice.

"We found that when mice don't know what the visual stimulus means, they represent the information only in the visual system and a few midbrain regions. After they have learned the task, cells integrate the evidence all over the brain," explained Dr. Lohse.

In this study, the team only looked at naïve animals and those that had fully learned the task, but in future work they hope to uncover how the learning process occurs by tracking neurons over time to see how they change as mice begin to understand the task. The researchers are also looking to explore whether specific areas in the brain act as causal hubs in establishing these links between sensations and actions.
Additional questions raised by the study include how the brain incorporates an expectation of when the speed of the visual pattern will increase, such that animals only react to the stimulus when the information is relevant. The team plan to study these questions further using the dataset they have collected.
[3]
How the Brain Turns Sensory Input Into Action
Summary: Neuroscientists have uncovered how sensory input is transformed into motor action across multiple brain regions in mice. The study shows that decision-making is a distributed process across the brain, where neurons link sensory evidence to actions. Researchers found that after learning a task, mice process information across numerous brain regions, offering new insights into brain-wide neural dynamics. This work could help design more distributed neural networks for artificial intelligence systems.

Source: Sainsbury Wellcome Centre

Neuroscientists have revealed how sensory input is transformed into motor action across multiple brain regions in mice. The research, conducted at the Sainsbury Wellcome Centre at UCL, shows that decision-making is a global process across the brain that is coordinated by learning. The findings could aid artificial intelligence research by providing insights into how to design more distributed neural networks.

"This work unifies concepts previously described for individual brain areas into a coherent view that maps onto brain-wide neural dynamics. We now have a complete picture of what is happening in the brain as sensory input is transformed through a decision process into an action," explained Professor Tom Mrsic-Flogel, Director of the Sainsbury Wellcome Centre at UCL and corresponding author on the paper.

The study, published today in Nature, outlines how the researchers used Neuropixels probes, a state-of-the-art technology enabling simultaneous recordings across hundreds of neurons in multiple brain regions, to study mice taking part in a decision-making task. The task, developed by Dr Ivana Orsolic at SWC, allowed the team to distinguish between sensory processing and motor control. The researchers also revealed the contribution of learning by studying animals trained in the task and comparing them to naïve animals.

"We often make decisions based on ambiguous evidence.
For example, when it starts to rain, you have to decide how frequent the raindrops need to be before you open your umbrella. We studied this same ambiguous evidence integration in mice to understand how the brain processes perceptual decisions," explained Dr Michael Lohse, Sir Henry Wellcome Postdoctoral Fellow at SWC and joint first author on the paper.

Mice were trained to stand still while they watched a visual pattern moving on a screen. To receive a reward, the mice had to lick a spout when they detected a sustained increase in the speed of movement of the visual pattern. The task was designed so that the speed of the movement was never constant; instead, it continuously fluctuated. The timing of the increase in the average speed also changed from trial to trial, so the mice could not simply remember when the sustained increase occurred. Thus, the mice had to pay constant attention to the stimulus and integrate information to work out whether the increase in speed had happened.

"By training the mice to stand still, the data analysis we could perform was much cleaner, and the task allowed us to look at how neurons track random fluctuations in speed before the mice made an action. In trained mice, we found that there is no single brain region that integrates sensory evidence or orchestrates the process. Instead, we found that neurons that are sparsely but broadly distributed across the brain link sensory evidence and action initiation," explained Dr Andrei Khilkevich, Senior Research Fellow in the Mrsic-Flogel lab and joint first author on the paper.

The researchers recorded from each mouse multiple times and collected data from over 15,000 cells across 52 brain regions in 15 trained mice. To look at learning, the team also compared the results to recordings from naïve mice.

"We found that when mice don't know what the visual stimulus means, they represent the information only in the visual system and a few midbrain regions.
After they have learned the task, cells integrate the evidence all over the brain," explained Dr Lohse.

In this study, the team only looked at naïve animals and those that had fully learned the task, but in future work they hope to uncover how the learning process occurs by tracking neurons over time to see how they change as mice begin to understand the task. The researchers are also looking to explore whether specific areas in the brain act as causal hubs in establishing these links between sensations and actions. Additional questions raised by the study include how the brain incorporates an expectation of when the speed of the visual pattern will increase, such that animals only react to the stimulus when the information is relevant. The team plan to study these questions further using the dataset they have collected.

Funding: This study was funded by Wellcome awards (217211/Z/19/Z and 224121/Z/21/Z) and by the Sainsbury Wellcome Centre's Core Grant from the Gatsby Charitable Foundation (GAT3755) and Wellcome (219627/Z/19/Z).

Abstract: Brain-wide dynamics linking sensation to action during decision-making

Perceptual decisions rely on learned associations between sensory evidence and appropriate actions, involving the filtering and integration of relevant inputs to prepare and execute timely responses. Despite the distributed nature of task-relevant representations, it remains unclear how transformations between sensory input, evidence integration, motor planning and execution are orchestrated across brain areas and dimensions of neural activity. Here we addressed this question by recording brain-wide neural activity in mice learning to report changes in ambiguous visual input. After learning, evidence integration emerged across most brain areas in sparse neural populations that drive movement-preparatory activity.
Visual responses evolved from transient activations in sensory areas to sustained representations in frontal-motor cortex, thalamus, basal ganglia, midbrain and cerebellum, enabling parallel evidence accumulation. In areas that accumulate evidence, shared population activity patterns encode visual evidence and movement preparation, distinct from movement-execution dynamics. Activity in movement-preparatory subspace is driven by neurons integrating evidence, which collapses at movement onset, allowing the integration process to reset. Across premotor regions, evidence-integration timescales were independent of intrinsic regional dynamics, and thus depended on task experience. In summary, learning aligns evidence accumulation to action preparation in activity dynamics across dozens of brain regions. This leads to highly distributed and parallelized sensorimotor transformations during decision-making. Our work unifies concepts from decision-making and motor control fields into a brain-wide framework for understanding how sensory evidence controls actions.
Researchers have created a comprehensive map of the decision-making process in mice brains, revealing how sensory information is transformed into action. This groundbreaking study provides insights into the complex neural pathways involved in decision-making and behavior.
In a new study, neuroscientists have successfully mapped the brain-wide decision-making process in mice, offering unprecedented insights into how the brain transforms sensory information into action. The research, conducted by a team from the Sainsbury Wellcome Centre at UCL, marks a significant advance in our understanding of neural pathways and decision-making mechanisms 1.
The researchers used high-density Neuropixels probes, which record simultaneously from hundreds of neurons across multiple brain regions. This approach allowed them to collect data from over 15,000 cells across 52 brain regions in mice as they performed a decision-making task 2.
Mice were trained to stand still while watching a moving visual pattern and to lick a spout when they detected a sustained increase in the pattern's speed. This simple yet effective task enabled researchers to track neural activity from the initial sensory input through to the final motor output, providing a comprehensive view of the decision-making process 1.
The study revealed that the transformation of sensory information into action unfolds gradually across multiple brain regions. Notably, rather than a single integration hub, the researchers found sparsely but broadly distributed neurons across the brain that link sensory evidence to action initiation 3.
One striking finding concerned learning: in naïve mice, the visual stimulus was represented only in the visual system and a few midbrain regions, whereas after training, evidence was integrated across the brain. This challenges previous assumptions about the roles of different brain regions in decision-making 2.
This research has significant implications for our understanding of brain function and decision-making processes. It provides a foundation for future studies on more complex behaviors and could potentially lead to new insights into neurological disorders affecting decision-making 1.
The researchers plan to track neurons over time as mice learn the task, to uncover how the learning process unfolds. They also aim to explore whether specific brain areas act as causal hubs linking sensations to actions, and how the brain incorporates expectations about when a stimulus becomes relevant 3.
The success of this study was largely due to advances in neural recording technology and data analysis techniques. The use of Neuropixels probes allowed for unprecedented coverage of brain activity, spanning dozens of regions and thousands of simultaneously recorded neurons 2.
This landmark study not only enhances our understanding of brain function but also paves the way for future research into more complex cognitive processes and potential treatments for neurological disorders.
Reference
[1] Sainsbury Wellcome Centre – Neuroscientists map brain-wide decision-making process in mice
[2] Medical Xpress – Neuroscientists map how the brain transforms sensation into action
[3] How the Brain Turns Sensory Input Into Action