2 Sources
[1]
The brain's visual system plays a bigger role in decision making than expected
Columbia University School of Engineering and Applied Science, Apr 18 2025

When you see a bag of carrots at the grocery store, does your mind go to potatoes and parsnips or buffalo wings and celery? It depends, of course, on whether you're making a hearty winter stew or getting ready to watch the Super Bowl.

Most scientists agree that categorizing an object - like thinking of a carrot as either a root vegetable or a party snack - is the job of the prefrontal cortex, the brain region responsible for reasoning and other high-level functions that make us smart and social. In that account, the eyes and visual regions of the brain are kind of like a security camera, collecting data and processing it in a standardized way before passing it off for analysis.

However, a new study led by biomedical engineer and neuroscientist Nuttida Rungratsameetaweemana, an assistant professor at Columbia Engineering, shows that the brain's visual regions play an active role in making sense of information. Crucially, the way they interpret the information depends on what the rest of the brain is working on. If it's Super Bowl Sunday, the visual system sees those carrots on a veggie tray before the prefrontal cortex knows they exist.

Published April 11 in Nature Communications, the study provides some of the clearest evidence yet that early sensory systems play a role in decision-making - and that they adapt in real time. It also points to new approaches for designing AI systems that can adapt to new or unexpected situations. We sat down with Rungratsameetaweemana to learn more about the research.

What's exciting about this new study?

Our findings challenge the traditional view that early sensory areas in the brain are simply "looking" or "recording" visual input. In fact, the human brain's visual system actively reshapes how it represents the exact same object depending on what you're trying to do. Even in visual areas that are very close to the raw information entering the eyes, the brain has the flexibility to tune its interpretation and responses based on the current task. It gives us a new way to think about flexibility in the brain and opens up ideas for how to potentially build more adaptive AI systems modeled after these neural strategies.

How did you come to this surprising conclusion?

Most previous work looked at how people learn categories over time, but this study zooms in on the flexibility piece: How does the brain rapidly switch between different ways of organizing the same visual information?

What were your experiments like?

We used functional magnetic resonance imaging (fMRI) to observe people's brain activity while they put shapes into different categories. The twist was that the "rules" for categorizing the shapes kept changing. This let us determine whether the visual cortex was changing how it represented the shapes depending on how we had defined the categories. We analyzed the data using computational machine learning tools, including multivariate classifiers. These tools allow us to examine patterns of brain activation in response to different shape images and measure how clearly the brain distinguishes shapes in different categories. We saw that the brain responds differently depending on what categories our participants were sorting the shapes into.

What did you see in the data from these experiments?

Activity in the visual system - including the primary and secondary visual cortices, which deal with data straight from the eyes - changed with practically every task.
They reorganized their activity depending on which decision rules people were using: the activation patterns became more distinctive when a shape was near the grey area between categories. Those were the most difficult shapes to tell apart, so that's exactly when extra processing would be most helpful. We could actually see clearer neural patterns in the fMRI data when people did a better job on the tasks. That suggests the visual cortex may directly help us solve flexible categorization tasks.

What are the implications of these findings?

Flexible cognition is a hallmark of human cognition, and even state-of-the-art AI systems still struggle with flexible task performance. Our results may contribute to designing AI systems that can better adapt to new situations. They may also contribute to understanding how cognitive flexibility breaks down in conditions like ADHD and other cognitive disorders. It's also a reminder of how remarkable and efficient our brains are, even at the earliest stages of processing.

What's next for this line of research?

We're pushing the neuroscience further by studying how flexible coding works at the level of neural circuits. With fMRI, we were looking at large populations of neurons. In a new follow-up study, we are investigating the circuit mechanisms of flexible coding by recording neural activity inside the skull. This lets us ask how individual neurons and neuronal circuits in the human brain support flexible, goal-directed behavior. We're also starting to explore how these ideas might be useful for artificial systems. Humans are really good at adapting to new goals, even when the rules change, but current AI systems often struggle with that kind of flexibility. We're hoping that what we're learning from the human brain can help us design models that adapt more fluidly, not just to new inputs, but to new contexts.

Source: Columbia University School of Engineering and Applied Science

Journal reference: Henderson, M. M., et al. (2025). Dynamic categorization rules alter representations in human visual cortex. Nature Communications. doi.org/10.1038/s41467-025-58707-4.
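To make the multivariate-classifier analysis described above more concrete, here is a minimal decoding sketch in Python. It is not the authors' pipeline: the data are simulated, and the names (voxel_patterns, labels_by_rule) are hypothetical. The point it illustrates is how cross-validated classifier accuracy can quantify how clearly a set of activation patterns separates two categories under a given labeling rule.

```python
# Minimal, illustrative multivariate "decoding" of category from simulated fMRI patterns.
# All data and variable names here are hypothetical, not taken from the published study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50

# One row per trial: the activation pattern across voxels in a visual region.
voxel_patterns = rng.normal(size=(n_trials, n_voxels))

# The same trials, labeled under two different (made-up) categorization rules.
labels_by_rule = {
    "rule_A": rng.integers(0, 2, n_trials),
    "rule_B": rng.integers(0, 2, n_trials),
}

# Inject a weak category-related signal for rule_A only, so its labels are decodable.
voxel_patterns += 0.8 * np.outer(labels_by_rule["rule_A"] - 0.5, rng.normal(size=n_voxels))

for rule, labels in labels_by_rule.items():
    # Higher cross-validated accuracy = the patterns separate that rule's categories more clearly.
    clf = LogisticRegression(max_iter=1000)
    acc = cross_val_score(clf, voxel_patterns, labels, cv=5).mean()
    print(f"{rule}: decoding accuracy = {acc:.2f}")
```

In the study itself, this kind of accuracy measure was compared across task rules and shape positions; the toy example only shows that identical activation patterns can be more or less separable depending on which labeling is applied.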
[2]
Rethinking vision: The brain sees what it wants to see - Earth.com
You walk into a room and see a long, slender object lying on a table. Is it a pen you need for a meeting? Or a thermometer you left out after checking your fever? The object hasn't changed. But depending on why you walked in, your vision - guided by your brain - may classify it entirely differently.

This subtle shift in meaning, which is tied to context and intention, has long intrigued scientists. Traditionally, the belief was simple: your eyes gather data and your brain processes it like a computer. The early visual areas were thought to pass along raw details - shape, color, motion - to higher brain centers, which then made decisions.

But a new study, published on April 11 in Nature Communications, offers a striking update to that view. Led by Professor Nuttida Rungratsameetaweemana of Columbia Engineering, the research shows that the visual cortex itself helps decide what something is. And it does so by reinterpreting the very same image, depending on what you're trying to do.

This work challenges the idea of the visual cortex as a passive recording device. Instead, it appears to function more like an active participant in interpretation. Before you're even aware of what you're seeing, your visual system starts shaping that image based on your current goal.

"Our findings challenge the traditional view that early sensory areas in the brain are simply 'looking' or 'recording' visual input. In fact, the human brain's visual system actively reshapes how it represents the exact same object depending on what you're trying to do," noted Professor Rungratsameetaweemana.

This means that vision isn't only about light hitting the retina. It's about purpose. The brain's early sensory regions are tuned to your tasks and adjust their processing in real time. You don't just see a shape - you see a meaning attached to it.

To test this, the research team designed a clever experiment. They created abstract shapes that varied along two dimensions. During fMRI scanning, participants had to sort these shapes into categories. But here's the twist: the rules kept changing. Sometimes the category boundary was linear and straightforward. Other times, it curved in complex ways across the visual space. The same shape might belong to one group in one task, and a completely different group in another. Participants had to adapt fast.

This shifting rule environment allowed the researchers to track how the brain responded when the same visual input had to be interpreted differently. The experts found that early visual areas in the brain - like V1 and V2 - adjusted their activity depending on which rule was active, even though the shapes themselves didn't change.

The most dramatic shifts in visual cortex activity occurred when shapes fell near a category boundary. These were the toughest decisions - when shapes looked ambiguous and could easily belong to more than one group. In those moments, the brain sharpened its distinctions.

Classifier accuracy, a machine learning measure used in the study, was highest in early visual areas for shapes near decision boundaries. This accuracy aligned with task performance. When the brain's patterns were clearer, participants were more likely to choose correctly. In short, the visual system wasn't just observing. It was helping to solve the problem.

The study used three task types: Linear-1, Linear-2, and Nonlinear. Participants found the Nonlinear task most difficult, both in speed and accuracy. That difference played out in the brain as well.
For Linear-2 tasks, representations in the visual cortex were more distinctive and aligned with the active category boundary. This was especially evident in trials where shapes were hard to classify. It suggests that the brain prioritizes clarity where it's most needed.

"We saw that the brain responds differently depending on what categories our participants were sorting the shapes into," noted Professor Rungratsameetaweemana.

The team believes that feature-based attention may explain these effects. That's the brain's ability to enhance the processing of specific features - like edges, curvature, or symmetry - based on what's currently relevant. In this study, attention may have been dynamically allocated to the combinations of features needed for each specific rule. This allowed early visual areas to tweak how they represented each shape, making decision-making more efficient and responsive.

This type of attention wasn't cued explicitly. Participants weren't told what features to look at. Instead, they learned to shift their mental focus based on the task at hand. Their brains adapted quietly but powerfully.

The ability to adapt to new goals is a cornerstone of human intelligence. Yet even advanced AI systems struggle with this kind of flexible reclassification. Most treat perception and decision-making as separate stages. This study shows that perception itself is fluid. It bends to the task, allowing seamless transitions between rules, even when the input stays constant.

That insight could inspire new designs for AI systems that need to operate in unpredictable environments. It may also help us understand cognitive disorders where flexibility breaks down. Conditions like ADHD or autism often involve difficulty shifting mental sets. If the sensory areas involved in vision contribute to these shifts, future treatments might target them to improve adaptability.

"Flexible cognition is a hallmark of human cognition, and even state-of-the-art AI systems currently still struggle with flexible task performance. It's also a reminder of how remarkable and efficient our brains are, even at the earliest stages of processing," said Professor Rungratsameetaweemana.

What happens beneath the surface of these visual adjustments? The team's next phase will record from neurons inside the skull to understand how individual cells respond to changing goals. They want to uncover the wiring that supports these fast switches in meaning.

They also hope to apply what they find to artificial systems. Today's models often fail when task goals shift mid-stream. But the human brain handles such transitions smoothly. By mimicking this biological flexibility, AI could become more adaptive and useful.

"We're hoping that what we're learning from the human brain can help us design models that adapt more fluidly, not just to new inputs, but to new contexts," said Professor Rungratsameetaweemana.

It may no longer be true to say that seeing is believing. This research transforms how we understand the brain and its relation to our vision. The eye may be the entry point, but interpretation begins almost immediately, long before conscious thought.

It turns out that seeing is not just about gathering information. It's about aligning perception with purpose. When you walk into a room and glance at that object on the table, your brain is already deciding what it means based on why you're there. The pen and the thermometer are the same shape - but they are not the same thing.
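As a rough illustration of the shifting-rule design described in this article, the sketch below assigns one made-up shape, defined by two feature dimensions, to categories under three invented rules standing in for the Linear-1, Linear-2, and Nonlinear task types. The boundary functions and values are hypothetical and are not the boundaries used in the actual experiment.

```python
# Toy illustration: the same stimulus can change category when the active rule changes.
# These boundaries are invented for illustration; they do not reproduce the study's rules.
import numpy as np

def linear_1(x, y):
    return int(x > 0.5)                                  # straight boundary on the first dimension

def linear_2(x, y):
    return int(y > 0.5)                                  # straight boundary on the second dimension

def nonlinear(x, y):
    return int(y > 0.3 * np.sin(2 * np.pi * x) + 0.5)    # curved boundary across the space

shape = (0.7, 0.3)  # one shape, described by its two feature dimensions
for name, rule in [("Linear-1", linear_1), ("Linear-2", linear_2), ("Nonlinear", nonlinear)]:
    print(f"{name}: category {rule(*shape)}")

# Shapes close to the active boundary are the ambiguous ones: a tiny feature change
# flips the label, which is where the sharpest neural distinctions were reported.
print("near the Linear-1 boundary:", linear_1(0.49, 0.3), "vs", linear_1(0.51, 0.3))
```

The only point of the sketch is that category membership is a property of the rule, not of the shape: the same coordinates land in different groups as the boundary moves or bends.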
Thanks to this study, we now know that your brain's visual system is helping you choose, adapt, and act - even before you realize what you're looking at.
A new study challenges traditional views of visual processing, showing that the brain's visual cortex actively adapts its interpretation of objects based on current tasks and goals.
A groundbreaking study led by Professor Nuttida Rungratsameetaweemana of Columbia Engineering has revealed that the brain's visual system plays a far more active role in decision-making than previously believed. Published in Nature Communications, the research challenges long-held assumptions about how our brains process visual information [1].

Traditionally, scientists thought that the visual cortex simply recorded and relayed raw visual data to higher brain regions for interpretation. However, this new study demonstrates that even early visual areas, such as the primary and secondary visual cortices, actively reshape their representation of objects based on current tasks and goals [2].

The research team designed an experiment where participants categorized abstract shapes while undergoing fMRI scans. The key innovation was that the categorization rules kept changing, forcing participants to adapt quickly [1].

These findings have significant implications for both artificial intelligence and our understanding of human cognition:

- AI Development: The study could inspire new approaches to designing more flexible AI systems that can adapt to changing contexts and goals [2].
- Cognitive Disorders: The results may contribute to understanding conditions like ADHD, where cognitive flexibility is impaired [1].
- Human Cognition: The study highlights the remarkable efficiency and adaptability of the human brain, even at the earliest stages of sensory processing [2].

Professor Rungratsameetaweemana and her team are now exploring how this flexible coding works at the level of individual neurons and neural circuits. They're using intracranial recordings to investigate how specific neuronal populations support adaptive, goal-directed behavior [1].

This research not only reshapes our understanding of visual processing but also opens new avenues for exploring the intricate relationship between perception, attention, and decision-making in the human brain.
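Both source articles note that clearer category patterns in the visual cortex went hand in hand with better behavioral performance. The sketch below shows, with entirely simulated per-subject numbers (nothing here comes from the study's data), what that kind of brain-behavior comparison looks like in code:

```python
# Illustrative brain-behavior comparison using simulated values only.
import numpy as np

rng = np.random.default_rng(1)
n_subjects = 20

# Hypothetical per-subject values: how well a classifier separated categories in
# visual cortex, and how often that subject categorized the shapes correctly.
decoding_accuracy = rng.uniform(0.55, 0.85, n_subjects)
behavioral_accuracy = 0.4 + 0.6 * decoding_accuracy + rng.normal(0, 0.03, n_subjects)

# A positive correlation is the qualitative pattern reported above: clearer neural
# category patterns go with better task performance.
r = np.corrcoef(decoding_accuracy, behavioral_accuracy)[0, 1]
print(f"simulated brain-behavior correlation: r = {r:.2f}")
```

Because the behavioral values here are constructed from the decoding values plus noise, the correlation comes out positive by design; the real analysis, of course, tests whether such a relationship exists in measured data.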
Summarized by Navi