Curated by THEOUTPOST
On Wed, 2 Oct, 12:03 AM UTC
3 Sources
[1]
How do infants connect with the world? It all starts with the feet - Earth.com
In a new study, experts used artificial intelligence (AI) to investigate the world of infant learning. The researchers focused specifically on when babies make the transition from random movements to purposeful exploration.

When infants kick their tiny feet in the air, it might seem like random flailing. But there's a science to it: these early, seemingly chaotic movements reveal patterns of interaction with the world around them. Yet our understanding of how infants intentionally engage with their surroundings remains somewhat cloudy.

Enter the baby-mobile experiment. What's that? Think of a colorful mobile gently tethered to an infant's foot. As our little hero kicks, the mobile moves - a cause-and-effect scenario that's music to the ears of researchers. This experiment, used since the swinging sixties, helps researchers understand babies' control over their movements and their quest to influence their environment.

Now, what happens if we add a sprinkle of AI to the mix? At Florida Atlantic University, a team of researchers investigated whether AI tools could decode the intricate patterns of infant movement. They used a device called the Vicon 3D motion capture system to analyze infants' actions, especially their reactions when the mobile moved.

The results? A resounding success. Published in the journal Scientific Reports, the study confirms the potential of AI to help us understand early infant development and interaction. Machine and deep learning methods accurately classified five-second clips of infant movements as belonging to different stages of the experiment. Among these methods, one emerged as the star: a deep learning model named 2D-CapsNet. Most fascinating of all, foot movements scored the highest accuracy rates. What does this mean?
Compared to other parts of the body, the foot movement patterns changed most dramatically across the experiment's stages.

Study co-author Dr. Scott Kelso is the Glenwood and Martha Creech Eminent Scholar in Science at the Center for Complex Systems and Brain Sciences in FAU's Charles E. Schmidt College of Science. "Feet, as end effectors, are most impacted by interaction with the mobile. In other words, the way infants connect with their environment is most significant at the points of contact," said Dr. Kelso. In plain English: it all starts "feet first."

The study revealed that the 2D-CapsNet model achieved an impressive 86% accuracy when analyzing foot movements and could even capture detailed relationships between different body parts during movement. Foot movements also proved the most classifiable - about 20% more accurate than movements of the hands, knees, or the entire body.

What else did the study uncover? It seems that losing the ability to control the mobile made babies more eager to interact with the world. "However, some infants showed movement patterns during this disconnected phase that contained hints of their earlier interactions. This suggests that only certain infants understood their relationship with the mobile well enough to hang onto those movement patterns," explained Dr. Aliza Sloan.

Dr. Nancy Aaron Jones, a professor in FAU's Department of Psychology, emphasized the role of AI in studying infants - who, unlike adults, can't explain their actions. "AI can help researchers analyze subtle changes in infant movements to gain insights into how they think and learn, even before they can speak," said Dr. Jones.

In the words of Dr. Kelso, the study not only enhances our understanding of infant behavior but also opens up new possibilities.
Combining theory-based experiments with AI could lead to comprehensive assessments that take specific contexts into account, improving our ability to identify risks and treat disorders effectively.

So, the next time you see a baby's tiny toes kicking, remember: there's a fascinating world of science behind those cute little feet!
[2]
Feet first: AI reveals how infants connect with their world
Recent advances in computing and artificial intelligence, along with insights into infant learning, suggest that machine and deep learning techniques can help us study how infants transition from random exploratory movements to purposeful actions. Most research has focused on babies' spontaneous movements, distinguishing between fidgety and non-fidgety behaviors. While early movements may seem chaotic, they reveal meaningful patterns as infants interact with their environment. However, we still lack an understanding of how infants intentionally engage with their surroundings and of the principles guiding their goal-directed actions.

By conducting a baby-mobile experiment, used in developmental research since the late 1960s, Florida Atlantic University researchers and collaborators investigated how infants begin to act purposefully. The baby-mobile experiment uses a colorful mobile gently tethered to an infant's foot. When the baby kicks, the mobile moves, linking their actions to what they see. This setup helps researchers understand how infants control their movements and discover their ability to influence their surroundings.

In this new work, researchers tested whether AI tools could pick up on complex changes in patterns of infant movement. Infant movements, tracked using a Vicon 3D motion capture system, were classified into different types -- from spontaneous actions to reactions when the mobile moves. By applying various AI techniques, the researchers examined which methods best captured the nuances of infant behavior across different situations and how movements evolved over time.

Results of the study, published in Scientific Reports, underscore that AI is a valuable tool for understanding early infant development and interaction. Both machine and deep learning methods accurately classified five-second clips of 3D infant movements as belonging to different stages of the experiment. Among these methods, the deep learning model 2D-CapsNet performed the best.
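The classification pipeline described above, turning five-second clips of tracked 3D movement into stage labels, can be sketched in miniature. The sketch below is a hypothetical illustration, not the study's method: the stage names and features (mean and variance of frame-to-frame marker speed) are invented for illustration, and a simple nearest-centroid rule stands in for models such as 2D-CapsNet.

```python
# Hypothetical sketch: classify 5-second motion clips into experiment
# stages. Stage labels, features, and the nearest-centroid classifier
# are all assumptions for illustration, not the study's pipeline.
import math

STAGES = ["spontaneous", "connected", "disconnected"]  # assumed labels

def clip_features(positions):
    """Summarize one clip (a list of (x, y, z) marker positions sampled
    over 5 s) as the mean and variance of frame-to-frame speed."""
    speeds = [math.dist(a, b) for a, b in zip(positions, positions[1:])]
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return (mean, var)

def fit_centroids(labelled_clips):
    """Average the feature vectors of the training clips per stage."""
    sums = {s: [0.0, 0.0, 0] for s in STAGES}
    for stage, clip in labelled_clips:
        f = clip_features(clip)
        sums[stage][0] += f[0]
        sums[stage][1] += f[1]
        sums[stage][2] += 1
    return {s: (a / n, b / n) for s, (a, b, n) in sums.items() if n}

def classify(clip, centroids):
    """Assign a clip to the stage with the nearest feature centroid."""
    f = clip_features(clip)
    return min(centroids, key=lambda s: math.dist(f, centroids[s]))
```

In this toy version, clips recorded while the mobile is connected would show faster, more variable kicking and so land nearer the "connected" centroid; the real models learn far richer spatiotemporal structure from the full 3D marker data.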
Importantly, for all the methods tested, the movements of the feet had the highest accuracy rates, which means that, compared to other parts of the body, the movement patterns of the feet changed most dramatically across the stages of the experiment.

"This finding is significant because the AI systems were not told anything about the experiment or which part of the infant's body was connected to the mobile. What this shows is that the feet -- as end effectors -- are the most affected by the interaction with the mobile," said Scott Kelso, Ph.D., co-author and Glenwood and Martha Creech Eminent Scholar in Science at the Center for Complex Systems and Brain Sciences within FAU's Charles E. Schmidt College of Science. "In other words, the way infants connect with their environment has the biggest impact at the points of contact with the world. Here, this was 'feet first.'"

The 2D-CapsNet model achieved an accuracy of 86% when analyzing foot movements and was able to capture detailed relationships between different body parts during movement. Across all methods tested, foot movements consistently showed the highest accuracy rates -- about 20% higher than movements of the hands, knees, or the whole body.

"We found that infants explored more after being disconnected from the mobile than they did before they had the chance to control it. It seems that losing the ability to control the mobile made them more eager to interact with the world to find a means of reconnecting," said Aliza Sloan, Ph.D., co-author and a postdoctoral research scientist in FAU's Center for Complex Systems and Brain Sciences. "However, some infants showed movement patterns during this disconnected phase that contained hints of their earlier interactions with the mobile. This suggests that only certain infants understood their relationship with the mobile well enough to maintain those movement patterns, expecting that they would still produce a response from the mobile even after being disconnected."
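The body-part comparison above boils down to computing a classification accuracy for each tracked body part and ranking the parts. A minimal sketch, with invented labels and error rates rather than the study's data:

```python
# Hypothetical sketch: rank body parts by how accurately their movement
# clips are classified into experiment stages. Labels and predictions
# here are illustrative placeholders, not the study's results.

def accuracy(true_labels, predicted):
    """Fraction of clips whose predicted stage matches the true stage."""
    hits = sum(t == p for t, p in zip(true_labels, predicted))
    return hits / len(true_labels)

def rank_body_parts(true_labels, predictions_by_part):
    """Return (body_part, accuracy) pairs sorted best-first, so the
    part whose movements best separate the stages comes out on top."""
    scores = {part: accuracy(true_labels, preds)
              for part, preds in predictions_by_part.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Under this scheme, a ranking in which the feet score roughly 20 percentage points above the hands or knees would mirror the pattern the study reports.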
The researchers say that if the accuracy of infants' movements remains high during the disconnection, it might indicate that the infants learned something during their earlier interactions. However, different types of movements might mean different things in terms of what the infants discovered.

"It's important to note that studying infants is more challenging than studying adults because infants can't communicate verbally," said Nancy Aaron Jones, Ph.D., co-author, professor in FAU's Department of Psychology, director of the FAU WAVES Lab, and a member of the Center for Complex Systems and Brain Sciences within the Charles E. Schmidt College of Science. "Adults can follow instructions and explain their actions, while infants cannot. That's where AI can help. AI can help researchers analyze subtle changes in infant movements, and even their stillness, to give us insights into how they think and learn, even before they can speak. Their movements can also help us make sense of the vast degree of individual variation that occurs as infants develop."

Looking at how AI classification accuracy changes for each infant gives researchers a new way to understand when and how infants start to engage with the world. "While past AI methods mainly focused on classifying spontaneous movements linked to clinical outcomes, combining theory-based experiments with AI will help us create better assessments of infant behavior that are relevant to their specific contexts," said Kelso. "This can improve how we identify risks, diagnose and treat disorders."

Study co-authors are first author Massoud Khodadadzadeh, Ph.D., formerly at Ulster University in Derry, Northern Ireland, and now at the University of Bedfordshire, United Kingdom; and Damien Coyle, Ph.D., at the University of Bath, United Kingdom.
[3]
Feet First: AI Reveals How Infants Connect with Their World | Newswise
The research was supported by Tier 2 High Performance Computing resources provided by the Northern Ireland High-Performance Computing facility, funded by the U.K. Engineering and Physical Sciences Research Council; the U.K. Research and Innovation Turing AI Fellowship (2021-2025), funded by the Engineering and Physical Sciences Research Council; a Vice Chancellor's Research Scholarship; the Institute for Research in Applicable Computing at the University of Bedfordshire; the FAU Foundation (Eminent Scholar in Science); and the United States National Institutes of Health.
A groundbreaking study using artificial intelligence has uncovered new insights into how infants transition from random movements to purposeful actions, with a focus on the crucial role of foot movements in early development and learning.
Researchers at Florida Atlantic University have employed artificial intelligence (AI) to gain new insights into infant development and learning. The study, published in Scientific Reports, utilized the classic baby-mobile experiment, in which a colorful mobile is gently tethered to an infant's foot [1]. This setup, used since the late 1960s, allows researchers to observe how infants discover their ability to influence their environment through movement.
The research team used a Vicon 3D motion capture system to track and analyze infant movements during different stages of the experiment [2]. Various AI techniques, including machine and deep learning methods, were applied to classify five-second clips of infant movements. The deep learning model 2D-CapsNet emerged as the most effective, achieving an impressive 86% accuracy in analyzing foot movements.
A key finding of the study was that foot movements consistently showed the highest accuracy rates across all AI methods tested. Dr. Scott Kelso, a co-author of the study, explained, "The feet - as end effectors - are the most affected by the interaction with the mobile" [3]. This suggests that infants' primary way of connecting with their environment is through their feet, whose movement patterns were classified about 20% more accurately than those of the hands, knees, or the whole body.
The research revealed interesting patterns in infant behavior:
Increased exploration: Infants explored more after being disconnected from the mobile, suggesting a desire to reconnect and interact with their environment [2].
Individual differences: Some infants maintained movement patterns even after disconnection, indicating a deeper understanding of their relationship with the mobile [1].
Dr. Nancy Aaron Jones highlighted the unique challenges of studying infants, who cannot communicate verbally. AI offers a solution by analyzing subtle changes in infant movements, providing insights into their thinking and learning processes before they can speak [3].
The study opens up new possibilities for understanding and assessing infant behavior. Dr. Kelso suggests that combining theory-based experiments with AI could lead to more comprehensive assessments of infant behavior, potentially improving the identification of risks and treatment of disorders [1].
This groundbreaking research not only enhances our understanding of infant development but also demonstrates the power of AI in decoding the complex world of early childhood learning and interaction.
Reference
[2]
Medical Xpress - Feet first: AI reveals how infants connect with their world
© 2025 TheOutpost.AI All rights reserved