2 Sources
[1]
Smartwatch data helps identify daily activities in real life
Washington State University | Aug 14, 2025

Researchers have long been able to use information from smartwatches to identify physical movements, such as sitting or walking, that wearers perform in a controlled lab setting. Now, Washington State University researchers have developed a way, using a computer algorithm and a large dataset gathered from smartwatches, to more comprehensively identify what people are doing in everyday settings, such as working, eating, doing hobbies, or running errands. The work, published in the IEEE Journal of Biomedical and Health Informatics, could someday lead to improved assessment and understanding of cognitive health, rehabilitation, disease management, and surgical recovery. In their study, the researchers were able to accurately identify activities 78% of the time.

"If we want to determine whether somebody needs caregiving assistance in their home or elsewhere, and what level of assistance, then we need to know how well the person can perform critical activities," said Diane Cook, WSU Regents Professor in WSU's School of Electrical Engineering and Computer Science, who led the work. "How well can they bathe themselves, feed themselves, handle finances, or do their own errands? These are the things that you really need to be able to accomplish to be independent."

One of the big challenges in healthcare is assessing how people who are sick or elderly are managing their everyday lives. Medical professionals often need more comprehensive information about how a person performs functional activities, or higher-level, goal-directed behavior, to really assess their health. As anyone who is trying to help a distant parent with aging or health challenges knows, information on how well a person is performing at paying bills, running errands, or cooking meals is complex, variable, and difficult to gather, whether in a doctor's office or with a smartwatch.
"Lack of awareness of a person's cognitive and physical status is one of the hurdles that we face as we age, and so having an automated way to give indicators of where a person is will someday allow us to better intervene for them and to keep them not only healthy, but ideally independent," said Cook. "This work lays the foundation for more advanced, behavior-aware applications in digital health and human-centered AI."

For their study, the WSU researchers collected activity information over several years from several studies. "Whenever we had a study that collected smartwatch data, we added a question to our data collection app that asked participants to self-label their current activity, and that's how we ended up with so many participants from so many studies," she said. "And then we just dug in to see whether we can perform activity recognition."

The 503 study participants over eight years were asked at random times throughout the day to pick from a scroll-down list of 12 categories to describe what they were doing. The categories included things like doing errands, sleeping, traveling, working, eating, socializing, or relaxing. The researchers analyzed a variety of artificial intelligence methods for their ability to generalize across the population of study participants. They developed a large-scale dataset of more than 32 million labeled data points, each representing one minute of activity, and trained an AI model to predict what functional activity had occurred. The model predicted activities correctly up to 77.7% of the time.

"A foundational step is to perform activity recognition because if we can describe a person's behavior in terms of activity in categories that are well recognized, then we can start to talk about their behavior patterns and changes in their patterns," said Cook.
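The labeling scheme described above — sparse, randomly timed self-reports attached to a continuous stream of one-minute sensor windows — can be sketched in a few lines. This is an illustrative reconstruction, not the study's actual pipeline: the function name, the 30-minute matching window, and the data layout are all assumptions.

```python
from datetime import datetime, timedelta

def label_minutes(minute_starts, prompts, max_gap=timedelta(minutes=30)):
    """Attach the nearest self-reported activity label to each one-minute
    window, provided a prompt response falls within max_gap of the window
    start. Windows with no nearby response stay unlabeled (None).

    minute_starts: list of datetime, one per sensor window
    prompts: list of (datetime, label) self-reports collected by the app
    """
    labeled = {}
    for start in minute_starts:
        best_label, best_dist = None, max_gap
        for when, label in prompts:
            dist = abs(when - start)
            if dist <= best_dist:
                best_label, best_dist = label, dist
        labeled[start] = best_label
    return labeled

# Example: a single "eating" self-report at noon labels nearby minutes only.
noon = datetime(2025, 1, 1, 12, 0)
labels = label_minutes(
    [noon + timedelta(minutes=1), noon + timedelta(hours=2)],
    [(noon, "eating")],
)
```

In the example, the minute at 12:01 inherits the "eating" label while the minute two hours later stays unlabeled; a real pipeline would also have to decide how far a single self-report can plausibly be propagated.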
"We can use what we sense to try to approximate traditional measures of health, such as cognitive health and functional independence."

The researchers hope to use their model in future studies, in areas such as automating clinical diagnoses and looking for links between behavior, health, genetics, and environment. The methods and dataset, stripped of any identifying information, are also publicly available for other researchers to use. The work was funded by the National Institutes of Health.

Source: Washington State University

Journal reference: Minor, B., et al. (2025). A Feature-Augmented Transformer Model to Recognize Functional Activities from in-the-wild Smartwatch Data. IEEE Journal of Biomedical and Health Informatics. doi.org/10.1109/JBHI.2025.3586074.
[2]
Smartwatches allow researchers to identify human activity with 78% accuracy
Washington State University researchers have developed an AI algorithm that can identify daily activities from smartwatch data with 78% accuracy, potentially revolutionizing health monitoring and caregiving assessment.
Researchers at Washington State University have made a significant advance in wearable technology and artificial intelligence: a computer algorithm that identifies daily activities from smartwatch data with 78% accuracy [1][2]. This breakthrough could have far-reaching implications for health monitoring and caregiving assessment.
Source: Medical Xpress
The research, published in the IEEE Journal of Biomedical and Health Informatics, involved a study conducted over eight years with 503 participants [1]. The methodology was unusual: participants were prompted at random times throughout the day to self-label their current activity from a list of 12 categories, yielding a dataset of more than 32 million labeled data points, each representing one minute of activity.
Diane Cook, WSU Regents Professor in the School of Electrical Engineering and Computer Science, who led the work, emphasized the importance of this research: "If we want to determine whether somebody needs caregiving assistance in their home or elsewhere and what level of assistance, then we need to know how well the person can perform critical activities" [1].
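The published model is a feature-augmented transformer, which is well beyond the scope of a news summary. Purely as an illustration of the underlying task — mapping a per-minute feature vector to one of the activity categories — here is a toy nearest-centroid classifier; the feature values and the reduced label set are invented for the example.

```python
import math
from collections import defaultdict

class NearestCentroid:
    """Toy activity classifier: assign a one-minute feature vector to the
    class whose training-set centroid is closest in Euclidean distance."""

    def fit(self, X, y):
        sums, counts = {}, defaultdict(int)
        for x, label in zip(X, y):
            vec = sums.setdefault(label, [0.0] * len(x))
            for i, v in enumerate(x):
                vec[i] += v
            counts[label] += 1
        self.centroids = {c: [s / counts[c] for s in vec]
                          for c, vec in sums.items()}
        return self

    def predict(self, x):
        return min(self.centroids,
                   key=lambda c: math.dist(x, self.centroids[c]))

# Invented per-minute features: [mean acceleration, heart rate / 100].
X_train = [[0.1, 0.6], [0.2, 0.6], [1.8, 1.2], [2.0, 1.3]]
y_train = ["relaxing", "relaxing", "exercising", "exercising"]
model = NearestCentroid().fit(X_train, y_train)
prediction = model.predict([1.9, 1.25])
```

Note that the study's emphasis on generalizing across participants implies evaluation by holding out entire participants, not random minutes — otherwise the model can simply memorize each wearer's habits.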
The ability to accurately identify daily activities through smartwatch data opens up numerous possibilities in healthcare, including improved assessment of cognitive health, rehabilitation, disease management, and surgical recovery [1].
Source: News-Medical
One of the most promising aspects of this technology is its potential to support elderly individuals in maintaining their independence. By providing a more comprehensive understanding of how well a person can perform critical activities like bathing, eating, handling finances, or running errands, the system could help determine the level of caregiving assistance needed [1][2].
The researchers have ambitious plans for the future application of their model, including automating clinical diagnoses and searching for links between behavior, health, genetics, and environment [1][2].
Importantly, the methods and anonymized dataset have been made publicly available, allowing other researchers to build upon this work and potentially accelerate advancements in the field [1].
As wearable technology continues to evolve, this research represents a significant step towards more advanced, behavior-aware applications in digital health and human-centered AI. The ability to gather and analyze real-world activity data could revolutionize how we approach health monitoring and personalized care in the coming years.
Summarized by Navi