Curated by THEOUTPOST
On Tue, 1 Oct, 12:04 AM UTC
4 Sources
[1]
When detecting depression, the eyes have it
It has been estimated that nearly 300 million people, or about 4% of the global population, are afflicted by some form of depression. But detecting it can be difficult, particularly when those affected don't (or won't) report negative feelings to friends, family or clinicians.

Now Stevens professor Sang Won Bae is working on several AI-powered smartphone applications and systems that could non-invasively warn us, and others, that we may be becoming depressed.

"Depression is a major challenge," says Bae. "We want to help. And since most people in the world today use smartphones daily, this could be a useful detection tool that's already built and ready to be used."

Snapshot images of the eyes, revealing mood

One system Bae is developing with Stevens doctoral candidate Rahul Islam, called PupilSense, works by constantly taking snapshots and measurements of a smartphone user's pupils.

"Previous research over the past three decades has repeatedly demonstrated how pupillary reflexes and responses can be correlated to depressive episodes," she explains.

The system accurately calculates pupils' diameters, as compared to the surrounding irises of the eyes, from 10-second "burst" photo streams captured while users are opening their phones or accessing certain social media and other apps.

In one early test of the system with 25 volunteers over a four-week period, the system -- embedded on those volunteers' smartphones -- analyzed approximately 16,000 interactions with phones once pupil-image data were collected. After teaching an AI to differentiate between "normal" responses and abnormal ones, Bae and Islam processed the photo data and compared it with the volunteers' self-reported moods.

The best iteration of PupilSense -- one known as TSF, which uses only selected, high-quality data points -- proved 76% accurate at flagging times when people did indeed feel depressed. That's better than the best smartphone-based system currently being developed and tested for detecting depression, a platform known as AWARE.

"We will continue to develop this technology now that the concept has been proven," adds Bae, who previously developed smartphone-based systems to predict binge drinking and cannabis use.

The system was first unveiled at the International Conference on Activity and Behavior Computing in Japan in late spring, and it is now available open-source on the GitHub platform.

Facial expressions also tip depression's hand

Bae and Islam are also developing a second system known as FacePsy that powerfully parses facial expressions for insight into our moods.

"A growing body of psychological studies suggests that depression is characterized by nonverbal signals such as facial muscle movements and head gestures," Bae points out.

FacePsy runs in the background of a phone, taking facial snapshots whenever a phone is opened or commonly used applications are opened. (Importantly, it deletes the facial images themselves almost immediately after analysis, protecting users' privacy.)

"We didn't know exactly which facial gestures or eye movements would correspond with self-reported depression when we started out," Bae explains. "Some of them were expected, and some of them were surprising."

Increased smiling, for instance, appeared in the pilot study to correlate not with happiness but with potential signs of a depressed mood and affect.

"This could be a coping mechanism, for instance people putting on a 'brave face' for themselves and for others when they are actually feeling down," says Bae. "Or it could be an artifact of the study. More research is needed."

Other apparent signals of depression revealed in the early data included fewer facial movements during the morning hours and certain very specific eye- and head-movement patterns. (Yawing, or side-to-side, movements of the head during the morning seemed to be strongly linked to increased depressive symptoms, for instance.)

Interestingly, eyes detected as more open during the morning and evening were also associated with potential depression, suggesting outward expressions of alertness or happiness can sometimes mask depressive feelings beneath.

"Other systems using AI to detect depression require the wearing of a device, or even multiple devices," Bae concludes. "We think this FacePsy pilot study is a great first step toward a compact, inexpensive, easy-to-use diagnostic tool."

The FacePsy pilot study's findings will be presented at the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI) in Australia in early October.
[2]
Smartphone-based AI systems track subtle facial, pupil signals to identify depression
Stevens Institute of Technology | Sep 30 2024

It has been estimated that nearly 300 million people, or about 4% of the global population, are afflicted by some form of depression. But detecting it can be difficult, particularly when those affected don't (or won't) report negative feelings to friends, family or clinicians.

Now Stevens professor Sang Won Bae is working on several AI-powered smartphone applications and systems that could non-invasively warn us, and others, that we may be becoming depressed.

"Depression is a major challenge," says Bae. "We want to help. And since most people in the world today use smartphones daily, this could be a useful detection tool that's already built and ready to be used."

Snapshot images of the eyes, revealing mood

One system Bae is developing with Stevens doctoral candidate Rahul Islam, called PupilSense, works by constantly taking snapshots and measurements of a smartphone user's pupils.

"Previous research over the past three decades has repeatedly demonstrated how pupillary reflexes and responses can be correlated to depressive episodes," she explains.

The system accurately calculates pupils' diameters, as compared to the surrounding irises of the eyes, from 10-second "burst" photo streams captured while users are opening their phones or accessing certain social media and other apps.

In one early test of the system with 25 volunteers over a four-week period, the system -- embedded on those volunteers' smartphones -- analyzed approximately 16,000 interactions with phones once pupil-image data were collected. After teaching an AI to differentiate between "normal" responses and abnormal ones, Bae and Islam processed the photo data and compared it with the volunteers' self-reported moods.

The best iteration of PupilSense -- one known as TSF, which uses only selected, high-quality data points -- proved 76% accurate at flagging times when people did indeed feel depressed. That's better than the best smartphone-based system currently being developed and tested for detecting depression, a platform known as AWARE.

"We will continue to develop this technology now that the concept has been proven," adds Bae, who previously developed smartphone-based systems to predict binge drinking and cannabis use.

The system was first unveiled at the International Conference on Activity and Behavior Computing in Japan in late spring, and it is now available open-source on the GitHub platform.

Facial expressions also tip depression's hand

Bae and Islam are also developing a second system known as FacePsy that powerfully parses facial expressions for insight into our moods.

"A growing body of psychological studies suggests that depression is characterized by nonverbal signals such as facial muscle movements and head gestures," Bae points out.

FacePsy runs in the background of a phone, taking facial snapshots whenever a phone is opened or commonly used applications are opened. (Importantly, it deletes the facial images themselves almost immediately after analysis, protecting users' privacy.)

"We didn't know exactly which facial gestures or eye movements would correspond with self-reported depression when we started out," Bae explains. "Some of them were expected, and some of them were surprising."

Increased smiling, for instance, appeared in the pilot study to correlate not with happiness but with potential signs of a depressed mood and affect.

"This could be a coping mechanism, for instance people putting on a 'brave face' for themselves and for others when they are actually feeling down," says Bae. "Or it could be an artifact of the study. More research is needed."

Other apparent signals of depression revealed in the early data included fewer facial movements during the morning hours and certain very specific eye- and head-movement patterns. (Yawing, or side-to-side, movements of the head during the morning seemed to be strongly linked to increased depressive symptoms, for instance.)

Interestingly, eyes detected as more open during the morning and evening were also associated with potential depression, suggesting outward expressions of alertness or happiness can sometimes mask depressive feelings beneath.

"Other systems using AI to detect depression require the wearing of a device, or even multiple devices," Bae concludes. "We think this FacePsy pilot study is a great first step toward a compact, inexpensive, easy-to-use diagnostic tool."

The FacePsy pilot study's findings will be presented at the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI) in Australia in early October.

Source: Stevens Institute of Technology

Journal reference: FacePsy: An Open-Source Affective Mobile Sensing System - Analyzing Facial Behavior and Head Gesture for Depression Detection in Naturalistic Settings. Proceedings of the ACM on Human-Computer Interaction.
[3]
AI-powered apps show potential for detecting depression through eye snapshots
It has been estimated that nearly 300 million people, or about 4% of the global population, are afflicted by some form of depression. But detecting it can be difficult, particularly when those affected don't (or won't) report negative feelings to friends, family or clinicians.

Now Stevens professor Sang Won Bae is working on several AI-powered smartphone applications and systems that could non-invasively warn us, and others, that we may be becoming depressed.

"Depression is a major challenge," says Bae. "We want to help. And since most people in the world today use smartphones daily, this could be a useful detection tool that's already built and ready to be used."

One system Bae is developing with Stevens doctoral candidate Rahul Islam, called PupilSense, works by constantly taking snapshots and measurements of a smartphone user's pupils. "Previous research over the past three decades has repeatedly demonstrated how pupillary reflexes and responses can be correlated to depressive episodes," she explains. The findings are published in the journal Proceedings of the ACM on Human-Computer Interaction.

The system accurately calculates pupils' diameters, as compared to the surrounding irises of the eyes, from 10-second "burst" photo streams captured while users are opening their phones or accessing certain social media and other apps.

In one early test of the system with 25 volunteers over a four-week period, the system -- embedded on those volunteers' smartphones -- analyzed approximately 16,000 interactions with phones once pupil-image data were collected. After teaching an AI to differentiate between "normal" responses and abnormal ones, Bae and Islam processed the photo data and compared it with the volunteers' self-reported moods.

The best iteration of PupilSense -- one known as TSF, which uses only selected, high-quality data points -- proved 76% accurate at flagging times when people did indeed feel depressed. That's better than the best smartphone-based system currently being developed and tested for detection of depression, a platform known as AWARE.

"We will continue to develop this technology now that the concept has been proven," adds Bae, who previously developed smartphone-based systems to predict binge drinking and cannabis use.

The system was first unveiled at the International Conference on Activity and Behavior Computing in Japan in late spring, and it is now available open-source on the GitHub platform.

Bae and Islam are also developing a second system known as FacePsy that powerfully parses facial expressions for insight into our moods. "A growing body of psychological studies suggests that depression is characterized by nonverbal signals such as facial muscle movements and head gestures," Bae points out.

FacePsy runs in the background of a phone, taking facial snapshots whenever a phone is opened or commonly used applications are opened. (Importantly, it deletes the facial images themselves almost immediately after analysis, protecting users' privacy.)

"We didn't know exactly which facial gestures or eye movements would correspond with self-reported depression when we started out," Bae explains. "Some of them were expected, and some of them were surprising."

Increased smiling, for instance, appeared in the pilot study to correlate not with happiness, but with potential signs of a depressed mood and affect.

"This could be a coping mechanism, for instance, people putting on a 'brave face' for themselves and for others when they are actually feeling down," says Bae. "Or it could be an artifact of the study. More research is needed."

Other apparent signals of depression revealed in the early data included fewer facial movements during the morning hours and certain very specific eye- and head-movement patterns. (Yawing, or side-to-side, movements of the head during the morning seemed to be strongly linked to increased depressive symptoms, for instance.)

Interestingly, eyes detected as more open during the morning and evening were also associated with potential depression, suggesting outward expressions of alertness or happiness can sometimes mask depressive feelings beneath.

"Other systems using AI to detect depression require the wearing of a device, or even multiple devices," Bae concludes. "We think this FacePsy pilot study is a great first step toward a compact, inexpensive, easy-to-use diagnostic tool."

The FacePsy pilot study's findings will be presented at the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI) in Australia in early October.
[4]
AI Detects Depression Through Eyes and Facial Cues - Neuroscience News
Summary: Researchers are developing AI-driven smartphone applications to detect signs of depression non-invasively. One system, PupilSense, monitors pupillary reflexes to identify potential depressive episodes with 76% accuracy. Another tool, FacePsy, analyzes facial expressions and head movements to detect subtle mood shifts, with unexpected findings like increased smiling potentially linked to depression. These tools offer a privacy-protective, accessible way to identify depression early, leveraging everyday smartphone use.

It has been estimated that nearly 300 million people, or about 4% of the global population, are afflicted by some form of depression. But detecting it can be difficult, particularly when those affected don't (or won't) report negative feelings to friends, family or clinicians.

Now Stevens professor Sang Won Bae is working on several AI-powered smartphone applications and systems that could non-invasively warn us, and others, that we may be becoming depressed.

"Depression is a major challenge," says Bae. "We want to help. And since most people in the world today use smartphones daily, this could be a useful detection tool that's already built and ready to be used."

Snapshot images of the eyes, revealing mood

One system Bae is developing with Stevens doctoral candidate Rahul Islam, called PupilSense, works by constantly taking snapshots and measurements of a smartphone user's pupils.

"Previous research over the past three decades has repeatedly demonstrated how pupillary reflexes and responses can be correlated to depressive episodes," she explains.

The system accurately calculates pupils' diameters, as compared to the surrounding irises of the eyes, from 10-second "burst" photo streams captured while users are opening their phones or accessing certain social media and other apps.

In one early test of the system with 25 volunteers over a four-week period, the system -- embedded on those volunteers' smartphones -- analyzed approximately 16,000 interactions with phones once pupil-image data were collected. After teaching an AI to differentiate between "normal" responses and abnormal ones, Bae and Islam processed the photo data and compared it with the volunteers' self-reported moods.

The best iteration of PupilSense -- one known as TSF, which uses only selected, high-quality data points -- proved 76% accurate at flagging times when people did indeed feel depressed. That's better than the best smartphone-based system currently being developed and tested for detecting depression, a platform known as AWARE.

"We will continue to develop this technology now that the concept has been proven," adds Bae, who previously developed smartphone-based systems to predict binge drinking and cannabis use.

The system was first unveiled at the International Conference on Activity and Behavior Computing in Japan in late spring, and it is now available open-source on the GitHub platform.

Facial expressions also tip depression's hand

Bae and Islam are also developing a second system known as FacePsy that powerfully parses facial expressions for insight into our moods.

"A growing body of psychological studies suggests that depression is characterized by nonverbal signals such as facial muscle movements and head gestures," Bae points out.

FacePsy runs in the background of a phone, taking facial snapshots whenever a phone is opened or commonly used applications are opened. (Importantly, it deletes the facial images themselves almost immediately after analysis, protecting users' privacy.)

"We didn't know exactly which facial gestures or eye movements would correspond with self-reported depression when we started out," Bae explains. "Some of them were expected, and some of them were surprising."

Increased smiling, for instance, appeared in the pilot study to correlate not with happiness but with potential signs of a depressed mood and affect.

"This could be a coping mechanism, for instance people putting on a 'brave face' for themselves and for others when they are actually feeling down," says Bae. "Or it could be an artifact of the study. More research is needed."

Other apparent signals of depression revealed in the early data included fewer facial movements during the morning hours and certain very specific eye- and head-movement patterns. (Yawing, or side-to-side, movements of the head during the morning seemed to be strongly linked to increased depressive symptoms, for instance.)

Interestingly, eyes detected as more open during the morning and evening were also associated with potential depression, suggesting outward expressions of alertness or happiness can sometimes mask depressive feelings beneath.

"Other systems using AI to detect depression require the wearing of a device, or even multiple devices," Bae concludes. "We think this FacePsy pilot study is a great first step toward a compact, inexpensive, easy-to-use diagnostic tool."

The FacePsy pilot study's findings will be presented at the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI) in Australia in early October.

Author: Kara Panzer
Source: Stevens Institute of Technology
Contact: Kara Panzer - Stevens Institute of Technology
Image: The image is credited to Neuroscience News

Original Research: Open access. "FacePsy: An Open-Source Affective Mobile Sensing System - Analyzing Facial Behavior and Head Gesture for Depression Detection in Naturalistic Settings" by Sang Won Bae et al. Proceedings of the ACM on Human-Computer Interaction

Abstract

FacePsy: An Open-Source Affective Mobile Sensing System - Analyzing Facial Behavior and Head Gesture for Depression Detection in Naturalistic Settings

Depression, a prevalent and complex mental health issue affecting millions worldwide, presents significant challenges for detection and monitoring. While facial expressions have shown promise in laboratory settings for identifying depression, their potential in real-world applications remains largely unexplored due to the difficulties in developing efficient mobile systems. In this study, we aim to introduce FacePsy, an open-source mobile sensing system designed to capture affective inferences by analyzing sophisticated features and generating real-time data on facial behavior landmarks, eye movements, and head gestures, all within the naturalistic context of smartphone usage with 25 participants. Through rigorous development, testing, and optimization, we identified eye-open states, head gestures, smile expressions, and specific Action Units (2, 6, 7, 12, 15, and 17) as significant indicators of depressive episodes (AUROC = 81%). Our regression model predicting PHQ-9 scores achieved moderate accuracy, with a Mean Absolute Error of 3.08. Our findings offer valuable insights and implications for enhancing deployable and usable mobile affective sensing systems, ultimately improving mental health monitoring, prediction, and just-in-time adaptive interventions for researchers and developers in healthcare.
Researchers have developed AI-powered smartphone applications that can detect signs of depression by analyzing subtle facial expressions and pupil responses. This breakthrough technology could revolutionize mental health screening and early intervention.
In a groundbreaking development, researchers have created artificial intelligence (AI) systems that can detect signs of depression by analyzing subtle facial expressions and pupil responses using smartphone cameras. This innovative approach could revolutionize mental health screening and early intervention strategies. [1]
The AI systems focus on two key indicators: facial behavior and pupil responses. FacePsy tracks facial muscle movements, smile expressions, eye-open states, head gestures and specific facial Action Units, brief and often involuntary signals that can hint at a person's underlying mood even when they may be hard for a human observer to notice. [2]
Pupil responses, on the other hand, are analyzed by PupilSense, which measures pupil diameter relative to the surrounding iris from short photo bursts. Three decades of research have repeatedly linked pupillary reflexes and responses to depressive episodes. [3]
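Neither article spells out how a per-session pupil measurement becomes a warning signal. One plausible, purely illustrative approach is to compare each session's ratio against the user's own recent baseline; the hypothetical function, window size, and threshold below are invented for the sketch and are not part of PupilSense.

```python
# Purely illustrative: flag sensing sessions whose pupil/iris ratio deviates
# strongly from the user's own rolling baseline. All thresholds are invented.
import numpy as np

def flag_sessions(ratios: list[float], window: int = 20,
                  z_thresh: float = 2.0) -> list[bool]:
    """Per-session flag: True when a ratio is an outlier vs. recent history."""
    flags = []
    for i, r in enumerate(ratios):
        history = ratios[max(0, i - window):i]
        if len(history) < 5:  # not enough personal baseline yet
            flags.append(False)
            continue
        mu, sigma = np.mean(history), np.std(history)
        flags.append(bool(sigma > 0 and abs(r - mu) / sigma > z_thresh))
    return flags
```

A personal baseline matters because absolute pupil size varies widely between individuals; what carries signal is a sustained shift in one person's own responses.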
The development of smartphone applications utilizing this AI technology marks a significant step forward in accessible mental health screening. These apps can potentially provide a non-invasive, cost-effective method for early detection of depression symptoms. [4]
Sang Won Bae, the Stevens professor leading the work, frames the goal as meeting people where they already are: "Since most people in the world today use smartphones daily, this could be a useful detection tool that's already built and ready to be used." [1]
Initial studies have shown promising results: the best iteration of PupilSense flagged periods of self-reported depression with 76% accuracy, and the FacePsy pilot detected depressive episodes with an AUROC of 81%. However, researchers emphasize that these tools are not intended to replace professional diagnosis but rather to serve as an initial screening method. [2]
As with any technology involving personal data, there are important ethical considerations and privacy concerns to address. FacePsy, for example, deletes the facial images themselves almost immediately after analysis, retaining only the derived behavioral features. [3]
The potential applications of this sensing approach extend beyond depression screening: Bae has previously developed smartphone-based systems to predict binge drinking and cannabis use, and similar passive-sensing techniques could be integrated with other health monitoring systems to provide a more comprehensive approach to mental health care. [4]
As this technology continues to evolve, it holds the promise of transforming mental health care by enabling earlier intervention and more personalized treatment approaches. However, further research and clinical trials are needed to fully validate its effectiveness and ensure its safe implementation in real-world settings.
References

[1] When detecting depression, the eyes have it - Stevens Institute of Technology
[2] Smartphone-based AI systems track subtle facial, pupil signals to identify depression - Stevens Institute of Technology
[3] AI-powered apps show potential for detecting depression through eye snapshots - Medical Xpress - Medical and Health News
[4] AI Detects Depression Through Eyes and Facial Cues - Neuroscience News
Researchers at Kaunas University of Technology have developed an AI model that combines speech and brain neural activity data to diagnose depression with high accuracy, potentially revolutionizing mental health diagnostics.
3 Sources
Researchers develop HOPE, an AI model using Wi-Fi-based motion sensors to detect depression in older adults, offering a non-intrusive alternative to traditional methods and wearable devices.
2 Sources
Researchers at the University of Illinois Chicago are exploring how artificial intelligence and digital tools can personalize depression diagnosis and treatment, potentially transforming mental health care.
2 Sources
A study by Vanderbilt University Medical Center demonstrates that AI-driven alerts can effectively help doctors identify patients at risk of suicide, potentially enhancing prevention efforts in routine medical settings.
3 Sources
AI-powered mental health tools are attracting significant investment as they promise to address therapist shortages, reduce burnout, and improve access to care. However, questions remain about AI's ability to replicate human empathy in therapy.
2 Sources