3 Sources
[1]
Novel deep learning model leverages real-time data to assist in diagnosing nystagmus
Florida Atlantic University | June 4, 2025

Artificial intelligence is playing an increasingly vital role in modern medicine, particularly in interpreting medical images to help clinicians assess disease severity, guide treatment decisions and monitor disease progression. Despite these advancements, most current AI models are based on static datasets, limiting their adaptability and real-time diagnostic potential. To address this gap, researchers from Florida Atlantic University and collaborators have developed a novel proof-of-concept deep learning model that leverages real-time data to assist in diagnosing nystagmus - a condition characterized by involuntary, rhythmic eye movements often linked to vestibular or neurological disorders.

Gold-standard diagnostic tools such as videonystagmography (VNG) and electronystagmography have long been used to detect nystagmus. However, these methods come with notable drawbacks: high costs (VNG equipment often exceeds $100,000), bulky setups, and inconvenience for patients during testing. FAU's AI-driven system offers a cost-effective, patient-friendly alternative for quick and reliable screening of balance disorders and abnormal eye movements. The platform allows patients to record their eye movements using a smartphone, securely upload the video to a cloud-based system, and receive remote diagnostic analysis from vestibular and balance experts - all without leaving their home.

At the heart of this innovation is a deep learning framework that uses real-time facial landmark tracking to analyze eye movements. The AI system automatically maps 468 facial landmarks and evaluates slow-phase velocity - a key metric for identifying nystagmus intensity, duration and direction. It then generates intuitive graphs and reports that can easily be interpreted by audiologists and other clinicians during virtual consultations.

Results of the pilot study involving 20 participants, published in Cureus (part of Springer Nature), demonstrated that the AI system's assessments closely mirrored those obtained through traditional medical devices. This early success underscores the model's accuracy and potential for clinical reliability, even in its initial stages.

"Our AI model offers a promising tool that can partially supplement - or, in some cases, replace - conventional diagnostic methods, especially in telehealth environments where access to specialized care is limited," said Ali Danesh, Ph.D., principal investigator of the study, senior author, professor in the Department of Communication Sciences and Disorders within FAU's College of Education and professor of biomedical science within FAU's Charles E. Schmidt College of Medicine. "By integrating deep learning, cloud computing and telemedicine, we're making diagnosis more flexible, affordable and accessible - particularly for low-income rural and remote communities."

The team trained their algorithm on more than 15,000 video frames, using a structured 70:20:10 split for training, testing and validation. This rigorous approach ensured the model's robustness and adaptability across varied patient populations. The AI also employs intelligent filtering to eliminate artifacts such as eye blinks, ensuring accurate and consistent readings.

Beyond diagnostics, the system is designed to streamline clinical workflows. Physicians and audiologists can access AI-generated reports via telehealth platforms, compare them with patients' electronic health records, and develop personalized treatment plans. Patients, in turn, benefit from reduced travel, lower costs and the convenience of conducting follow-up assessments by simply uploading new videos from home - enabling clinicians to track disorder progression over time.

In parallel, FAU researchers are also experimenting with a wearable headset equipped with deep learning capabilities to detect nystagmus in real time. Early tests in controlled environments have shown promise, though improvements are still needed to address challenges such as sensor noise and variability among individual users.

"While still in its early stages, our technology holds the potential to transform care for patients with vestibular and neurological disorders," said Harshal Sanghvi, Ph.D., first author, an FAU electrical engineering and computer science graduate, and a postdoctoral fellow at FAU's College of Medicine and College of Business. "With its ability to provide non-invasive, real-time analysis, our platform could be deployed widely - in clinics, emergency rooms, audiology centers and even at home."

Sanghvi worked closely with his mentors and co-authors on this project, including Abhijit S. Pandya, Ph.D., FAU Department of Electrical Engineering and Computer Science and FAU Department of Biomedical Engineering, and B. Sue Graves, Ed.D., Department of Exercise Science and Health Promotion, FAU Charles E. Schmidt College of Science. This interdisciplinary initiative includes collaborators from FAU's College of Business, College of Medicine, College of Engineering and Computer Science, College of Science, and partners from Advanced Research, Marcus Neuroscience Institute - part of Baptist Health - at Boca Raton Regional Hospital, Loma Linda University Medical Center, and Broward Health North. Together, they are working to enhance the model's accuracy, expand testing across diverse patient populations, and move toward FDA approval for broader clinical adoption.

"As telemedicine becomes an increasingly integral part of health care delivery, AI-powered diagnostic tools like this one are poised to improve early detection, streamline specialist referrals, and reduce the burden on health care providers," said Danesh. "Ultimately, this innovation promises better outcomes for patients - regardless of where they live."

Along with Pandya and Graves, study co-authors are Jilene Moxam, Advanced Research LLC; Sandeep K. Reddy, Ph.D., FAU College of Engineering and Computer Science; Gurnoor S. Gill, FAU College of Medicine; Sajeel A. Chowdhary, M.D., Marcus Neuroscience Institute - part of Baptist Health - at Boca Raton Regional Hospital; Kakarla Chalam, M.D., Ph.D., Loma Linda University; and Shailesh Gupta, M.D., Broward Health North.

Journal reference: Sanghvi, H., et al. (2025). Artificial Intelligence-Driven Telehealth Framework for Detecting Nystagmus. Cureus. https://doi.org/10.7759/cureus.84036.
[2]
'Eye' on health: AI detects dizziness and balance disorders remotely
[3]
'Eye' on Health: AI Detects Dizziness and Balance Disorders Remotely | Newswise
Florida Atlantic University researchers develop a novel deep learning model that uses real-time data from smartphone videos to diagnose nystagmus, offering a cost-effective and accessible alternative to traditional diagnostic methods.
Researchers from Florida Atlantic University (FAU) have developed a proof-of-concept deep learning model that uses real-time data to assist in diagnosing nystagmus, a condition characterized by involuntary, rhythmic eye movements often associated with vestibular or neurological disorders [1][2][3]. This approach addresses a key limitation of current AI models built on static datasets, offering improved adaptability and real-time diagnostic potential.
Traditional diagnostic tools for nystagmus, such as videonystagmography (VNG) and electronystagmography, come with significant drawbacks, including high costs and inconvenience for patients. The new AI-driven system developed by FAU provides a cost-effective and patient-friendly alternative [1][2][3]. This innovation allows patients to record their eye movements using a smartphone, upload the video to a cloud-based system, and receive remote diagnostic analysis from experts without leaving their homes.
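The record-and-upload loop can be pictured as a thin client posting a video to a cloud endpoint and later fetching the generated report. The sketch below is a minimal illustration of that flow only; the endpoint URL, the bearer-token authentication, and the response fields are hypothetical and not part of FAU's published system.

```python
import requests

# Hypothetical cloud endpoint; the actual FAU service is not publicly documented.
UPLOAD_URL = "https://vestibular-telehealth.example.org/api/v1/eye-videos"

def upload_recording(video_path: str, patient_token: str) -> str:
    """Upload a smartphone eye-movement video over HTTPS and return a report ID."""
    with open(video_path, "rb") as f:
        resp = requests.post(
            UPLOAD_URL,
            files={"video": ("recording.mp4", f, "video/mp4")},
            headers={"Authorization": f"Bearer {patient_token}"},  # auth scheme is an assumption
            timeout=120,
        )
    resp.raise_for_status()
    return resp.json()["report_id"]  # response schema is an assumption

def fetch_report(report_id: str, patient_token: str) -> dict:
    """Poll the same service for the clinician-reviewed analysis report."""
    resp = requests.get(
        f"{UPLOAD_URL}/{report_id}/report",
        headers={"Authorization": f"Bearer {patient_token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```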
At the core of this innovation is a deep learning framework that employs real-time facial landmark tracking to analyze eye movements [1][2][3]. The AI system automatically maps 468 facial landmarks and evaluates slow-phase velocity, a crucial metric for identifying nystagmus intensity, duration, and direction. It then generates intuitive graphs and reports that can be easily interpreted by clinicians during virtual consultations.
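The 468-point mesh mentioned here matches the output of Google's MediaPipe Face Mesh, so the sketch below uses that library as a plausible stand-in. The specific landmark indices, the eye-openness blink gate (the sources note that blink artifacts are filtered out), and the fast-phase velocity threshold are illustrative assumptions rather than the published pipeline; positions stay in normalized image units rather than calibrated degrees.

```python
import cv2
import mediapipe as mp
import numpy as np

# Iris and lid landmark indices from MediaPipe's refined (478-point) mesh.
# Index choices are illustrative; the published pipeline may track different points.
IRIS = [468, 469, 470, 471, 472]
LID_TOP, LID_BOTTOM = 159, 145
CORNER_L, CORNER_R = 33, 133

def iris_trace(video_path):
    """Per-frame horizontal iris position (normalized units) plus a crude blink flag."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    xs, blinks = [], []
    with mp.solutions.face_mesh.FaceMesh(refine_landmarks=True, max_num_faces=1) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            res = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not res.multi_face_landmarks:
                xs.append(np.nan)
                blinks.append(True)
                continue
            lm = res.multi_face_landmarks[0].landmark
            # Eye-openness ratio as a simple blink gate; 0.12 is an assumed threshold.
            openness = abs(lm[LID_TOP].y - lm[LID_BOTTOM].y) / (
                abs(lm[CORNER_L].x - lm[CORNER_R].x) + 1e-6)
            blinks.append(openness < 0.12)
            xs.append(np.mean([lm[i].x for i in IRIS]))
    cap.release()
    return np.array(xs), np.array(blinks), fps

def slow_phase_velocity(xs, blinks, fps, fast_thresh=2.0):
    """Mean speed of the slow (non-saccadic, non-blink) segments of the trace."""
    v = np.gradient(xs) * fps                   # normalized units per second
    slow = ~blinks & (np.abs(v) < fast_thresh)  # drop blinks and fast phases
    return float(np.nanmean(np.abs(v[slow])))
```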
A pilot study involving 20 participants demonstrated that the AI system's assessments closely mirrored those obtained through traditional medical devices [1][2][3]. The research team, led by Dr. Ali Danesh and Dr. Harshal Sanghvi, trained the algorithm on over 15,000 video frames using a 70:20:10 split for training, testing and validation, ensuring robustness and adaptability across varied patient populations.
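A frame-level 70:20:10 split like the one described can be expressed in a few lines; the sketch below simply partitions shuffled frame indices and is not the team's actual data pipeline. In practice one would likely group frames by participant so the same person's video never spans two splits.

```python
import numpy as np

def split_70_20_10(n_frames: int, seed: int = 0):
    """Shuffle frame indices and cut them 70% train / 20% test / 10% validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_frames)
    n_train = int(0.7 * n_frames)
    n_test = int(0.2 * n_frames)
    return idx[:n_train], idx[n_train:n_train + n_test], idx[n_train + n_test:]

train_idx, test_idx, val_idx = split_70_20_10(15_000)
print(len(train_idx), len(test_idx), len(val_idx))  # 10500 3000 1500
```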
Beyond diagnostics, the system is designed to enhance clinical workflows [1][2][3]. Physicians and audiologists can access AI-generated reports via telehealth platforms, compare them with patients' electronic health records, and develop personalized treatment plans. This approach offers benefits such as reduced travel, lower costs, and convenient follow-up assessments for patients.
The project involves collaborators from various FAU colleges and external partners, including the Marcus Neuroscience Institute and Loma Linda University Medical Center [1][2][3]. The team is working to enhance the model's accuracy, expand testing across diverse patient populations, and pursue FDA approval for broader clinical adoption.
As telemedicine becomes increasingly integral to healthcare delivery, AI-powered diagnostic tools like this one have the potential to improve early detection, streamline specialist referrals, and reduce the burden on healthcare providers [1][2][3]. Dr. Danesh emphasized that this innovation promises better outcomes for patients regardless of their location, particularly benefiting low-income rural and remote communities.
In parallel with the smartphone-based system, FAU researchers are also experimenting with a wearable headset equipped with deep learning capabilities to detect nystagmus in real time [1][2][3]. While early tests in controlled environments have shown promise, further improvements are needed to address challenges such as sensor noise and variability among individual users.
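Sensor noise of the kind the headset experiments face is often tamed with a causal smoothing filter applied sample-by-sample. The exponential moving average below is a generic illustration of that idea, not the filtering FAU's prototype actually uses.

```python
import numpy as np

def ema_smooth(samples, alpha=0.2):
    """Causal exponential moving average, usable on a live headset gaze stream."""
    out = np.empty(len(samples), dtype=float)
    acc = float(samples[0])
    for i, s in enumerate(samples):
        acc = alpha * float(s) + (1 - alpha) * acc  # new estimate leans on recent history
        out[i] = acc
    return out

# Example: a noisy 100 Hz horizontal gaze trace (synthetic, for illustration only).
t = np.linspace(0, 2, 200)
noisy = np.sin(2 * np.pi * 2 * t) + np.random.normal(scale=0.15, size=t.size)
smoothed = ema_smooth(noisy)
```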