2 Sources
[1]
AI tool can interpret echocardiograms in minutes
Cardiologists use echocardiography to diagnose a range of functional or structural abnormalities of the heart. Using more than 100 videos and images that capture different parts of the heart, echocardiographers make dozens of measurements, such as the heart's size and shape, ventricle thickness, and the movement and function of each heart chamber, to assess patient heart health. A new study published in JAMA led by Yale School of Medicine (YSM) researchers finds that an artificial intelligence (AI)-enabled tool can interpret echocardiograms with a high degree of accuracy in just a few minutes.

"Echocardiography is a cornerstone of cardiovascular care, but it requires a tremendous amount of clinical time from highly skilled readers to review these studies," says Rohan Khera, MD, MS, assistant professor of medicine (cardiovascular medicine) at YSM and of biostatistics (health informatics) at the Yale School of Public Health. Khera is the paper's senior author and director of the Cardiovascular Data Science Lab (CarDS). "We wanted to develop a technology that can assist these very busy echocardiographers to help improve accuracy and accelerate their workflow."

The researchers found that the AI tool, PanEcho, could perform 39 diagnostic tasks based on multi-view echocardiography, accurately detecting conditions such as severe aortic stenosis and systolic dysfunction and estimating measures such as left ventricular ejection fraction, among other tasks. This study builds on previous publications, including a 2023 paper in the European Heart Journal, that demonstrated the technology's accuracy.

Greg Holste, MSE, a Ph.D. student at UT Austin who is co-advised by Khera and is co-first author of the study, says, "We developed a tool that integrates information from many views of the heart to automatically identify the key measurements and abnormalities that a cardiologist would include in a complete report."

PanEcho was developed using 999,727 echocardiographic videos collected from Yale New Haven Health patients between January 2016 and June 2022. Researchers then validated the tool using studies from 5,130 Yale New Haven Health patients as well as three external data cohorts from the Heart and Vascular Center of Semmelweis University in Budapest, Hungary; Stanford University Hospital; and Stanford Health Care.

"The tool can now measure and assess a wide range of heart conditions, making it much more attractive for future clinical use," says Evangelos K. Oikonomou, MD, DPhil, clinical fellow (cardiovascular medicine) and co-first author of the study. "While it is highly accurate, it can be less interpretable than the read from a clinician. It's still an algorithm and it requires human oversight."

Potential roles for AI in interpreting echocardiography

While PanEcho is not yet available for clinical use, the paper discusses several potential future clinical applications of the technology. For instance, echocardiographers could use the tool as a preliminary reader to help assess images and videos in the echocardiography lab. It could also serve as a second set of eyes to help identify potentially missed abnormalities in existing databases.

The researchers also note that this technology could be particularly valuable in low-resource settings, where access to equipment and skilled echocardiographers is limited. In these environments, clinicians often rely on handheld, point-of-care ultrasound devices, which produce lower-quality imaging that can be more challenging to interpret.

To validate the model's accuracy with point-of-care ultrasounds, the researchers used imaging from the Yale New Haven Hospital emergency department, which performs point-of-care ultrasounds as part of routine care. "We replicated the experience of low-resource settings across the world, where clinicians typically use a handheld ultrasound and wait for those images to be interpreted by a cardiologist elsewhere," says Khera. "Even with lower-quality images, our model was very resilient and acquired the information needed to make a highly accurate determination."

Ongoing research to assess effectiveness of AI tools

Khera and his colleagues are now conducting studies to assess how using the tool might change patient care in the echocardiography laboratory at Yale. "We are learning much more about how clinicians use the tool in a real-world setting, including modifications to their workflow, their responses to the information, and the value, if any, that this tool adds in a clinical context," says Khera.

"AI tools like the one validated in this study have the potential to help us increase our efficiency and accuracy, ultimately allowing us to screen and treat a larger number of patients with cardiovascular conditions," says Eric J. Velazquez, MD, Robert W. Berliner Professor of Medicine (cardiovascular medicine) and chief of Yale Cardiovascular Medicine. "I'm proud of Yale's continued commitment to investing in cutting-edge research to help us innovate new ways to deliver care."

The full model and weights are available via open source, and the research team is encouraging other investigators to test the model using their own echocardiographic studies and make improvements. Additional study authors include Zhangyang Wang, Ph.D., of the University of Texas at Austin, and Márton Tokodi, MD, Ph.D., and Attila Kovács, MD, Ph.D., both of Semmelweis University.
[2]
AI Tool Interprets Echocardiograms in Minutes, New Yale Study Finds | Newswise
A new AI-enabled tool called PanEcho, developed by Yale School of Medicine researchers, can interpret echocardiograms with high accuracy in minutes, potentially revolutionizing cardiovascular care and improving efficiency in both high-resource and low-resource settings.
Researchers at Yale School of Medicine have developed an artificial intelligence (AI) tool called PanEcho that can interpret echocardiograms with high accuracy in just minutes. This groundbreaking technology, detailed in a recent study published in JAMA, has the potential to revolutionize cardiovascular care by significantly reducing the time required for skilled echocardiographers to analyze complex heart imaging [1].
PanEcho is capable of performing 39 diagnostic tasks based on multi-view echocardiography. It can accurately detect various heart conditions, including severe aortic stenosis and systolic dysfunction, and estimate measures such as left ventricular ejection fraction. The AI tool integrates information from numerous views of the heart to automatically identify key measurements and abnormalities that a cardiologist would typically include in a complete report [2].
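That description of PanEcho (many echocardiographic views feeding 39 diagnostic tasks that mix abnormality detection and continuous measurements) follows a common multi-view, multi-task pattern in medical video modeling. The sketch below is only a minimal illustration of that general pattern, not PanEcho's published architecture; the class name, embedding size, split between classification and regression tasks, and attention-pooling choice are all assumptions.

```python
# Illustrative sketch only -- NOT the actual PanEcho architecture.
# Assumes each echo study arrives as a set of view-level clip embeddings;
# the task counts and dimensions below are hypothetical.
import torch
import torch.nn as nn


class MultiViewMultiTaskEcho(nn.Module):
    def __init__(self, embed_dim=512, n_classification_tasks=30, n_regression_tasks=9):
        super().__init__()
        # Attention scores used to pool a variable number of views per study.
        self.view_attention = nn.Sequential(
            nn.Linear(embed_dim, 128), nn.Tanh(), nn.Linear(128, 1)
        )
        # One head per diagnostic label (e.g., severe aortic stenosis).
        self.cls_heads = nn.ModuleList(
            [nn.Linear(embed_dim, 1) for _ in range(n_classification_tasks)]
        )
        # One head per continuous measurement (e.g., LV ejection fraction).
        self.reg_heads = nn.ModuleList(
            [nn.Linear(embed_dim, 1) for _ in range(n_regression_tasks)]
        )

    def forward(self, view_embeddings):
        # view_embeddings: (n_views, embed_dim) for one echocardiographic study.
        scores = self.view_attention(view_embeddings)              # (n_views, 1)
        weights = torch.softmax(scores, dim=0)                     # attention over views
        study_embedding = (weights * view_embeddings).sum(dim=0)   # (embed_dim,)
        cls_logits = torch.cat([h(study_embedding) for h in self.cls_heads])
        reg_values = torch.cat([h(study_embedding) for h in self.reg_heads])
        return cls_logits, reg_values


# Toy usage: 12 views, each already reduced to a 512-dim clip embedding.
model = MultiViewMultiTaskEcho()
views = torch.randn(12, 512)
logits, measurements = model(views)
print(logits.shape, measurements.shape)  # torch.Size([30]) torch.Size([9])
```

In a design like this, attention pooling lets a study contribute however many views were actually acquired, which matters because real echocardiographic studies vary in the number and quality of views.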
The development of PanEcho involved an extensive dataset of 999,727 echocardiographic videos collected from Yale New Haven Health patients between January 2016 and June 2022. To ensure its reliability, the researchers validated the tool using studies from 5,130 Yale New Haven Health patients and three external data cohorts from institutions in Hungary and California [1].
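Validation across an internal test set and external cohorts is typically summarized with task-level metrics computed per site, for example AUROC for a diagnosis label and mean absolute error for a continuous measurement. The sketch below is purely illustrative and uses randomly generated placeholder data; the cohort names, task names, and resulting numbers have no connection to the study's actual results.

```python
# Hypothetical validation sketch -- cohort names, labels, and predictions are
# random placeholders, not data from the JAMA study.
import numpy as np
from sklearn.metrics import roc_auc_score, mean_absolute_error

rng = np.random.default_rng(0)

def fake_cohort(n):
    """Stand-in for one validation cohort: one binary label and one measurement."""
    return {
        "severe_as_label": rng.integers(0, 2, n),   # e.g., severe aortic stenosis yes/no
        "severe_as_score": rng.random(n),           # model probability for that label
        "lvef_true": rng.normal(55, 10, n),         # reference LV ejection fraction (%)
        "lvef_pred": rng.normal(55, 10, n),         # model-estimated LVEF (%)
    }

cohorts = {
    "internal_test": fake_cohort(500),
    "external_site_a": fake_cohort(300),
    "external_site_b": fake_cohort(300),
}

# Report discrimination and error separately for every cohort.
for name, data in cohorts.items():
    auc = roc_auc_score(data["severe_as_label"], data["severe_as_score"])
    mae = mean_absolute_error(data["lvef_true"], data["lvef_pred"])
    print(f"{name}: severe AS AUROC={auc:.2f}, LVEF MAE={mae:.1f} points")
```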
While PanEcho is not yet available for clinical use, the study outlines several potential future applications [2]:
- serving as a preliminary reader to help assess images and videos in the echocardiography lab;
- acting as a second set of eyes to identify potentially missed abnormalities in existing databases;
- supporting care in low-resource settings, where access to equipment and skilled echocardiographers is limited.
The researchers tested PanEcho's effectiveness with point-of-care ultrasounds, which are often used in low-resource environments. Using imaging from the Yale New Haven Hospital emergency department, they found that the model remained highly accurate even with lower-quality images, demonstrating its potential for global application [1].
The Yale team is now conducting studies to assess how PanEcho might impact patient care in real-world clinical settings. They are examining changes in workflow, clinician responses to the AI-generated information, and the overall value added to the clinical context [2].
In a move to encourage further development and improvement, the full model and weights of PanEcho have been made available via open source. The research team is actively inviting other investigators to test the model using their own echocardiographic studies and contribute to its enhancement [1].
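The article does not say how the open-source release is packaged. Assuming the weights ship as an ordinary PyTorch checkpoint, a first step for an outside investigator might be to inspect the released parameters before wiring up matching model code; the file name below is a placeholder, not the actual published artifact or interface.

```python
# Hypothetical sketch of inspecting a released PyTorch checkpoint before reuse.
# Assumes the file is a plain state_dict of named tensors; the filename is a
# placeholder and not the actual PanEcho distribution.
import torch

state = torch.load("panecho_weights.pt", map_location="cpu")

# A state_dict maps parameter names to tensors; listing a few entries is a quick
# way to see the encoder/head structure before defining matching model classes.
for name, tensor in list(state.items())[:10]:
    print(name, tuple(tensor.shape))
```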
As AI continues to make strides in medical imaging interpretation, tools like PanEcho represent a significant step forward in improving the efficiency and accuracy of cardiovascular care. While human oversight remains crucial, the potential for AI to assist in screening and treating a larger number of patients with heart conditions is promising for the future of cardiology.
Summarized by Navi