2 Sources
[1]
Artificial intelligence helps patients better comprehend medical findings
Technical University of Munich (TUM), Nov 21 2025

Medical reports written in technical terminology can pose challenges for patients. A team at the Technical University of Munich (TUM) has investigated how artificial intelligence can make CT findings easier to understand. In the study, reading time decreased, and patients rated the automatically simplified texts as more comprehensible and more helpful.

To simplify the original documents, the researchers used an open-source large language model operated in compliance with data protection regulations on the TUM University Hospital's computers. An example: "The cardiomediastinal silhouette is midline. The cardiac chambers are normally opacified. [...] A small pericardial effusion is noted" was simplified by the AI as follows: "Heart: The report notes a small amount of fluid around your heart. This is a common finding, and your doctor will determine if it needs any attention."

Medicine needs to use understandable language

From the researchers' perspective, making medical terminology accessible is more than a minor aid. "Ensuring that patients understand their reports, examinations, and treatments is a central pillar of modern medicine. This is the only way to guarantee informed consent and strengthen health literacy," says Felix Busch, assistant physician at the Institute for Diagnostic and Interventional Radiology and co-last author of the study, which was published in the journal "Radiology".

While previous research has shown that AI models can make specialist medical texts more comprehensible, little was known about their impact on actual patients. Therefore, the team included 200 patients who underwent CT imaging at the TUM University Hospital due to a cancer diagnosis. One half received the original report, while the other half received an automatically simplified version.

Reading time reduced, satisfaction high

The results were unambiguous: reading time fell from an average of seven minutes for the original reports to two minutes. Patients who received the simplified findings reported that they were much easier to read (81% compared with 17%) and easier to understand (80% compared with 9%). They also rated them as helpful (82% compared with 29%) and informative (82% compared with 27%) far more often. "Various objective measurements also confirmed the improved readability of the simplified reports," says Felix Busch.

Future studies are needed to determine whether these advantages translate into measurable improvements in patient health outcomes. From the researchers' perspective, however, the study clearly shows that patients can benefit from AI-supported simplification of medical reports by improving their understanding. "Providing automatically simplified reports as an additional service alongside the specialist report is conceivable. However, the prerequisite is the availability of optimized, secure AI solutions in the clinic," says Felix Busch.

Review by health professionals remains necessary

The team advises patients not to turn to a chatbot like ChatGPT as a stand-in doctor to simplify their report. "Aside from data protection concerns, language models always carry the risk of factual errors," says Dr. Philipp Prucker, first author of the study. In the investigation, 6% of the AI-generated findings contained factual inaccuracies, 7% omitted information, and 3% added new information. Before the reports were provided to patients, however, they were reviewed for errors and corrected if necessary. "Language models are useful tools, but they are no substitute for medical staff. Without trained specialists verifying the findings, patients may, in the worst case, receive incorrect information about their illness," Prucker concludes.

Source: Technical University of Munich (TUM)

Journal reference: Prucker et al. "A Prospective Controlled Trial of Large Language Model-based Simplification of Oncologic CT Reports for Patients with Cancer". Radiology (2025). DOI: 10.1148/radiol.251844.
[2]
AI Helps Cancer Patients Better Understand CT Reports | Newswise
Newswise -- Medical reports written in technical terminology can pose challenges for patients. A team at the Technical University of Munich (TUM) has investigated how artificial intelligence can make CT findings easier to understand. In the study, reading time decreased, and patients rated the automatically simplified texts as more comprehensible and more helpful.

To simplify the original documents, the researchers used an open-source large language model operated in compliance with data protection regulations on the TUM University Hospital's computers. An example: "The cardiomediastinal silhouette is midline. The cardiac chambers are normally opacified. [...] A small pericardial effusion is noted" was simplified by the AI as follows: "Heart: The report notes a small amount of fluid around your heart. This is a common finding, and your doctor will determine if it needs any attention."

From the researchers' perspective, making medical terminology accessible is more than a minor aid. "Ensuring that patients understand their reports, examinations, and treatments is a central pillar of modern medicine. This is the only way to guarantee informed consent and strengthen health literacy," says Felix Busch, assistant physician at the Institute for Diagnostic and Interventional Radiology and co-last author of the study, which was published in the journal "Radiology".

While previous research has shown that AI models can make specialist medical texts more comprehensible, little was known about their impact on actual patients. Therefore, the team included 200 patients who underwent CT imaging at the TUM University Hospital due to a cancer diagnosis. One half received the original report, while the other half received an automatically simplified version.

The results were unambiguous: reading time fell from an average of seven minutes for the original reports to two minutes. Patients who received the simplified findings reported that they were much easier to read (81% compared with 17%) and easier to understand (80% compared with 9%). They also rated them as helpful (82% compared with 29%) and informative (82% compared with 27%) far more often. "Various objective measurements also confirmed the improved readability of the simplified reports," says Felix Busch.

Future studies are needed to determine whether these advantages translate into measurable improvements in patient health outcomes. From the researchers' perspective, however, the study clearly shows that patients can benefit from AI-supported simplification of medical reports by improving their understanding. "Providing automatically simplified reports as an additional service alongside the specialist report is conceivable. However, the prerequisite is the availability of optimized, secure AI solutions in the clinic," says Felix Busch.

The team advises patients not to turn to a chatbot like ChatGPT as a stand-in doctor to simplify their report. "Aside from data protection concerns, language models always carry the risk of factual errors," says Dr. Philipp Prucker, first author of the study. In the investigation, 6% of the AI-generated findings contained factual inaccuracies, 7% omitted information, and 3% added new information. Before the reports were provided to patients, however, they were reviewed for errors and corrected if necessary. "Language models are useful tools, but they are no substitute for medical staff. Without trained specialists verifying the findings, patients may, in the worst case, receive incorrect information about their illness," Prucker concludes.
German researchers demonstrate how AI can translate complex medical terminology into patient-friendly language, reducing reading time from seven minutes to two while significantly improving understanding and satisfaction among cancer patients.

Researchers at the Technical University of Munich (TUM) have demonstrated how artificial intelligence can bridge the communication gap between medical professionals and patients by automatically simplifying complex medical reports. The study, published in the journal Radiology, involved 200 cancer patients and showed dramatic improvements in comprehension and satisfaction when patients received AI-simplified versions of their CT scan reports [1].
The results of the controlled trial were striking. Reading time for medical reports dropped from an average of seven minutes for original reports to just two minutes for the simplified versions. More importantly, patient comprehension improved dramatically across multiple metrics. Among patients who received simplified reports, 81% found them easier to read compared to only 17% who received original reports. Similarly, 80% of patients found the simplified versions easier to understand, compared to just 9% for the original technical reports [2].
Patients also rated the simplified reports as significantly more helpful (82% versus 29%) and more informative (82% versus 27%) than their technical counterparts. These improvements were confirmed through various objective readability measurements, demonstrating that the benefits extended beyond subjective patient preferences.
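Neither the press release nor the summary names the specific readability measures used. Purely as an illustration of what an "objective readability measurement" can look like, the following Python sketch scores the article's own example sentences with two standard formulas from the open-source textstat package; the choice of package and metrics is an assumption, not the study's documented method.

import textstat  # pip install textstat; illustrative, not the study's toolchain

original = ("The cardiomediastinal silhouette is midline. "
            "The cardiac chambers are normally opacified. "
            "A small pericardial effusion is noted.")
simplified = ("The report notes a small amount of fluid around your heart. "
              "This is a common finding, and your doctor will determine "
              "if it needs any attention.")

for label, text in [("original", original), ("simplified", simplified)]:
    # Higher Flesch Reading Ease means easier text; lower grade level means easier.
    ease = textstat.flesch_reading_ease(text)
    grade = textstat.flesch_kincaid_grade(text)
    print(f"{label}: reading ease {ease:.1f}, grade level {grade:.1f}")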
The research team utilized an open-source large language model that operated in compliance with data protection regulations on TUM University Hospital's secure computers. The AI system transformed dense medical jargon into accessible language that patients could easily understand. For example, the technical phrase "The cardiomediastinal silhouette is midline. The cardiac chambers are normally opacified. A small pericardial effusion is noted" was simplified to: "Heart: The report notes a small amount of fluid around your heart. This is a common finding, and your doctor will determine if it needs any attention" [1].
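The article does not disclose which open-source model or prompt the team used. As a minimal sketch of the general pattern it describes, the Python snippet below sends a report excerpt to a locally hosted, OpenAI-compatible inference server running inside the hospital network, so no report text leaves the institution; the endpoint URL, model name, and prompt wording are placeholders, not the study's actual configuration.

import requests

REPORT = ("The cardiomediastinal silhouette is midline. The cardiac chambers "
          "are normally opacified. A small pericardial effusion is noted.")

payload = {
    "model": "local-open-source-llm",  # placeholder name for a locally hosted model
    "messages": [
        {"role": "system",
         "content": ("Rewrite the radiology report below in plain, patient-friendly "
                     "language. Do not add, remove, or reinterpret any findings.")},
        {"role": "user", "content": REPORT},
    ],
    "temperature": 0.2,  # conservative sampling to limit embellishment
}

# The request stays on hospital infrastructure (e.g. a vLLM or Ollama server).
response = requests.post("http://localhost:8000/v1/chat/completions",
                         json=payload, timeout=60)
draft = response.json()["choices"][0]["message"]["content"]
print(draft)  # a draft only: a clinician must check it before it reaches a patient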
Despite the promising results, the study revealed important limitations that underscore the need for careful implementation. The AI-generated reports contained factual inaccuracies in 6% of cases, omitted important information in 7% of instances, and added new information not present in the original reports in 3% of cases. To address these concerns, all AI-simplified reports were reviewed and corrected by medical professionals before being provided to patients [2].
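To make that review step concrete, here is a small, hypothetical Python sketch of how a clinic might gate simplified drafts behind radiologist sign-off, using the three error categories reported in the study; the class and field names are invented for illustration and do not come from the paper.

from dataclasses import dataclass, field

ERROR_CATEGORIES = {"inaccuracy", "omission", "addition"}  # categories reported in the study

@dataclass
class SimplifiedReport:
    draft_text: str
    flags: set[str] = field(default_factory=set)  # unresolved reviewer flags
    reviewed_by: str | None = None                # radiologist who signed off

    def releasable(self) -> bool:
        # Release to the patient only after sign-off and with no open flags.
        return self.reviewed_by is not None and not self.flags

report = SimplifiedReport(draft_text="(simplified draft)")
report.flags.add("omission")            # reviewer notices missing information
report.flags.discard("omission")        # cleared after the draft is corrected
report.reviewed_by = "radiologist_042"  # hypothetical reviewer ID
print(report.releasable())              # True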
Dr. Philipp Prucker, the study's first author, emphasized that patients should not rely on general chatbots like ChatGPT for medical report interpretation. "Aside from data protection concerns, language models always carry the risk of factual errors," Prucker noted, stressing that "language models are useful tools, but they are no substitute for medical staff."
Felix Busch, assistant physician at the Institute for Diagnostic and Interventional Radiology and co-last author of the study, positioned this research within the broader context of patient-centered care. "Ensuring that patients understand their reports, examinations, and treatments is a central pillar of modern medicine. This is the only way to guarantee informed consent and strengthen health literacy," Busch explained [1].
The researchers envision a future where automatically simplified reports could be provided as an additional service alongside specialist reports, though they emphasize that this would require optimized, secure AI solutions specifically designed for clinical environments. Future studies will be needed to determine whether improved patient understanding translates into measurable improvements in health outcomes.
Summarized by Navi