Patients Turn to AI Chatbots for Medical Advice as Healthcare System Frustrations Mount

Reviewed by Nidhi Govil


Growing numbers of Americans are using AI chatbots like ChatGPT for health advice due to frustrations with the medical system, including long wait times, high costs, and impersonal care. While these tools offer 24/7 availability and empathetic responses, experts warn of significant risks from potential misinformation and overreliance on AI for medical decisions.

Rising Adoption of AI for Healthcare Guidance

A growing number of Americans are turning to artificial intelligence chatbots for medical advice, driven by mounting frustrations with the traditional healthcare system. According to a survey from KFF, a health policy research group, approximately one in six adults used chatbots to find health information at least once a month last year, a figure that rises to about a quarter among adults under 30 [1]. Liz Hamel, who directs survey research at KFF, indicated these numbers are likely higher now [2].

The trend reflects widespread dissatisfaction with current medical care. Patients describe using chatbots to compensate for the healthcare system's shortcomings: a self-employed Wisconsin woman routinely asks ChatGPT whether it's safe to skip expensive appointments, a rural Virginia writer used the technology to navigate surgical recovery, and a clinical psychologist in Georgia sought answers after providers dismissed her cancer treatment concerns [1].

Source: ET

The Appeal of AI Medical Consultation

Patients are drawn to AI chatbots for several compelling reasons. Unlike traditional healthcare, these tools are available 24/7, cost virtually nothing, and provide responses that create a convincing impression of empathy. The bots often express sympathy for symptoms and validate users' questions as "great" and "important" [2].

Source: NYT

The case of Wendy Goldberg, a 79-year-old retired lawyer from Los Angeles, illustrates this dynamic. When she asked for a specific protein intake recommendation for bone health, her doctor offered generic advice that seemed to ignore her individual circumstances. ChatGPT, by contrast, delivered a precise daily protein goal in grams within seconds [1].

AI chatbots differ from traditional online health resources like Google or WebMD by providing seemingly authoritative, personalized analysis. This creates a facsimile of a human relationship and can engender trust disproportionate to the bots' actual capabilities [4].

Significant Risks and Medical Dangers

Despite their appeal, AI chatbots pose serious medical risks. The technology can fabricate information and tends to be overly agreeable, sometimes reinforcing users' incorrect assumptions. High-profile incidents have resulted from following chatbot advice, including the case of a 60-year-old man who spent weeks in a psychiatric unit after ChatGPT suggested replacing table salt with sodium bromide, which caused paranoia and hallucinations [1].

Research indicates that most AI models no longer display medical disclaimers when users ask health questions, despite terms of service stating they aren't intended for medical advice. The chatbots routinely suggest diagnoses, interpret laboratory results, and recommend treatments, even providing scripts to help users persuade their doctors [2].

Expert Recommendations for Safer AI Use

Medical professionals are developing guidelines for safer use of AI in health consultations. Pharmacist Deborah Grayson suggests AI works best when users already know their condition, but it becomes risky for self-diagnosis because the technology tends to present rare conditions as likely possibilities [3].

Experts recommend providing detailed symptom information, checking credible sources, and using AI primarily to prepare for doctor visits rather than to replace professional medical consultation. Users should ask chatbots to draw only on verified sources such as NHS websites or research databases like PubMed [3].

Certain "red flag" symptoms require immediate medical attention and should never be addressed solely through AI consultation, including unexplained weight loss, persistent fever, severe fatigue, continuous vomiting, abnormal bleeding, intense pain, heart rate issues, or sudden changes in bowel movements

3

.
