AI in Healthcare: Patients Trust AI Medical Advice Over Doctors, Raising Concerns and Challenges

Reviewed by Nidhi Govil

A study finds that patients increasingly rely on AI for medical advice and often trust it over doctors. The trend is reshaping doctor-patient dynamics and raising concerns about AI's limitations in healthcare.

AI Gains Trust in Medical Advice, Challenging Doctor-Patient Dynamics

A recent study published in The New England Journal of Medicine has revealed a surprising trend: patients are increasingly trusting AI-generated medical advice over that of human doctors, even when the AI's advice is of low quality [1]. This shift is reshaping the landscape of healthcare and raising concerns among medical professionals.

The Study: AI vs. Human Doctors

Source: Medscape

Researchers from MIT's Media Lab, Stanford University, Cornell University, and other institutions conducted a series of experiments to test people's responses to medical advice from OpenAI's GPT-3 model compared to that from human doctors [1]. The results were striking:

  1. Participants struggled to distinguish between AI-generated and doctor-written responses, performing only slightly better than chance.
  2. AI-generated responses, including those of low accuracy, were judged to be more valid and trustworthy than doctors' responses.
  3. When responses were labeled as coming from a doctor (even if they were actually AI-generated), participants rated them as significantly more trustworthy.

The Impact on Clinical Practice

The rise of AI in healthcare is changing the dynamics of doctor-patient interactions. Dr. Kumara Raja Sundar, a family medicine physician, shared his experience in JAMA, describing how patients now arrive at appointments with AI-generated diagnoses and treatment suggestions [2].

This trend is creating new challenges for healthcare providers:

  1. Patients present information with increased confidence, sometimes subtly challenging the doctor's expertise.
  2. Physicians must navigate requests for unnecessary or impractical tests suggested by AI.
  3. Explaining the limitations of AI-generated advice without sounding dismissive has become a delicate balancing act.

The Double-Edged Sword of AI in Healthcare

Source: Economic Times

While AI tools like ChatGPT offer patients access to vast amounts of medical information, they also present significant risks:

  1. AI lacks the context and judgment that human doctors possess, potentially leading to inappropriate recommendations.
  2. A case reported in the Annals of Internal Medicine highlighted the dangers of blindly following AI advice: a patient was hospitalized after using sodium bromide as a salt substitute based on ChatGPT's recommendation [3].

Adapting to the New Reality

Medical professionals are calling for new approaches to address this shift:

  1. Increased transparency about the limitations of AI in healthcare.
  2. Structured patient education on the proper use of AI-generated medical information.
  3. Emphasizing empathy and collaboration in doctor-patient interactions to build trust.

Dr. Sundar suggests acknowledging patients' concerns before moving to clinical reasoning: "I want to express my condolences. I can hardly imagine how you feel. I want to tackle this with you and develop a plan" [2].

As AI continues to play a larger role in healthcare, finding the right balance between leveraging its benefits and mitigating its risks will be crucial for both patients and healthcare providers. The challenge lies in harnessing AI's potential while maintaining the irreplaceable human elements of medical care: judgment, empathy, and responsibility.
