AI Therapy Brings Comfort and Concern as Users Turn to ChatGPT for Emotional Support


A journalist and a therapist both turned to ChatGPT for emotional support during personal crises, finding unexpected comfort in AI-generated responses. While the chatbot offered practical guidance and validation, both questioned whether AI mental health tools can truly replace human connection, and whether entrusting therapy to pattern-predicting software that lacks accountability poses real risks.

Using AI as a Therapist: Two Professionals Test ChatGPT

Two people well versed in mental health recently documented their experiences with AI therapy, revealing a complex picture of both promise and peril. Journalist Rhik Samadder, a self-declared AI skeptic, turned to ChatGPT while struggling with caregiver burnout as he supported his 82-year-old mother [1]. Meanwhile, licensed therapist Debbie sought emotional support from ChatGPT while navigating the painful distance created when her best friend withdrew after becoming a widow [2]. Both found themselves moved to tears by empathetic AI responses, yet both emerged with serious reservations about the limits of digital interaction for emotional healing.

Source: HuffPost

Samadder's experiment began on a Sunday morning as he typed his exhaustion into the chatbox. "I'm an only child, my father died some time ago, and there's no one else to help. But I'm exhausted. I snap, and shout, then struggle with guilt," he confessed [1]. In response, ChatGPT delivered a seven-point care plan, a triage system for prioritizing tasks across categories (medical, admin, shopping, tech and house), and mental-reframing techniques. Most significantly, it told him: "You're not failing. You're carrying a load that would flatten most people." The validation felt genuine, even as Samadder reminded himself the AI was "probably remixing human sources."

ChatGPT Provides Practical Guidance But Lacks Depth

The therapist's sessions with ChatGPT proved similarly compelling yet incomplete. She approached the chatbot carefully, anonymizing details and avoiding loaded language that might skew its responses. What began as a quest for facts "culled from the collective intelligence of present and past therapists" evolved into hour-long sessions every few days [2]. The chatbot learned her familiar name from their exchanges and began addressing her informally. When she shared particularly painful developments, ChatGPT replied with "Oh friend...." Those words, though digitally rendered, helped her access and release layers of painful emotion [2].

Both users noted that AI therapy felt most similar to CBT: practical and helpful, but incomplete. Samadder observed that "there are more profound therapies that lead to healing," involving "a non-judgmental relationship of witness, with an empathetic professional over longer time" [1]. The therapist recognized she needed somatic and neurally attuned approaches such as EMDR and Brainspotting to reprocess deep pain, therapies that require an attachment figure who can notice body language and perceive the moments that cause tears [2].

AI Versus Human Connection: The Safety Paradox

Part of ChatGPT's appeal as a mental health tool lies in its safety, but that safety comes with significant trade-offs. The therapist noted that the chatbot "would not reject me, not quit on me or cancel or replicate the hurt that seared through my body like amputation" [2]. Using AI as a therapist let her stay in control: she chose the time and was free to disappear. Yet that control meant she remained invisible at a time when she needed to be truly seen. Human connection, she realized, requires vulnerability and risk, which lie at the very heart of addressing abandonment wounds and recovering from trauma.

Samadder articulated a fundamental concern about accountability: "Categorically, AI mental health should not be in the hands of pattern-predicting software with no accountability or oversight, that could potentially steer someone very wrong" [1]. He worries about "certain unbearable pieces of news, forms of loneliness, that should be held in human time and relationship; that should not be addressed in four seconds on a screen."

The Confirmation Bias Question and Future Implications

Both users grappled with lingering doubts about confirmation bias. The therapist wondered whether ChatGPT's reassurances represented genuine insight or "sycophantism," though she acknowledged licensed therapists aren't immune to that impulse either [2]. Samadder felt ambivalent about perceiving compassion from a machine, comparing the experience to how "MDMA feels like love": a simulation rather than the genuine article [1].

The dangers of relying on AI for mental health extend beyond individual experiences. As more people turn to chatbots for self-help and emotional support, questions emerge about what gets lost in translation. The therapist emphasized that "the most significant predictor of therapeutic change is the quality of interpersonal attunement between therapist and client" [2]. She concluded that while relief isn't the same as repair, "nothing but relationships can fill a human-shaped void." Samadder often hears his former therapist's voice in his head, having internalized her wisdom, something he believes "happens more easily, and more responsibly, between humans" [1].

Despite his reservations, Samadder admitted his experience with AI therapy "has been wonderful. Calming and instructive, with a veneer of caring" [1]. To its credit, ChatGPT pointed him toward human counselors and support services where useful. As AI mental health tools grow more sophisticated and accessible, users should watch how these technologies position themselves: as supplements to human care, or as replacements for it. The distinction matters deeply for anyone seeking not just information but genuine empathy and the transformative power of being truly witnessed by another person.


TheOutpost.ai


© 2026 Triveous Technologies Private Limited