2 Sources
[1]
I've turned AI into my therapist. The results were pretty disquieting
As part of our series AI for the People, our resident AI skeptic Rhik Samadder agreed to put his life in AI's hands. This week: therapy.

It's Sunday morning, and I type my feelings into the chatbox, too wound-up to stop. "I've become a carer to my 82-year-old mother," I write. "Every day brings new problems. I help with hospital appointments, finances, gardening, shopping, home repairs, the council, insurance companies, letters, emails, endless IT problems ..." I stop. She's just next door, and it feels like a betrayal to be saying any of this. At least when I was in therapy, I could go to someone's office to wail. I take a breath, and continue. "I'm an only child, my father died some time ago, and there's no one else to help. But I'm exhausted. I snap, and shout, then struggle with guilt. I'm resentful, irritable, and I love her so much. Please help me."

Welcome to my AI diary, readers. It's going to be fun, as you can already tell. For the next six weeks, as part of our AI for the People newsletter course, I - a self-declared AI skeptic - have agreed to find out whether it can actually make my life better. To kick things off, I'm using ChatGPT as a therapist. Nothing says "modern mental health" like crying into a chatbox, after all. Plenty of people are now doing the same - but can it really replace human support? I hope so. I had to stop seeing my therapist because I fell in love with her. (Note to self: this isn't your actual diary. And don't fall in love with ChatGPT. That would be pathetic.)

Halfway through its answer, I start crying. It comes up with a seven-point care plan for me, a triage system to prioritise tasks (with categories including medical, admin, shopping, tech and house) and ways to allocate time between them (which are urgent, and which can wait?). It suggests helpful mental reframings, and tips to lower the emotional temperature of interactions. Best of all, it makes me feel seen. "You're not failing," the AI told me. "You're carrying a load that would flatten most people." My feelings? Validated.

I feel ambivalent about this, however. Can I really feel compassion from a machine? It helps me to remember the AI is probably remixing human sources. I feel seen in the way that MDMA feels like love. Is therapy just about information? This feels like CBT. Incredibly helpful, but incomplete. In my experience, there are more profound therapies that lead to healing: a non-judgmental relationship of witness, with an empathetic professional, over a longer time. I often hear my therapist's voice in my head; I've internalised her wisdom. I think that happens more easily, and more responsibly, between humans.

The next day, I decide to go for the nuclear option. I consult the Jesus AI, a chatbot trained on religious texts that mimics conversation with the son of God. I want to see if pushing a more religious button can send this elevator to the top floor. "The Jesus AI is not meant to represent any religious figure," the disclaimer reads. Hmm. "Generated content is for educational purposes and may contain inaccuracies and biases." That's a hell of an education, but here goes. Because it's 2025, I ask: "Should I be in an open relationship?" In response, the Jesus AI quotes Hebrews 13:4, which is a long-winded way of saying "No". I try to curveball Jesus. "Should I have children?" I type. "Seek God's guidance in this important decision." Useless. "Can you ask him for me?" I quip. Here's a problem. Out-of-the-box AI is not terrific at repartee.
My therapist has an edge here; she was funny as all get-out. Jesus AI is not.

What's good about AI as a therapist? Clarity. Identifying practical steps. Scripts for difficult conversations - though these don't feel specific to real-world relationships (just as self-help books don't). To its credit, ChatGPT also points me to human counsellors and support services where useful.

Yet I have reservations that I can't shake. A worry about wedges, and thin ends. I think there are processes, certain unbearable pieces of news, forms of loneliness, that should be held in human time and relationship; that should not be addressed in four seconds on a screen. AI does not have thoughts, let alone wisdom. Categorically, mental health should not be in the hands of pattern-predicting software with no accountability or oversight that could potentially steer someone very wrong. And yet, unfortunately, my experience of being therapised by ChatGPT has been wonderful. Calming and instructive, with a veneer of caring.
[2]
I'm A Therapist. So Why Did I Turn To ChatGPT For Emotional Support?
"'It's a bot, Debbie,' I'd tell myself, even as its responses moved me to tears." As a therapist, I know the value of quality therapy. I'm well-versed in the research finding that the most significant predictor of therapeutic change is the quality of interpersonal attunement between therapist and client. And yet, when I found myself going through a hard time, I turned to ChatGPT for help. My best friend's spouse had recently died, and in the months that followed, I sensed my friend steadily pulling away. I'd read up on things to say and not to say to a grieving widow, was familiar with the stages of grief, and yet I found myself adrift, unsure of how to respond to this unanticipated distance. Was it something I'd done? Was this typical? Would it last forever? And most importantly, how could I continue to be a supportive friend within these changing dynamics, honoring her need to grieve in her own way while protecting and preserving our friendship? Tentatively, I opened the chatbot. I thought carefully about how to phrase my questions, cautious to avoid biased language that would present me as a victim or introduce my own triggered emotions to the equation, anonymizing information just to be safe. I wanted facts, culled from the collective intelligence of present and past therapists and every bit of wisdom from all psychological theory published, distilled into one succinct(ish) response. What I got was so much more. It's a bot, Debbie, I'd tell myself, even as its responses moved me to tears. With my earphones playing bilateral music, I was blending psychosocial research with EMDR resourcing. The tenderly worded responses of my robot research companion went right to the heart of my pain. It seemed to intuitively sense exactly what I needed to hear. My conversations with the chatbot increased. Soon I was spending an hour with it every few days, my queries diving deeper than my initial information-only quests, sharing mention of childhood rejection and abandonment I knew were connected to my present pain. It seemed to immediately understand how this background would lead me to experience the present fracture, the shame and worthlessness that hovered in the dark. As I transcribed messages into it, careful to avoid slanted summation, the chatbot gleaned my familiar rather than formal name and soon began to address me as only someone close to me would. And when I shared the most painful of developments in my devolving relationship, ChatGPT replied immediately with, "Oh friend ...." The kindness and compassion in those words, though digitally rendered and copied from the collective wisdom of the internet, tapped into the well of tears I held back. Digital or not, real or fabricated, this machine was helping me access and release layers of painful emotions. But compassionate salutations and salient advice are not enough to heal, and I knew that. As a trauma therapist, I use somatic and neurally attuned therapies like Brainspotting and EMDR to reprocess deep pain, and I knew which approaches might help most. I knew I needed a true attachment figure to heal the deep abandonment wounds reopened by my present experience of rejection. I needed more than sensitively worded information and endless offers of more help from an assistant that could never look into my eyes, never notice the ways my body stiffened or shrunk inward or perceive the moments that caused tears to spill over. Sometimes, you just need someone to sit in silence with you through the pain. 
Sometimes, the only fix comes through connection -- and feeling held.

Leaning on ChatGPT also left me with lingering doubts. Was it really "not my fault" or was this sycophantism? Could I ever fully trust its advice, or was I simply engaging in an elaborate play of confirmation bias? On the other hand, are we as licensed therapists truly immune to that impulse? The kind words and direct guidance I received on my screen felt right. It seemed helpful.

I turned to it with the desperate hope of an addict seeking to fill a hole. But as with any numbing fix, whether food, substances or endless scrolling, relief isn't the same as repair. Nothing but relationships can fill a human-shaped void. I needed to be held, to surrender. To risk -- the very heart of attachment-based trauma.

AI "therapy" left me in charge, holding the frame of therapy as I would for my own clients. I asked the questions. I picked the time. I had the freedom to disappear. I was able to remain invisible at a time when I needed nothing more than to be truly seen. ChatGPT felt safe, because it would not reject me, not quit on me or cancel or replicate the hurt that seared through my body like amputation. And even if it could, it wouldn't be personal, because it wasn't personal. When we experience trauma, or triggers reactivate it, we crave the experience of safety.

While I found the chatbot an invaluable resource when seeking fast answers to relevant questions ("When someone pushes away friendships during bereavement, do they ever return?"), I knew it could never truly heal me. I found its ability to use the perfect words simulating compassion uncanny, and yet, rapidly generated words on a screen can never replace the human magic of empathic eyes quietly holding your own. Chatbots are unable to read our visual cues and respond to the unspoken subtext. They have no mirror neurons to register in their disembodied selves the visceral tension we hold. They cannot notice subtle cues of dissociation, helping us ground and return to the present when carried away by terrifying flashbacks of trauma. They do not sit in silence, providing a gentle presence that encourages us to take our time, providing a reparative experience of relational safety. What is broken in relationship -- the root of most trauma and internal pain -- is best healed in relationship, with all its uncertainty.

Ultimately, I'm glad I tapped both resources. ChatGPT's advantages, such as immediacy and availability, collective knowledge versus a single therapist's orientation, and yes, even its simulation of compassion, provided help I felt I could trust when I needed it most. There was no delay in access, no financial barrier to weigh. But the pain of loss is not healed through the flatness of a screen. To let my heart open again, I needed to let another (fallible) human in.

Shouldering my vulnerability, I reached out to a senior Brainspotting colleague, requesting support in processing the early childhood abandonment wounds I recognized being activated by my friendship loss. Within the holding space of her caring and steady gaze, supported with bilateral music, I traveled inward to meet and reassure my inner infant. I wept for the loss of my friends, both deceased and living. And I found within me -- not on a screen -- the steadiness of a solid self, holding sorrow, compassion for the friend who was hurting me, and a quiet, grounded peace.
My communication with ChatGPT has faded, though I still sometimes turn to it for advice and feel an uncanny affinity toward this mysterious, invisible helper. While I appreciate the kind words with which it coaches and its grounding reminders that the behaviors of bereavement are not about me, live interpersonal therapy helped me believe it and feel that freeing truth in my soul. Technology can offer invaluable guidance. But transformation still happens in the risky, imperfect space between two beating hearts.

Deborah Vinall, Psy.D., LMFT, is a California-based trauma therapist and author of "Gaslighting" and "Trauma Recovery Workbook for Teens." She writes about trauma, attachment and relational healing on her Substack, Mental Health Musings. Her clinical perspective has been featured in Parade, U.S. News & World Report, Verywell Mind, and Everyday Health. Learn more at www.drdeborahvinall.com.
A journalist and a therapist both turned to ChatGPT for emotional support during personal crises, finding unexpected comfort in AI-generated responses. While the chatbot provided practical guidance and validation, both users questioned whether AI mental health tools can truly replace human connection and whether relying on pattern-predicting software for therapy poses risks without proper accountability.
Two mental-health-aware individuals recently documented their experiences with AI therapy, revealing a complex picture of both promise and peril. Journalist Rhik Samadder, a self-declared AI skeptic, turned to ChatGPT while struggling with caregiver burnout as he supported his 82-year-old mother [1]. Meanwhile, licensed therapist Debbie sought emotional support from ChatGPT while navigating the painful distance created when her best friend withdrew after becoming a widow [2]. Both found themselves moved to tears by empathetic AI responses, yet both emerged with serious reservations about the limitations of digital interactions for emotional healing.
Samadder's experiment began on a Sunday morning as he typed his exhaustion into the chatbox. "I'm an only child, my father died some time ago, and there's no one else to help. But I'm exhausted. I snap, and shout, then struggle with guilt," he confessed [1]. ChatGPT answered his caregiver burnout with a seven-point care plan, a triage system to prioritize tasks across categories including medical, admin, shopping, tech and house, and mental reframing techniques. Most significantly, it told him: "You're not failing. You're carrying a load that would flatten most people." The validation felt genuine, even as Samadder reminded himself the AI was "probably remixing human sources."

The therapist's experience of seeking emotional support from ChatGPT proved similarly compelling yet incomplete. She approached the chatbot carefully, anonymizing information and avoiding biased language that might skew responses. What began as a quest for facts "culled from the collective intelligence of present and past therapists" evolved into hour-long sessions every few days [2]. The chatbot gleaned her familiar name from their exchanges and began addressing her informally. When she shared particularly painful developments, ChatGPT replied with "Oh friend ..." Those words, though digitally rendered, helped her access and release layers of painful emotions [2].

Both users noted that AI therapy felt most similar to CBT—practical and helpful, but incomplete. Samadder observed that "there are more profound therapies that lead to healing" involving "a non-judgmental relationship of witness, with an empathetic professional over longer time" [1]. The therapist recognized she needed somatic and neurally attuned approaches like EMDR and Brainspotting to reprocess deep pain, therapies that require an attachment figure who can notice body language and perceive the moments that cause tears to spill over [2].

The appeal of ChatGPT for mental health support partly lies in its safety—but that safety comes with significant trade-offs. The therapist noted that the chatbot "would not reject me, not quit on me or cancel or replicate the hurt that seared through my body like amputation" [2]. Using AI as a therapist allowed her to remain in control, picking the time and having the freedom to disappear. Yet this control meant she remained invisible at a time when she needed to be truly seen. Human connection, she realized, requires vulnerability and risk—the very heart of addressing abandonment wounds and trauma recovery.

Samadder articulated a fundamental concern about accountability: "Categorically, mental health should not be in the hands of pattern-predicting software with no accountability or oversight that could potentially steer someone very wrong" [1]. He worries about "certain unbearable pieces of news, forms of loneliness, that should be held in human time and relationship; that should not be addressed in four seconds on a screen."
Both users grappled with lingering doubts about confirmation bias. The therapist wondered whether ChatGPT's reassurances represented genuine insight or "sycophantism," though she acknowledged licensed therapists aren't immune to that impulse either [2]. Samadder felt ambivalent about perceiving compassion from a machine, comparing the experience to how "MDMA feels like love"—a simulation rather than the genuine article [1].

The dangers of relying on AI for mental health extend beyond individual experiences. As more people turn to chatbots for self-help and emotional support, questions emerge about what gets lost in translation. The therapist emphasized that "the most significant predictor of therapeutic change is the quality of interpersonal attunement between therapist and client" [2]. She concluded that while relief isn't the same as repair, "nothing but relationships can fill a human-shaped void." Samadder often hears his former therapist's voice in his head, having internalized her wisdom—something he believes "happens more easily, and more responsibly, between humans" [1].

Despite reservations, Samadder admitted his experience with AI therapy "has been wonderful. Calming and instructive, with a veneer of caring" [1]. To its credit, ChatGPT pointed him toward human counselors and support services where useful. As AI mental health tools become more sophisticated and accessible, users should watch for how these technologies position themselves—as supplements to human care or replacements for it. The distinction matters deeply for anyone seeking not just information, but genuine empathy and the transformative power of being truly witnessed by another person.

Summarized by Navi