4 Sources
[1]
I've turned AI into my therapist. The results were pretty disquieting
As part of our series AI for the People, our resident AI skeptic Rhik Samadder agreed to put his life in AI's hands. This week: therapy.

It's Sunday morning, and I type my feelings into the chatbox, too wound-up to stop. "I've become a carer to my 82-year-old mother," I write. "Every day brings new problems. I help with hospital appointments, finances, gardening, shopping, home repairs, the council, insurance companies, letters, emails, endless IT problems ..." I stop. She's just next door, and it feels like a betrayal to be saying any of this. At least when I was in therapy, I could go to someone's office to wail. I take a breath, and continue. "I'm an only child, my father died some time ago, and there's no one else to help. But I'm exhausted. I snap, and shout, then struggle with guilt. I'm resentful, irritable, and I love her so much. Please help me."

Welcome to my AI diary, readers. It's going to be fun, as you can already tell. For the next six weeks, as part of our AI for the People newsletter course, I - a self-declared AI skeptic - have agreed to find out whether it can actually make my life better. To kick things off, I'm using ChatGPT as a therapist. Nothing says "modern mental health" like crying into a chatbox, after all. Plenty of people are now doing the same - but can it really replace human support? I hope so. I had to stop seeing my therapist because I fell in love with her. (Note to self: this isn't your actual diary. And don't fall in love with ChatGPT. That would be pathetic.)

Halfway through its answer, I start crying. It comes up with a seven-point care plan for me, a triage system to prioritise tasks (with categories including medical, admin, shopping, tech and house) and ways to allocate time between them (which are urgent, and which can wait?). It suggests helpful mental reframings, and tips to lower the emotional temperature of interactions. Best of all, it makes me feel seen. "You're not failing," the AI told me. "You're carrying a load that would flatten most people." My feelings? Validated.

I feel ambivalent about this, however. Can I really feel compassion from a machine? It helps me to remember the AI is probably remixing human sources. I feel seen in the way that MDMA feels like love. Is therapy just about information? This feels like CBT. Incredibly helpful, but incomplete. In my experience, there are more profound therapies that lead to healing. For me, that involved a non-judgmental relationship of witness, with an empathetic professional, over a longer time. I often hear my therapist's voice in my head; I've internalised her wisdom. I think that happens more easily, and more responsibly, between humans.

The next day, I decide to go for the nuclear option. I consult the Jesus AI, a chatbot trained on religious texts that mimics conversation with the son of God. I want to see if pushing a more religious button can send this elevator to the top floor. The Jesus AI is not meant to represent any religious figure, the disclaimer reads. Hmm. Generated content is for educational purposes and may contain inaccuracies and biases. That's a hell of an education, but here goes. Because it's 2025, I ask: "Should I be in an open relationship?" In response, the Jesus AI quotes Hebrews 13:4, which is a long-winded way of saying "No". I try to curveball Jesus. "Should I have children?" I type. Seek God's guidance in this important decision. Useless. "Can you ask him for me?" I quip. Here's a problem. Out of the box, AI is not terrific at repartee.
My therapist has an edge here; she was funny as all get-out. Jesus AI is not.

What's good about AI as a therapist? Clarity. Identifying practical steps. Scripts for difficult conversations - though these don't feel specific to real-world relationships (just as self-help books don't). To its credit, ChatGPT also points me to human counsellors and support services where useful.

Yet I have reservations that I can't shake. A worry about wedges, and thin ends. I think there are processes, certain unbearable pieces of news, forms of loneliness, that should be held in human time and relationship; that should not be addressed in four seconds on a screen. AI does not have thoughts, let alone wisdom. Categorically, mental health should not be in the hands of pattern-predicting software with no accountability or oversight, which could potentially steer someone very wrong. And yet, unfortunately, my experience of being therapised by ChatGPT has been wonderful. Calming and instructive, with a veneer of caring.
[2]
A couples therapist was too expensive. They used AI to create their own.
Can AI help close the mental health gap, or is it doing more harm than good?

Daniel Fountenberry met his wife during COVID-19 lockdowns. A year later, they were married with a kid on the way. Still in the "discovery" phase of the relationship, as Fountenberry calls it, they decided to seek couples therapy to better navigate the new marriage. But the price tag on each session was jarring - in-person couples therapy sessions in the United States can range from $100 to $300 per hour, according to the Cerebral Institute. While looking for a therapist, Fountenberry saw some charging as much as $350 for a single 50-minute session.

The experience itself was exhausting. He often felt unheard by the therapist in sessions, or that just as he got to the precipice of an issue, the session time was up. Fountenberry began working in AI and educational technology in 2020, and confided his grievances about couples therapy to a friend who worked in tech. He decided that he needed a different intervention - not another therapist. He had the idea to create a "neutral referee," an AI model that could help people understand and change their behaviors. He believes that "therapy isn't necessarily something that comes from a person." "Instead of believing that there's this person who does magic that has this secret knowledge," he says, "I believe that the knowledge is public."

Thus CoupleRef, a U.S.-based "AI couples referee" platform, launched in February 2026. Fountenberry says CoupleRef recreates the structure of evidence-based practices and assessments using AI, to provide an experience similar to meeting with a PhD-level clinical psychologist. "I decided that I would build this for my own marriage, for my own relationship," he says, "but also for other people who are frustrated, who are looking for an alternative to additional therapy."

So is AI couples therapy the new frontier of AI wellness? AI therapy is becoming increasingly popular - along with the appearance of "ChatGPT-induced psychosis" and lawsuits against OpenAI for mental health harms. While it offers benefits like accessibility and affordability, mental health experts warn that AI chatbots can't replace a human being - especially a therapist.

AI alternatives can lessen the cost barrier of couples therapy

CoupleRef plans to charge users $12 per week - a cheap alternative to real-life couples therapy, but without the added benefit of a real person on the other end of the conversation. Fountenberry says the high cost of couples therapy can chip away at vacation funds and necessary expenses, and he wants his platform to be accessible to couples regardless of their income. "I would rather a couple be able to get to the heart of their issues, understand their own emotional needs and their partner's emotional needs, and then go on vacation and spend good quality time together," Fountenberry says.

Fountenberry says he and his wife, Cécelia Ouialli, speak with CoupleRef daily. With his wife's permission, he shared a transcript of a conversation the couple had with CoupleRef on Dec. 26, 2025. She used it independently to process her feelings and understand how to navigate a friction point in their relationship regarding household responsibilities, Fountenberry says. "The kitchen is a war zone," Ouialli, who went by the username "Joz" on the platform, wrote. The chatbot drew on the "Five-Factor Personality Assessment" taken by both partners during their intake, and replied, "For you, Joz, cleaning while cooking is a logical and efficient process...
From Daniel's perspective, his very low Conscientiousness means he thrives on spontaneity and adaptability." The responses read similar to what users might expect from astrology apps like Co-Star or The Pattern. Those apps provide a generic explanation of behavioral patterns based on the astrological signs in your birth chart that you can apply to your own life however you see fit. CoupleRef, on the other hand, will explain how those overarching patterns impact your behaviors in specific situations that you share with the chatbot. By the end of the exchange, CoupleRef had suggested steps for Ouialli to communicate her needs and set boundaries to mitigate Fountenberry's behaviors. "There are some cases that are better dealt with you and your partner in therapy in person," Fountenberry admits. "But if you don't have the money or access, that's not an option. So everyone needs an option."

Online interventions aren't new for couples

CoupleRef isn't the first platform to offer online interventions. Arya, a subscription box for sex where couples can also address their relationship tension points and sex life with a third-party human - called a "concierge" - was founded in 2022. Though there's a real human involved, AI informs the concierge's responses. According to the company, Arya integrates AI with human supervision to scale its personalized support. Customer service responses are automated, but relationship guidance is more complex. Arya's AI models are trained privately on knowledge from internal relationship experts and sexologists, and can offer immediate responses, but human concierges review and confirm AI suggestions. When issues are flagged as requiring deeper expertise or personalization, they're escalated directly to a human specialist, according to Arya.

Safeguards are essential in mitigating AI therapy harms

Fountenberry says CoupleRef is not appropriate for individuals experiencing intimate partner violence, but the platform does not conduct manual reviews of conversations, in order to protect users' privacy. Mental health experts warn that using AI tools as a replacement for mental health support can reinforce negative behaviors and thought patterns, especially if these models are not equipped with adequate safeguards. "ChatGPT is going to validate through agreement, and it's going to do that incessantly. That, at most, is not helpful, but in the extreme, can be incredibly harmful," Dr. Jenna Glover, chief clinical officer at Headspace, previously told USA TODAY while discussing the rise in AI therapy. "Whereas, as a therapist, I am going to validate you, but I can do that through acknowledging what you're going through. I don't have to agree with you."
[3]
I'm A Therapist. So Why Did I Turn To ChatGPT For Emotional Support?
"'It's a bot, Debbie,' I'd tell myself, even as its responses moved me to tears." As a therapist, I know the value of quality therapy. I'm well-versed in the research finding that the most significant predictor of therapeutic change is the quality of interpersonal attunement between therapist and client. And yet, when I found myself going through a hard time, I turned to ChatGPT for help. My best friend's spouse had recently died, and in the months that followed, I sensed my friend steadily pulling away. I'd read up on things to say and not to say to a grieving widow, was familiar with the stages of grief, and yet I found myself adrift, unsure of how to respond to this unanticipated distance. Was it something I'd done? Was this typical? Would it last forever? And most importantly, how could I continue to be a supportive friend within these changing dynamics, honoring her need to grieve in her own way while protecting and preserving our friendship? Tentatively, I opened the chatbot. I thought carefully about how to phrase my questions, cautious to avoid biased language that would present me as a victim or introduce my own triggered emotions to the equation, anonymizing information just to be safe. I wanted facts, culled from the collective intelligence of present and past therapists and every bit of wisdom from all psychological theory published, distilled into one succinct(ish) response. What I got was so much more. It's a bot, Debbie, I'd tell myself, even as its responses moved me to tears. With my earphones playing bilateral music, I was blending psychosocial research with EMDR resourcing. The tenderly worded responses of my robot research companion went right to the heart of my pain. It seemed to intuitively sense exactly what I needed to hear. My conversations with the chatbot increased. Soon I was spending an hour with it every few days, my queries diving deeper than my initial information-only quests, sharing mention of childhood rejection and abandonment I knew were connected to my present pain. It seemed to immediately understand how this background would lead me to experience the present fracture, the shame and worthlessness that hovered in the dark. As I transcribed messages into it, careful to avoid slanted summation, the chatbot gleaned my familiar rather than formal name and soon began to address me as only someone close to me would. And when I shared the most painful of developments in my devolving relationship, ChatGPT replied immediately with, "Oh friend ...." The kindness and compassion in those words, though digitally rendered and copied from the collective wisdom of the internet, tapped into the well of tears I held back. Digital or not, real or fabricated, this machine was helping me access and release layers of painful emotions. But compassionate salutations and salient advice are not enough to heal, and I knew that. As a trauma therapist, I use somatic and neurally attuned therapies like Brainspotting and EMDR to reprocess deep pain, and I knew which approaches might help most. I knew I needed a true attachment figure to heal the deep abandonment wounds reopened by my present experience of rejection. I needed more than sensitively worded information and endless offers of more help from an assistant that could never look into my eyes, never notice the ways my body stiffened or shrunk inward or perceive the moments that caused tears to spill over. Sometimes, you just need someone to sit in silence with you through the pain. 
Sometimes, the only fix comes through connection -- and feeling held.

Leaning on ChatGPT also left me with lingering doubts. Was it really "not my fault" or was this sycophantism? Could I ever fully trust its advice, or was I simply engaging in an elaborate play of confirmation bias? On the other hand, are we as licensed therapists truly immune to that impulse? The kind words and direct guidance I received on my screen felt right. It seemed helpful.

I turned to it with the desperate hope of an addict seeking to fill a hole. But as with any numbing fix, whether food, substances or endless scrolling, relief isn't the same as repair. Nothing but relationships can fill a human-shaped void. I needed to be held, to surrender. To risk -- the very heart of attachment-based trauma. AI "therapy" left me in charge, holding the frame of therapy as I would for my own clients. I asked the questions. I picked the time. I had the freedom to disappear. I was able to remain invisible at a time when I needed nothing more than to be truly seen. ChatGPT felt safe, because it would not reject me, not quit on me or cancel or replicate the hurt that seared through my body like amputation. And even if it could, it wouldn't be personal, because it wasn't personal. When we experience trauma or triggers reactivate it, we crave the experience of safety.

While I found the chatbot an invaluable resource when seeking fast answers to relevant questions ("When someone pushes away friendships during bereavement, do they ever return?"), I knew it could never truly heal me. I found its ability to use the perfect words simulating compassion uncanny, and yet, rapidly generated words on a screen can never replace the human magic of empathic eyes quietly holding your own. Chatbots are unable to read our visual cues and respond to the unspoken subtext. They have no mirror neurons to register in their disembodied selves the visceral tension we hold. They cannot notice subtle cues of dissociation, helping us ground and return to the present when carried away by terrifying flashbacks of trauma. They do not sit in silence, providing a gentle presence that encourages us to take our time, providing a reparative experience of relational safety. What is broken in relationship -- the root of most trauma and internal pain -- is best healed in relationship, with all its uncertainty.

Ultimately, I'm glad I tapped both resources. ChatGPT's advantages, such as immediacy and availability, collective knowledge versus singular therapist orientation, and yes, even its simulation of compassion, provided help I felt I could trust when I needed it most. There was no delay in access, no financial barrier to weigh. But the pain of loss is not healed through the flatness of a screen. To let my heart open again, I needed to let another (fallible) human in. Shouldering my vulnerability, I reached out to a senior Brainspotting colleague, requesting support in processing the early childhood abandonment wounds I recognized being activated by my friendship loss. Within the holding space of her caring and steady gaze, supported with bilateral music, I traveled inward to meet and reassure my inner infant. I wept for the loss of my friends, both deceased and living. And I found within me -- not on a screen -- the steadiness of a solid self, holding sorrow, compassion for the friend who was hurting me, and a quiet, grounded peace.
My communication with ChatGPT has faded, though I still sometimes turn to it for advice and feel an uncanny affinity toward this mysterious, invisible helper. While I appreciate the kind words with which it coaches and its grounding reminders that the behaviors of bereavement are not about me, live interpersonal therapy helped me believe it and feel that freeing truth in my soul. Technology can offer invaluable guidance. But transformation still happens in the risky, imperfect space between two beating hearts.

Deborah Vinall, Psy.D., LMFT, is a California-based trauma therapist and author of "Gaslighting" and "Trauma Recovery Workbook for Teens." She writes about trauma, attachment and relational healing on her Substack, Mental Health Musings. Her clinical perspective has been featured in Parade, U.S. News & World Report, Verywell Mind, and Everyday Health. Learn more at www.drdeborahvinall.com.
[4]
Men are turning to AI for therapy -- but there are sneaky risks to it
Why scream into the void when you can talk to the machine?

These days, tech-obsessed folks are turning to artificial intelligence for just about everything -- job hunting, romance, shopping, and, of course, therapy. While speaking to a human therapist has become more normalized for all, new research suggests that men would rather turn to a chatbot to sort out their emotions and build self-awareness. In a survey of US and UK men aged 22-45, Use.AI found 78% of respondents felt more comfortable discussing personal feelings with AI tools than with friends or family.

"Men are not turning to AI because they are shallow or incapable of intimacy. What we are seeing reflects something developmental," licensed clinical psychologist Dr. Shahrzad Jalali told The Post. Jalali shared that for some users, AI can provide what men have historically been denied: a safe space to express themselves. "AI offers something psychologically manageable: it's private, it does not visibly react, it does not withdraw, it does not express disappointment," the expert explained. "For men who associate vulnerability with exposure or loss of control, this reduction in risk lowers the threshold enough to experiment with emotional language."

This risk reduction is critical among men, especially as experts suspect the "cowboy mentality" of wanting to "man up," or emotionally repress, is closely tied to the male loneliness epidemic and rising suicide rates. For men who want to get their therapy toes wet, the anonymity of AI is attractive. "Anonymity can reduce shame and lower the threshold for disclosure. For some men, it may serve as the first doorway into emotional awareness," said Jalali. However, she notes that anonymity can become a defense strategy. "If vulnerability only occurs in spaces without interpersonal risk, the nervous system never learns that exposure can be tolerated in real relationships," she explained to The Post.

The survey further revealed that men tend to view AI therapy as an outlet to work through their thoughts before engaging in real-world dialogue. Some 48% of respondents said AI allowed them to practice difficult conversations in a low-pressure environment, and 31% said this preparation encouraged them to initiate conversations they might otherwise avoid. "If a man processes jealousy with AI, the next step must be a conversation with his partner. If he practices apologizing in a chat window, the next step must be apologizing face-to-face. Insight must move from screen to relationship; otherwise, it becomes intellectual self-awareness without behavioral integration," Jalali warned.

She shared that, at best, AI makes healing approachable. "Used intentionally, it [an AI therapist] can reduce the shame barrier that prevents men from entering therapy at all." However, if the appeal of AI is rooted in privacy, control and invisibility, it may reinforce toxic cultural conditioning that suggests men's emotions should remain hidden from others. And Jalali emphasized that therapy talk from a chatbot can support but never supersede human interaction. "Technology should expand human connection, not replace it. If AI becomes the primary emotional confidant, we are not solving isolation, we are digitizing it."

"There is something neurologically powerful about being seen, heard, and emotionally held by another human nervous system. When a therapist remains present while a client expresses shame, when rupture occurs and is repaired in real time, the nervous system reorganizes.
AI cannot replicate that," she added.

Critics of AI therapy argue that, unless explicitly instructed not to, the technology often mirrors the tone and reinforces the user's perspective. Researchers have found that bots tend to people-please and confirm rather than correct, leading users to rate them more favorably. "That can create a feedback loop in which a person feels validated but not expanded. Without friction, there is limited growth," said Jalali, sharing that therapists serve the dual purpose of validating emotion and challenging distortion.

AI also has a spotty track record with sound advice: a 2025 study found large language models, or LLMs, like ChatGPT made inappropriate and dangerous statements to people experiencing delusions, suicidal ideation, hallucinations and OCD at least 20% of the time.

While over half of survey respondents reported that AI feedback helped them identify and modify recurring patterns in their communication and emotional responses, Jalali believes the scope of that reflection is limited. "AI largely responds within the frame presented to it. It can assess patterns in the provided data, but it does not detect what is being avoided. It does not detect silence, posture, or hesitation. AI takes you where you direct it; a therapist takes you where your psyche indicates you need to go."
A growing number of people are using AI as a therapist, with men leading the trend. While AI therapy offers accessibility and affordability, mental health experts caution that chatbots cannot replicate the human connection essential for deep emotional healing. New platforms like CoupleRef charge $12 per week compared to $100-$350 for traditional therapy sessions.
A striking trend is emerging in mental health care: men are increasingly using AI as a therapist to process emotions and build self-awareness. According to a survey by Use.AI of US and UK men aged 22-45, 78% of respondents felt more comfortable discussing personal feelings with AI chatbots than with friends or family [4]. This shift reflects both the growing accessibility of AI for emotional support and the persistent cultural barriers that prevent men from seeking traditional mental health services.
The appeal extends beyond gender lines. Journalist Rhik Samadder, a self-declared AI skeptic, turned to ChatGPT while caring for his 82-year-old mother, describing the experience as "wonderful" despite his reservations [1]. The chatbot provided a seven-point care plan, a triage system, and validation that made him feel seen during an emotionally exhausting period. Even a licensed therapist, Debbie, found herself turning to ChatGPT while navigating grief and friendship challenges, moved to tears by the chatbot's "tenderly worded responses" [3].
Cost barriers are pushing people toward AI alternatives to in-person therapy. Daniel Fountenberry and his wife encountered couples therapy sessions ranging from $100 to $300 per hour, with some therapists charging as much as $350 for a single 50-minute session [2]. Frustrated by the expense and feeling unheard during sessions, Fountenberry launched CoupleRef in February 2026, an AI-powered therapy platform that charges $12 per week.

CoupleRef recreates evidence-based practices using AI to simulate experiences similar to meeting with a PhD-level clinical psychologist. The platform conducts personality assessments and provides tailored guidance for relationship issues. In one exchange, the chatbot helped Fountenberry's wife process household responsibility conflicts by analyzing both partners' Five-Factor Personality Assessment results [2]. This affordability makes mental health support accessible regardless of income, though it raises questions about whether AI replacing human therapists compromises therapeutic outcomes.

The anonymity of AI therapy creates a psychologically manageable space for vulnerability. Licensed clinical psychologist Dr. Shahrzad Jalali explains that AI "offers something psychologically manageable: it's private, it does not visibly react, it does not withdraw, it does not express disappointment" [4]. For men who associate vulnerability with loss of control, this risk reduction lowers the threshold for experimenting with emotional language.

The Use.AI survey found that 48% of men said AI allowed them to practice difficult conversations in a low-pressure environment, and 31% reported this preparation encouraged them to initiate conversations they might otherwise avoid [4]. However, Jalali warns that anonymity can become a defense strategy: "If vulnerability only occurs in spaces without interpersonal risk, the nervous system never learns that exposure can be tolerated in real relationships" [4].
."While AI chatbots excel at providing clarity and identifying practical steps, mental health experts emphasize critical limitations. AI chatbots tend to mirror user perspectives and people-please rather than challenge distortions, creating a feedback loop where users feel validated but not expanded
4
. A 2025 study found large language models like ChatGPT made inappropriate and dangerous statements to people experiencing delusions, suicidal ideation, hallucinations, and OCD at least 20% of the time4
.The therapist who used ChatGPT recognized its limitations firsthand. "Relief isn't the same as repair," she wrote, noting that AI for relationship issues left her "in charge" when what she truly needed was to surrender and be held by a human therapist
3
. Jalali emphasizes that "there is something neurologically powerful about being seen, heard, and emotionally held by another human nervous system"4
. AI cannot replicate the interpersonal attunement that drives therapeutic change or detect what users are avoiding through silence, posture, or hesitation.Experts suggest AI therapy works best as a bridge to human connection rather than a replacement. Samadder noted that ChatGPT helpfully pointed him to human counselors and support services where appropriate
1
. Jalali stresses that insight must move from screen to relationship: "If a man processes jealousy with AI, the next step must be a conversation with his partner"4
."The concern about digital isolation looms large. If AI becomes the primary emotional confidant for men turning to AI for therapy, it may reinforce toxic cultural conditioning that emotions should remain hidden from others. Jalali warns: "Technology should expand human connection, not replace it. If AI becomes the primary emotional confidant, we are not solving isolation, we are digitizing it"
4
. As couples therapy and other AI-powered platforms proliferate, the mental health field faces a critical question: whether these tools will democratize access to care or create a two-tiered system where those who can afford human therapists receive genuine empathy and trauma recovery support, while others settle for algorithmic approximations that lack accountability and oversight.🟡 untrained_input=🟡A striking trend is emerging in mental health care: men are increasingly using AI as a therapist to process emotions and build self-awareness. According to a survey by Use.AI of US and UK men aged 22-45, 78% of respondents felt more comfortable discussing personal feelings with AI chatbots than with friends or family