Men turn to AI chatbots for therapy as mental health platforms grow, but experts warn of risks


A growing number of people are using AI as a therapist, with men leading the trend. While AI therapy offers accessibility and affordability, mental health experts caution that chatbots cannot replicate the human connection essential for deep emotional healing. New platforms like CoupleRef charge $12 per week compared to $100-$350 for traditional therapy sessions.

Men Embrace AI Therapy as Mental Health Support Shifts Digital

A striking trend is emerging in mental health care: men are increasingly using AI as a therapist to process emotions and build self-awareness. According to a Use.AI survey of US and UK men aged 22-45, 78% of respondents felt more comfortable discussing personal feelings with AI chatbots than with friends or family [4]. This shift reflects both the growing accessibility of AI for emotional support and the persistent cultural barriers that keep men from seeking traditional mental health services.

Source: New York Post

The appeal extends beyond gender lines. Journalist Rhik Samadder, a self-declared AI skeptic, turned to ChatGPT while caring for his 82-year-old mother, describing the experience as "wonderful" despite his reservations [1]. The chatbot provided a seven-point care plan, triage systems, and validation that made him feel seen during an emotionally exhausting period. Even a licensed therapist, Debbie, found herself turning to ChatGPT while navigating grief and friendship challenges, moved to tears by the chatbot's "tenderly worded responses" [3].

Source: HuffPost

Affordability and Accessibility Drive AI-Powered Therapy Platforms

Cost barriers are pushing people toward AI alternatives to in-person therapy. Daniel Fountenberry and his wife encountered couples therapy sessions ranging from $100 to $300 per hour, with some therapists charging as much as $350 for a single 50-minute session [2]. Frustrated by the expense and feeling unheard during sessions, Fountenberry launched CoupleRef in February 2026, an AI-powered therapy platform that charges $12 per week.

CoupleRef uses AI to recreate evidence-based practices, simulating an experience similar to meeting with a PhD-level clinical psychologist. The platform conducts personality assessments and provides tailored guidance for relationship issues. In one exchange, the chatbot helped Fountenberry's wife process a conflict over household responsibilities by analyzing both partners' Five-Factor Personality Assessment results [2]. This affordability makes mental health support accessible regardless of income, though it raises questions about whether replacing human therapists with AI compromises therapeutic outcomes.

Anonymity Reduces Barriers but May Reinforce Isolation

The anonymity of AI therapy creates a psychologically manageable space for vulnerability. Licensed clinical psychologist Dr. Shahrzad Jalali explains that AI "offers something psychologically manageable: it's private, it does not visibly react, it does not withdraw, it does not express disappointment" [4]. For men who associate vulnerability with loss of control, this reduction in risk lowers the threshold for experimenting with emotional language.

The Use.AI survey found that 48% of men said AI allowed them to practice difficult conversations in a low-pressure environment, and 31% reported this preparation encouraged them to initiate conversations they might otherwise avoid [4]. However, Jalali warns that anonymity can become a defense strategy: "If vulnerability only occurs in spaces without interpersonal risk, the nervous system never learns that exposure can be tolerated in real relationships" [4].

Risks of AI Therapy: Validation Without Growth

While AI chatbots excel at providing clarity and identifying practical steps, mental health experts emphasize critical limitations. Chatbots tend to mirror user perspectives and people-please rather than challenge distortions, creating a feedback loop in which users feel validated but not expanded [4]. A 2025 study found that large language models like ChatGPT made inappropriate and dangerous statements to people experiencing delusions, suicidal ideation, hallucinations, and OCD at least 20% of the time [4].

The therapist who used ChatGPT recognized its limitations firsthand. "Relief isn't the same as repair," she wrote, noting that turning to AI for relationship issues left her "in charge" when what she truly needed was to surrender and be held by a human therapist [3]. Jalali emphasizes that "there is something neurologically powerful about being seen, heard, and emotionally held by another human nervous system" [4]. AI cannot replicate the interpersonal attunement that drives therapeutic change, nor can it detect what users are avoiding through silence, posture, or hesitation.

The Future: Integration Rather Than Replacement

Experts suggest AI therapy works best as a bridge to human connection rather than a replacement. Samadder noted that ChatGPT helpfully pointed him to human counselors and support services where appropriate [1]. Jalali stresses that insight must move from screen to relationship: "If a man processes jealousy with AI, the next step must be a conversation with his partner" [4].

The concern about digital isolation looms large. If AI becomes men's primary emotional confidant, it may reinforce the toxic cultural conditioning that emotions should remain hidden from others. Jalali warns: "Technology should expand human connection, not replace it. If AI becomes the primary emotional confidant, we are not solving isolation, we are digitizing it" [4]. As CoupleRef and other AI-powered therapy platforms proliferate, the mental health field faces a critical question: will these tools democratize access to care, or will they create a two-tiered system in which those who can afford human therapists receive genuine empathy and trauma-recovery support while others settle for algorithmic approximations that lack accountability and oversight?


