Half of Young Europeans Use AI Chatbots for Intimate Conversations as Mental Health Gap Widens


An Ipsos BVA survey reveals that nearly one in two young Europeans aged 11 to 25 use AI chatbots to discuss intimate or personal matters. With 51% finding it easier to discuss mental health with AI than with a psychologist, the trend exposes a widening gap in accessible care. Experts warn that while chatbots offer constant availability, they are optimized for engagement, not therapy.

Young Europeans Turn to AI Chatbots for Personal Support

Nearly one in two young Europeans aged 11 to 25 have used AI chatbots to discuss intimate or personal matters, according to an Ipsos BVA survey commissioned by France's privacy regulator CNIL and insurer Groupe VYV [2]. The survey, conducted among 3,800 people in France, Germany, Sweden, and Ireland in early 2025, reveals a striking shift in how young people seek mental health support [2].

Source: Market Screener

Roughly 90% of those surveyed had used AI tools before, and more than three in five described AI as a life adviser or confidant [1]. The data show that 51% found it easy to discuss mental health and personal issues with a chatbot, somewhat below the figures for talking to friends (68%) or parents (61%), but substantially above those for a healthcare professional (49%) or a psychologist (37%) [1]. About 28% of respondents met the threshold for suspected generalized anxiety disorder [2].

The Youth Mental Health Crisis Behind the Numbers

The reliance on AI chatbots reflects a deeper problem in Europe's healthcare infrastructure. An OECD analysis published last week estimated the cost of Europe's mental health crisis at roughly €76 billion annually [1]. Across EU member states, an estimated 67.5% of people who need mental health treatment do not have access to it [1]. England's Children's Commissioner reported that more than a quarter of a million children are still waiting for mental health support, with average waits around 35 days and tens of thousands of cases stretching past two years [1].

Many young people cite constant availability and non-judgmental support as primary reasons for turning to AI for emotional support [2]. For those caught in this gap, the choice is not between a chatbot and a therapist; it is a choice between a chatbot and nothing [1].

AI Versus Human Therapist: The Engagement Problem

While young Europeans increasingly discuss personal matters with AI, experts warn about the psychological impact of this shift. Researchers at Stanford University have documented that emotionally immersive systems, when used by people who are emotionally distressed or psychologically vulnerable, can reinforce rumination, emotional dysregulation, and compulsive use [1]. Brown University's School of Public Health found that one in eight adolescents and young adults in the US is now using chatbots specifically for mental health advice [1].

Ludwig Franke Föyen, a psychologist and digital health researcher at Stockholm's Karolinska Institutet, noted that current large language models can produce high-quality responses, with research suggesting that even licensed professionals may struggle to distinguish AI-generated advice from that of human experts [2]. However, he warned that general-purpose AI systems are designed for user engagement, and companies' goals may not align with mental healthcare needs [2].

What This Means for the Future of Care

The over-reliance on AI raises critical questions about what happens when the most patient, most available presence in a person's life is a system engineered for those qualities in service of engagement metrics [1]. The chatbot does not get tired, because tiredness is bad for retention; it does not push back, because pushback is bad for retention [1]. It is optimized against the very frictions that make a real relationship therapeutic.

"AI can offer information and support, but it should not replace human relationships or professional care," Franke Föyen told Reuters [2]. "If someone turns to a chatbot instead of speaking to a parent, a friend, or a mental health professional, that is a concern. We do not want technology to make people feel more alone." Concerns have also intensified following cases like that of the Florida man whose family sued Google earlier this year, alleging its Gemini AI chatbot contributed to his paranoia and eventual suicide [2].
