Daily AI Chatbot Users Show 30% Higher Odds of Depression, New Study of 21,000 U.S. Adults Finds

Reviewed by Nidhi Govil


A national survey of nearly 21,000 U.S. adults finds that people who use AI chatbots like ChatGPT or Google Gemini every day have roughly 30% higher odds of reporting at least moderate depression than less frequent users. The study highlights correlation, not causation: researchers note that loneliness and reliance on AI for emotional support may help explain the connection.

Daily AI Chatbot Use Tied to Higher Depression Rates

People who interact with AI chatbots daily face significantly elevated risks of mental health symptoms, according to a comprehensive study published in JAMA Network Open. The research, led by psychiatrist Dr. Roy Perlis from Mass General Brigham, surveyed 20,847 U.S. adults between April and May 2025 and found that those using generative AI tools at least once per day showed roughly 30% higher odds of experiencing at least moderate depression compared to less frequent users [1][2]. Within the survey population, approximately 10% reported daily AI use, while 5% engaged with systems like ChatGPT, Google Gemini, or Microsoft Copilot multiple times throughout the day [3].

Source: NBC

Understanding the Link Between AI and Depression

The study revealed that daily AI chatbot use for personal reasons, including seeking advice, recommendations, or emotional support, correlated with elevated symptoms of depression and anxiety. Among daily users, 87.1% reported using AI for personal reasons rather than strictly for work or school [2]. Participants completed standard mental health questionnaires assessing whether they had experienced trouble concentrating, sleeping, eating, or thoughts of self-harm in the previous two weeks. The research demonstrated what scientists call a "dose response": the more frequently someone used AI, the stronger their mental health symptoms appeared [2]. Notably, using AI for work or school was not associated with symptoms of depression and anxiety, suggesting that the nature of the interaction matters significantly.

Source: Digital Trends

Middle-Aged Adults Face Particularly Elevated Risks

Age emerged as a critical factor in the relationship between AI chatbot use and mental health symptoms. Regular AI users between 45 and 64 years of age showed 54% higher odds of depression, compared with 32% higher odds among those between 25 and 44 [3]. The mean age of survey participants was 47, and the study population consisted mostly of white men and women [2]. Researchers also found that men, younger adults, higher earners, those with higher education, and those in urban settings used AI more frequently, though the reasons behind these patterns remain unclear [2].

Correlation Not Causation: Which Comes First?

Dr. Perlis and his research team stress an important distinction: the findings show an association, not proof that AI chatbot use causes depression [1]. Dr. Jodi Halpern, co-director of the Kavli Center for Ethics, Science and the Public at UC Berkeley, noted that "it could go in either direction" and "could be a vicious cycle" [2]. People already experiencing depressive symptoms or loneliness may be more inclined to seek companionship through AI interactions, rather than the technology triggering their condition. Nicholas Jacobson, an associate professor at Dartmouth College, suggested that individuals may turn to AI for therapy because standard care remains difficult to access: "There's nowhere near enough providers to go around. And folks are looking for greater support than they can access otherwise" [2].

The Role of Loneliness and Emotional Support

Loneliness appears to play a pivotal role in driving people toward AI for personal use. Dr. Sunny Tang, assistant professor of psychiatry at Northwell Health's Feinstein Institutes for Medical Research, explained that "a lot of people are feeling more and more isolated these days, whether it's because they're working remotely or for other reasons." She emphasized that loneliness serves as "a really strong predictor of mental health symptoms like depression, anxiety and irritability" [3]. For some individuals, AI chatbots provide a form of social interaction that would otherwise be difficult for them [2]. However, mental health professionals caution that heavy reliance on AI interactions can make underlying issues harder to address through human connection and professional mental health support.

Need for Better Guardrails and Industry Responsibility

The findings highlight urgent questions about AI design and safety measures. Tang emphasized that AI companies must recognize that "people with mental illness and mental health symptoms are going to be actively engaging with their products" and should prioritize the principle of "first, do no harm" [3]. She called for better guardrails to ensure AI systems don't provide advice that worsens existing mental health symptoms, and suggested companies ask themselves whether they can build AI to be more supportive of people with mental health needs [3]. While chatbots built on cognitive behavioral therapy or used under clinical guidance in specialized therapeutic settings show some evidence of reducing depressive symptoms when designed with safeguards, general-purpose chatbots lack these protections [1]. The American Psychological Association advises against using AI as a replacement for therapy and psychological treatment [2]. As AI tools become more interwoven with daily life, experts recommend that people remain mindful of why they're turning to artificial voices and not hesitate to seek human connection when needed.
