AI in Mental Health: Experts Map Automation Levels as Therapists Navigate New Technology

Reviewed by Nidhi Govil


As AI chatbots for emotional support gain traction, researchers at the University of Utah propose a four-level automation framework to guide AI integration in mental healthcare. Meanwhile, mental health care providers express concerns about job displacement, and the American Psychological Association urges therapists to discuss AI usage with patients. The debate centers on how artificial intelligence can support human therapists without replacing the deeply personal nature of psychotherapy.

AI Integration Reshapes Mental Healthcare Delivery

Artificial intelligence is entering mental healthcare at a rapid pace, driven by a global crisis in access to psychological services. The World Health Organization estimates that depression alone affects more than 300 million people globally, yet patients often face months-long waits for therapy [1]. AI tools for mental health are expanding to fill this gap, from conversational therapy chatbots like Woebot and Wysa to predictive analytics that identify suicide risk [1]. Digital therapeutics platforms such as reSET deliver structured cognitive behavioral therapy modules that adapt based on patient progress, while behavioral monitoring applications track sleep patterns and smartphone usage as potential mental health indicators [1].

Source: Medscape

University of Utah Proposes Automation Framework for Psychotherapy

Recognizing the need for clarity around AI automation in psychotherapy, researchers from the University of Utah developed a comprehensive framework published in Current Directions in Psychological Science. Led by educational psychology professor Zac Imel, the team outlined four distinct categories representing different levels of automation in therapy [2]. Category A involves scripted systems in which chatbots deliver prewritten content through decision trees. Category B focuses on AI evaluating therapists by reviewing sessions and providing feedback. Category C describes AI assisting therapists with suggested interventions while human therapists deliver care. Category D represents fully autonomous AI providing therapy directly to patients [2]. Associate professor Vivek Srikumar of the Kahlert School of Computing compared these levels to self-driving cars, noting that different automation levels carry vastly different risks and benefits [5].

Source: Newswise

Therapists Urged to Discuss AI Usage With Patients

The American Psychological Association now recommends that mental health care providers ask patients about their use of AI chatbots for emotional support, much as they inquire about sleep, diet, and exercise. "We're not saying that AI use is good or bad," explains Shaddy Saba, assistant professor at New York University's Silver School of Social Work, "just like we wouldn't say substance use is necessarily good or bad" [3]. Learning about patient interactions with ChatGPT or other AI systems can provide valuable insight into coping strategies, relationship challenges, and topics patients may avoid discussing directly [3]. Former National Institute of Mental Health director Dr. Tom Insel notes that people often use chatbots to discuss things they cannot share with others for fear of judgment, including suicidal thoughts [3]. However, therapists should also inform patients about risks, particularly around data privacy, as many AI companies use conversations to train their models [3].

Mental Health Care Workforce Expresses Concerns About Job Displacement

The rapid adoption of AI in mental health has sparked anxiety among practitioners. In March 2026, 2,400 mental health care providers at Kaiser Permanente facilities in Northern California and the Central Valley staged a 24-hour strike, partly in protest of changes to the triage system [4]. Licensed clinical social worker Ilana Marcucci-Morris described how her triage role was reassigned in May 2025, with 10-to-15-minute screenings by licensed clinicians replaced by unlicensed operators following scripts, or by e-visits [4]. At Kaiser Permanente's Walnut Creek facility, the triage team shrank from nine providers to three [4]. Kaiser Permanente confirmed it is evaluating AI tools from the U.K. company Limbic, though the technology is not currently in use, and emphasized that "our use of AI does not replace clinical expertise" [4].

Support for Human Therapists Through Administrative Automation

Despite these fears, the American Psychological Association's Vaile Wright says she has not yet seen AI replace mental health jobs. Instead, AI adoption has focused on improving efficiency in documentation and other administrative tasks [4]. Nearly 40 products now offer transcription and documentation support, helping therapists with time-consuming work like billing insurance companies and updating electronic health records [4]. The University of Utah team is partnering with SafeUT, Utah's statewide text-based crisis line, to develop tools that evaluate crisis counselors' sessions and provide feedback for skill maintenance and development [2]. Imel emphasizes that evaluating psychotherapy sessions is "tremendously labor-intensive" and therefore rarely done, but large language models can quickly capture core treatment components and provide real-time feedback to therapists [5].

Balancing Innovation With Patient Safety and Ethical Considerations

As AI in mental health continues to expand, questions about patient safety, consent, and clinical responsibility intensify. The University of Utah researchers note that while anyone can now turn to ChatGPT for counseling that resembles psychotherapy, large language models carry significant risks: they fabricate information, encode biases, and respond unpredictably [5]. Srikumar questions why one would deploy the riskiest version of a tool when lighter-weight versions, such as note-taking applications, already make clinicians' lives easier [5]. Companies like Limbic, whose products are deployed across 63% of the U.K.'s National Health Service and in 13 U.S. states, offer AI assistants for patient intake and support, including a chatbot trained on cognitive behavioral therapy that provides evidence-based tools at 3 a.m., when human therapists are unavailable [4]. The challenge ahead lies in determining which tasks benefit from AI support for clinicians and which require the irreplaceable human connection at the heart of psychotherapy.

Source: NPR
