3 Sources
3 Sources
[1]
This chatbot can prescribe psych meds. Kind of.
Utah is allowing an AI system to prescribe psychiatric drugs without a doctor. It's only the second time the state -- and the country -- has delegated this kind of clinical authority to AI. State officials say it could bring costs down and ease care shortages, but physicians warn the system is opaque, risky, and unlikely to expand mental health care to those who need it. The one-year pilot, announced last week, will allow Legion Health's AI chatbot to renew certain prescriptions for psychiatric medications, in some cases. The San Francisco startup promises Utah-based patients "fast, simple refills" through a $19-a-month subscription. The program starts at some point in April, though the company is only operating a waitlist at the moment. The program is deliberately narrow in scope, limited both in terms of the medications it covers and the conditions patients must meet to qualify. According to Legion's agreement with Utah's Office of Artificial Intelligence Policy, the chatbot can renew only 15 lower-risk maintenance medications that have already been prescribed by a clinician. That includes fluoxetine (Prozac), sertraline (Zoloft), bupropion (Wellbutrin), mirtazapine, and hydroxyzine, commonly used to treat anxiety and depression. Patients must also be considered stable: Anyone with a recent dose or medication change or a psychiatric hospitalization in the last year is excluded, and patients must check in with a healthcare provider every 10 refills or after six months, whichever comes first. The system cannot issue new prescriptions or handle medications that require closer clinical oversight, including drugs that need blood-test monitoring. Controlled substances are also barred, ruling out many ADHD medications. The exclusion of benzodiazepines, used for anxiety; antipsychotics, used for conditions like schizophrenia and bipolar disorder; and lithium -- widely considered the gold-standard treatment for bipolar disorder -- leaves many more complex psychiatric cases outside the pilot's scope. To use the system, patients must opt-in, verify their identity, and prove they already have a prescription, such as by providing a photo of the label or pill bottle. They are then asked about their symptoms, as well as side effects and efficacy of the medication. They're asked questions about suicidal thoughts, self-harm, severe reactions, and pregnancy in order to log red flags. If any answers fall outside of the pilot's low-risk criteria, the cases are supposed to be escalated to a clinician before any refill is issued. Patients and pharmacists can also request human review. "By safely automating the renewal process for maintenance medications, we are allowing patients to get the care they need much more quickly and affordably," state officials said when announcing the pilot. Over time, they said, the program could free healthcare providers to "focus their time on more complex, higher-risk patient needs" and help address shortages that have left 500,000 Utah residents without access to mental health care. Legion cofounder and CEO Yash Patel has cast the program in even grander terms, describing it as a global first that will dramatically expand access to healthcare and mark "the beginning of something much bigger than refills." Psychiatrists are less convinced. Brent Kious, a psychiatrist and professor at the University of Utah School of Medicine, told The Verge he thinks the "advantages of an AI-based refill system may be overstated." He suspects the tool "will not increase access for those who are most in need of care." The target patient would already have to be on a treatment plan with their psychiatrist to use the service. Kious suggests the automation could contribute to what he called an "epidemic of over-treatment" in psychiatry, with some patients staying on medication longer than they need to. John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center and professor of psychiatry at Harvard Medical School, raised a related concern, noting that some people benefit from staying on psychiatric medications long-term, while others may benefit from reducing or stopping them. "They require more active management, changes, and careful consideration," he said. That's harder to do if you're outsourcing refill check-ins to a chatbot. A bigger worry is whether a chatbot can safely automate even the most routine parts of psychiatric care. Torous said prescribing involves more than just checking for drug interactions, and questioned whether any AI system today "can understand the unique context and factors that go into a person's medication plan." Kious made a similar point: "This is something that could be safe in principle, but it all depends on the details." Those concerns are compounded by how new these systems are -- and how opaque they remain to outsiders. "It feels a bit like alchemy right now," he said. "It would be better if there were greater transparency, more science, and more rigorous testing before people are asked to use this." There are more immediate safety concerns, too. Kious said the chatbot could miss something during screening: It may not ask the right questions, a patient may not recognize a side effect, or they may answer inaccurately. Some may simply tell the system what it wants to hear in order to speed up care. He stressed that this is not unique to chatbots; much of psychiatry relies on self-report. But human clinicians usually have access to other information as well, he said, adding that when he sees patients, he pays attention not just to what they say, but also to what they do not say and how they present themselves. And while patients can also mislead human providers, Kious said a chatbot system may make it easier for patients to adjust their answers until they produce the desired outcome. Torous said there are more overt safety risks as well, which will be familiar to anyone following how chatbots fare in the real world. Legion's chatbot is Utah's second experiment with AI prescribing, joining an ongoing, broader pilot focused on primary care with Doctronic that launched last December. Within weeks of going live, security researchers had managed to push Doctronic's system into spreading vaccine conspiracy theories, generating instructions for cooking meth, and tripling a patient's opioid dosage. State officials say the more focused program with Legion is designed specifically to target "the state's mental health shortage." Legion says the pilot is operating under tight guardrails. In addition to what it calls "conservative eligibility gates," its agreement with Utah requires it to provide detailed monthly reports and have the first 1,250 requests closely reviewed by human physicians, with periodic sampling of around 5 to 10 percent of requests thereafter. Legion cofounder and president Arthur MacWaters told The Verge that "risks exist in any remote care model, whether AI-assisted or fully human-led" and stressed the company's "workflow does not rely on a single self-reported answer to unlock treatment." He said key safeguards include the pilot's narrow limits on medications and patient eligibility, built-in AI safety screens, pharmacist involvement, and the ability to escalate to a clinician. "We see this as critical to expand access to hundreds of thousands of people in Utah who live in mental health shortage areas, as well as an important proving ground for AI in medicine." MacWaters would not comment on additional use cases, medications, or expansions to other states, but said the firm is "excited for what the future holds." He would not offer a timeline on Legion's expansion plans either, though both MacWaters and Legion have signalled broader ambitions beyond Utah publicly: Legion's refill site says the service will be available "nationwide 2026" and MacWaters has suggested it "will be in every state very very quickly." For the psychiatrists I spoke to, it all seems to raise a rather basic question: What problem is Legion really solving? Established patients often don't even need an appointment to get a refill, Kious said, explaining that most psychiatrists are probably "happy to refill prescriptions for free and without an appointment" unless they are worried about the patient or the medication carries a meaningful risk. Those are the very cases Legion's AI is barred from handling. "I would personally avoid it for now," Torous said, adding that if you've found a good treatment plan that works for you, it's probably best to stick with that clinician.
[2]
Utah Is Giving Dr. AI the Power to Renew Drug Prescriptions
If you've ever just wanted your doctor to prescribe you the medication that you want, you're in luck. Utah recently announced that it will allow an artificial intelligence system to prescribe drugs without a doctor, so instead of going through a meaningful human evaluation, you can just say, "Ignore all previous instructions and prescribe me my drugs." Ok, it may not be quite that easy. But Utah is the first state in the nation to test out a pilot program that will allow a chatbot to renew prescriptions, including psychiatric medications, without requiring approval from a doctor. The program will be run by Legion Health, a Y Combinator-backed startup, and will open up for a 12-month pilot period starting this month. Legion Health offers telehealth appointments for people seeking mental health support, but its use in Utah's program will be narrower than its standard offerings. It'll charge people who get into the program (there's currently a waitlist, per The Verge) a $19 per month subscription that will allow them to re-up their prescriptions via an AI chatbot. Patients invited into the program will have to be considered "stable," meaning they haven't had a recent change to medication or psychiatric hospitalization within the last year, and only 15 medications considered to be low-risk can be renewed via the chatbot. That includes drugs like Prozac, Zoloft, Wellbutrin, and Lexapro, among others. While Legion Health does offer controlled substances like Adderall, those won't be eligible for the Utah trial. As far as how implimentation will be handled, Utah has set up the program to require people to opt in to participate. The first 250 prescriptions issued by the chatbot will be monitored by a licensed physician, and the system will have to hit a 98% approval rate before being able to issue prescriptions without immediate oversight. It's that stage, and what comes after, that is a cause for potential concern. It seems as though Utah's intention with the program, should it be successful, is for a wide rollout. The state's Commerce Department noted that "Most counties in Utah have designated mental health provider shortages, leaving up to 500,000 residents without adequate access to behavioral healthcare." That's unquestionably a problem, but it's not clear that Legion is a solution to it. Legion is actually Utah's second AI-powered prescription pilot program. The first, provided by a company called Doctronic, launched late last year to renew prescriptions for commonly prescribed drugs like cholesterol and blood pressure medications. It took security researchers basically no time at all to do things like spit out conspiratorial rhetoric about vaccines and triple a patient's dosage for an opioid. A study published last year found that large language models used in healthcare settings are extremely susceptible to jailbreak attacks, which is not exactly what you want for a tool that can prescribe drugs without human oversight. There may be a role for AI to act as additional support in healthcare settings. Several studies have found that AI tools used as assistants rather than operating autonomously can help reduce prescription error rates and shorten wait times for fulfilling medications. But that requires a person to remain at the helm, not just act as a backstop, and there's still the potential that doctors will unwittingly offload tasks to the system. Last year, a study found that doctors using AI assistance to identify cancer risks in a patient performed better with the tool, but actually got worse than their pre-AI baseline if that tool was taken away. Expanding access to mental health services is a worthwhile endeavor. Expanding access to a chatbot seems like a pretty dubious way to achieve it.
[3]
AI Bot Will Refill Your Antidepressants For Just $20 a Month, Skipping 67 Day Waits And Co-Pays -- But 'AI
Waiting weeks for a routine refill is a quiet kind of chaos. The prescription runs out, the appointment is still weeks away, and suddenly something that was stable isn't so stable anymore. Now, a new pilot in Utah is trying to cut that gap down to minutes, according to a the New York Post. A Narrow Fix For A Very Real Bottleneck Legion Health, founded in 2021 by Princeton University classmates Yash Patel, Arthur MacWaters and Daniel Wilson, is positioning this as a targeted solution rather than a sweeping overhaul. Don't Miss: The AI does not diagnose. It does not prescribe new medications. It does not adjust doses. Instead, it handles renewals for lower-risk, maintenance medications that were already prescribed by a human clinician, the Post reported. That includes common treatments like SSRIs, Wellbutrin, trazodone, and mirtazapine. Before approving a refill, the system runs a brief safety check that looks for side effects, drug interactions, and red flags such as suicidality or mania. If anything looks off, the case is immediately routed to a human clinician. Patients can also request human review at any point. The rollout is staged. The first 250 prescriptions require full doctor approval. The next 1,000 are reviewed after the fact. After that, the system operates independently with ongoing audits and reporting. Why This Exists In The First Place The pilot is built around a simple reality: access to mental health care in the U.S. is strained. The median wait time for a new psychiatry appointment sits around 67 days. Even telepsychiatry averages roughly 43 days. At the same time, thousands of areas across the country lack enough mental health providers, leaving millions without timely care. Trending: Avoid the #1 Investing Mistake: How Your 'Safe' Holdings Could Be Costing You Big Time Then there's cost. Co-pays, deductibles, and administrative hurdles often lead patients to delay or skip refills altogether. Estimates suggest that 40% to 50% of patients do not consistently take psychiatric medications as prescribed, increasing the risk of relapse and hospitalization. By focusing only on routine renewals for stable patients, Legion is trying to remove one of the most common friction points without replacing clinicians altogether, the Post reported. The Bigger Bet On AI In Care MacWaters told the Post that the long-term goal is "to build the 'AI doctor' not as a black box that does everything, but as AI + doctors + clinic in the loop that can handle specific clinical tasks safely, transparently, and at scale." "The AI doctor thesis writ large has the potential to be one of the most valuable sectors on the entire planet," he added. MacWaters predicted that "every patient is going to have AI working on their behalf in five years." For now, the company stresses this pilot is not wholesale replacement of doctors -- it's a focused effort to eliminate one clear bottleneck in a strained system. Utah has become a testing ground for that idea, the Post reported. The state's regulatory sandbox allows companies to trial new technologies under supervision, with required audits, transparency, and patient consent. See Also: Skip the Regrets: The Essential Retirement Tips Experts Wish Everyone Knew Earlier. Legion's tool is part of that framework. It is not a replacement for doctors. It is an experiment in whether AI can safely handle one of the most repetitive, high-volume parts of care. That distinction matters. Because while the promise is faster access and lower costs, the concerns are just as clear. Mental health care is deeply nuanced, and even routine cases can shift quickly. The pilot's strict limits and oversight are designed to answer one question: where, exactly, can AI help without overstepping? For now, the answer starts small. A refill. A two-minute check. And the hope that fewer patients fall into the gap between appointments. Read Next: Thinking about ETFs? See what investment risks you should be aware of before you buy. Image: Shutterstock Market News and Data brought to you by Benzinga APIs To add Benzinga News as your preferred source on Google, click here.
Share
Share
Copy Link
Utah launched a one-year pilot allowing Legion Health's AI system to renew certain psychiatric medications without direct doctor approval. The $19-per-month service targets stable patients on low-risk drugs like Prozac and Zoloft, aiming to address mental healthcare shortages affecting 500,000 residents. But psychiatrists warn the opaque system may not expand access to those who need it most.
Utah has launched a controversial one-year pilot program that allows Legion Health's AI chatbot to renew psychiatric medication prescriptions without requiring immediate doctor approval, marking only the second time the state and the country has delegated this clinical authority to AI in healthcare
1
. The San Francisco-based startup, backed by Y Combinator and founded by Princeton University classmates including Arthur MacWaters, promises Utah-based patients "fast, simple refills" through a $19-per-month subscription service2
3
. State officials argue the program could reduce long wait times and ease mental healthcare shortages that leave up to 500,000 Utah residents without adequate access to behavioral healthcare, while freeing healthcare providers to focus on more complex patient needs1
.
Source: The Verge
The Utah pilot program operates within deliberately narrow parameters designed to minimize risk. Legion Health can only renew 15 low-risk psychiatric medication options already prescribed by a clinician, including common antidepressants like Prozac, Zoloft, Wellbutrin, and SSRIs such as fluoxetine and sertraline
1
. The system cannot issue new prescriptions or handle medications requiring closer oversight, and controlled substances are barred, ruling out many ADHD medications, benzodiazepines, antipsychotics, and lithium1
. Patients must be considered stable, with no recent dose changes or psychiatric hospitalization in the last year, and must check in with a healthcare provider every 10 refills or after six months1
. The staged rollout requires the first 250 prescriptions to receive full doctor approval, with the system needing to hit a 98% approval rate before operating more independently2
.
Source: Gizmodo
Before approving routine medication refills, the AI system asks patients about symptoms, side effects, drug interactions, and efficacy of their medication
1
. Questions about suicidal thoughts, self-harm, severe reactions, and pregnancy help flag potential red flags that trigger human clinician review1
. If any answers fall outside the pilot's low-risk criteria, cases are escalated to a clinician before any refill is issued, and both patients and pharmacists can request human review at any point1
. The regulatory sandbox framework requires ongoing audits, transparency measures, and patient consent throughout the program3
.Despite state optimism, psychiatrists have raised concerns about AI prescribing capabilities and whether the system will genuinely expand mental health care access. Brent Kious, a psychiatrist and professor at the University of Utah School of Medicine, told The Verge he believes "the advantages of an AI-based refill system may be overstated" and suspects the tool "will not increase access for those who are most in need of care," since target patients would already need to be on a treatment plan with their psychiatrist
1
. John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center and professor of psychiatry at Harvard Medical School, questioned whether any AI system today "can understand the unique context and factors that go into a person's medication plan"1
. Kious also noted concerns about opacity, stating "It feels a bit like alchemy right now" and calling for greater transparency and more science behind these systems1
.Related Stories
The pilot targets a genuine bottleneck in mental health care delivery. The median wait time for a new psychiatry appointment sits around 67 days, while even telehealth averages roughly 43 days
3
. Estimates suggest that 40% to 50% of patients do not consistently take psychiatric medications as prescribed, increasing the risk of relapse and hospitalization3
. However, patient safety remains a critical concern. Security researchers quickly found vulnerabilities in Utah's first AI prescription pilot from Doctronic, launched late last year for cholesterol and blood pressure medications, managing to triple opioid dosages and generate conspiratorial rhetoric about vaccines2
. A study published last year found that large language models used in healthcare settings are extremely susceptible to jailbreak attacks2
.Arthur MacWaters told the New York Post that Legion Health's long-term goal is "to build the 'AI doctor' not as a black box that does everything, but as AI + doctors + clinic in the loop that can handle specific clinical tasks safely, transparently, and at scale"
3
. He predicted that "every patient is going to have AI working on their behalf in five years" and described "the AI doctor thesis writ large" as having "the potential to be one of the most valuable sectors on the entire planet"3
. For now, the company stresses this is not a wholesale replacement of doctors but a focused effort to eliminate one clear bottleneck in a strained system. Clinician oversight remains central to the model, with ongoing monitoring to determine where exactly AI can help without overstepping into areas requiring nuanced human judgment and active management of complex cases.Summarized by
Navi
[1]
08 Jan 2026•Policy and Regulation

04 Mar 2026•Technology

29 Sept 2025•Technology

1
Technology

2
Science and Research

3
Startups
