AI in the mental health care workforce is met with fear, pushback — and enthusiasm



Artificial intelligence is reshaping mental health care—but not without controversy. From large hospital networks to solo practitioners, providers are experimenting with AI tools to streamline treatment delivery. Yet this rapid adoption, coupled with alarming reports of individuals experiencing harm after interacting with general-purpose chatbots, has sparked significant concern among clinicians and researchers.


> *"There is a lot of fear and anxiety about AI,"* says psychologist Vaile Wright, senior director of health care innovation at the American Psychological Association (APA). *"And in particular fear around AI replacing jobs."*


Those anxieties came to a head on March 18, when 2,400 mental health providers employed by Kaiser Permanente in Northern California and the Central Valley staged a 24-hour strike.


Triage by Algorithm?


Ilana Marcucci-Morris, a licensed clinical social worker based in Oakland, California, worked as a triage clinician for Kaiser Permanente's telepsychiatry intake hub starting in 2019. That changed in May 2025.


*"I have been reassigned from triage to other duties,"* she says.


The shift reflects Kaiser's broader effort to overhaul its intake process. *"What used to always be a 10- to 15-minute screening from a licensed clinician like myself is now being conducted by unlicensed lay operators following a script,"* Marcucci-Morris explains. *"Or, an e-visit."*



She and her colleagues worry that this downsizing of licensed triage staff is a stepping stone toward AI automation. At Kaiser's Walnut Creek location, a triage team once staffed by nine providers has shrunk to three, according to Harimandir Khalsa, a marriage and family therapist who also performs triage duties.


*"The jobs that we did [are] being handled by these telephone service representatives,"* Khalsa says.


The March 18 strike centered in part on these staffing changes. *"Part of our unfair labor practice strike really is about the erosion of licensed triage within the health plan,"* says Marcucci-Morris.


In response, Lionel Sims, senior vice president of human resources at Kaiser Permanente Northern California, stated: *"At Kaiser Permanente, our use of AI does not replace clinical expertise."* The health system confirmed it is evaluating AI tools from UK-based company Limbic, though it emphasized: *"Limbic is not in use at this time."*


Where AI Is Actually Being Used


Despite headlines about AI therapists, widespread clinical replacement isn't happening—yet.


*"I have not seen within mental health care any jobs be replaced by AI as of yet,"* says Wright of the APA. Instead, adoption has focused on administrative support.


*"One clear positive use case of AI tools is in the use of improving efficiencies around documentation and other automated types of activities,"* she notes.


Tasks like insurance billing or updating electronic health records consume hours that could otherwise be spent with patients. *"Most providers want to help people, and when they get mired down with excessive paperwork or documentation to get paid, that takes away time from direct patient care,"* Wright adds. *"And so I do think that there are benefits to incorporating these tools into your practice based on your personal comfort level."*


A growing marketplace supports this shift: nearly 40 products now offer transcription or documentation assistance for mental health providers. Companies like Blueprint provide AI assistants that summarize therapy sessions, update records, and help track patient progress.


Other firms target large health systems. Limbic, for example, has built AI assistants for intake, patient support, and more. *"We are deployed across 63% of the U.K.'s National Health Service, and we are currently serving patients in 13 U.S. states,"* says founder and CEO Ross Harper.


One of Limbic's tools, Limbic Care, is trained in cognitive behavioral therapy (CBT) techniques and offers direct patient support. Harper illustrates a potential use case: *"Let's imagine you're an individual. It's 3 a.m. in the morning on a Wednesday. You can't sleep, and you think 'I may actually need some help.'"* In that moment, a patient could access Limbic Care via a portal and receive evidence-based CBT strategies immediately.


Clinical AI: Not Ready for Prime Time?


Despite growing interest, direct clinical use of AI remains limited.


*"We're not seeing a lot of clinical use of AI today,"* says Dr. John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center in Boston. Why? *"They're not well tested,"* he says. Additionally, implementation can be costly and complex: *"You need a large IT team. You need infrastructure. There are safety things that have to go in place."*


Most small practices and community mental health centers lack the resources to adopt these platforms. Wright agrees, noting the regulatory gray area: *"At this point, because there is little regulation, it is incumbent on the provider to do the legwork and the research to figure out, 'Are the tools that are on the market and available, safe and effective?'"*


Toward a Hybrid Future


Torous believes AI adoption will accelerate as the technology matures—and that clinicians must engage proactively.


*"I think AI is going to transform the future of mental health care for the better,"* he says. *"But we as the clinical community have to learn to use it and work for it. So that means there's going to be a lot more training. We have to upskill ourselves."*


Opting out entirely isn't sustainable, he warns: *"Because if you take this approach and companies come in with products that may be good, maybe really bad and dangerous, we won't know how to evaluate them."*


Involving clinicians in AI development is critical—a point echoed by striking Kaiser workers who want a seat at the table. *"If AI is utilized, don't keep us clinicians out of the human process of engaging with our patients in determining the right level of care,"* says Khalsa.


As tools improve, Torous envisions a "hybrid or blended model of care": human therapists delivering treatment while AI assistants support patients with homework, skill practice, and real-time progress feedback for providers.


Wright emphasizes that human connection remains irreplaceable: *"There are no AI digital solutions that can replace human-driven psychotherapy or care."*


The path forward, experts suggest, isn't about choosing between humans and machines—but thoughtfully integrating technology to expand access, reduce burnout, and keep the therapeutic relationship at the heart of healing.
