The Future of AI in Mental Health Care: Where Do We Stand?

MAY 13, 2026
Munsch Hardt Legal Health Care Update

Artificial intelligence is making significant inroads into mental health care — and it's sparking important conversations about the future of the profession.

The Current Landscape:

Most AI adoption today focuses on administrative efficiency (documentation, billing, and electronic health records), freeing providers to spend more time with patients. Companies like Blueprint and Limbic are leading this space, with Limbic now deployed across 63% of the UK's National Health Service and serving patients in 13 U.S. states.

Limbic and Direct Patient Care:

Notably, Limbic is not limiting itself to back-office tasks. Its chatbot, Limbic Care, is trained on cognitive behavioral therapy (CBT) skills and provides direct patient support, offering evidence-based CBT tools and techniques in real time, including outside normal business hours. Kaiser Permanente has confirmed it is currently evaluating Limbic's tools to assist members in accessing care. This kind of AI-driven, patient-facing intervention raises an important question: when an AI tool moves beyond scheduling and billing and into the therapeutic space, who bears responsibility if something goes wrong?

The Tension:

Not everyone is on board. Last month, 2,400 mental health providers at Kaiser Permanente went on strike, citing concerns that changes to triage systems could open the door to AI replacing licensed clinicians. Their message: don't cut clinicians out of the process.

What Experts Are Saying:

According to Dr. John Torous of Beth Israel Deaconess Medical Center, we're likely moving toward a "hybrid" model — where human providers deliver therapy while AI assists with patient support, skill-building, and real-time feedback.

But challenges remain: AI clinical tools are largely untested, implementation is expensive, and regulation is minimal.

The Legal Implications:

As AI tools like Limbic Care move into direct patient interactions, providing therapeutic guidance rather than merely handling paperwork, the legal risk profile changes dramatically. If a patient suffers a bad outcome after relying on AI-generated therapeutic advice, several legal theories could come into play. Medical malpractice and negligence claims could target the health system that deployed the tool, particularly if the AI provided care that fell below the applicable standard of care or if the system failed to ensure adequate clinical oversight. Product liability claims could reach the AI developer itself, on theories that the tool was defectively designed or that it carried inadequate warnings about its limitations.

There are also serious questions around informed consent: were patients told they were interacting with an AI rather than a licensed clinician, and did they understand the risks? The current regulatory landscape, as experts note, offers little guidance: there is minimal regulation of clinical AI tools, and the burden largely falls on providers to determine whether the tools they adopt are safe and effective.

Health systems integrating AI into patient care should be thinking carefully about contractual indemnification with AI vendors, clinical governance and oversight protocols, transparent patient disclosures, and compliance with state and federal telehealth and licensure requirements. The legal frameworks governing AI in health care are still developing, but the liability exposure is real, and it is growing.

The Bottom Line:

AI will likely transform mental health care — but human-driven therapy isn't going anywhere. As Dr. Vaile Wright of the American Psychological Association puts it: "There are no AI digital solutions that can replace human-driven psychotherapy or care."

The key? Involving clinicians in AI development to ensure technology serves patients — not the other way around — and building the legal and regulatory infrastructure to protect patients before, not after, something goes wrong.

What are your thoughts on AI's role in mental health care?