How an AI Bot Can Help Mental Health Workers
Pitzer Professor Marcus Rodriguez’s research team received a Claremont Graduate University grant to develop a bot to help early-career therapists remember foundational principles.

What place does artificial intelligence have in mental health services—a field involving empathy and complex human emotions? Psychology Professor Marcus Rodriguez is not interested in AI as a stand-in therapist but rather as a tool for therapists to wield. Similar to how Siri gives map directions and Google sends calendar alerts, he believes an AI bot can offer reminders to therapists about principles of their profession.
Rodriguez and his collaborators received a $24,967 grant from Claremont Graduate University’s (CGU) BLAIS Foundation to train a bot to support community mental health workers (CMHWs) during their sessions. CMHWs are local, frontline mental health care providers who work in multidisciplinary teams that include nurses and social workers. Rodriguez’s team is partnering with CMHWs who support underserved youth in Jurupa Valley, a Riverside County community about 21 miles from Claremont.
“We’re not expecting AI to deliver treatment,” said Rodriguez. “The therapist is providing that connection and validation. But the bot can support clinicians by reinforcing the basic skills that are otherwise easy to forget. It’s not replacing therapy but helping therapists stay grounded in the fundamentals of our profession.”
Rodriguez and his partners are working with Processica to develop the interactive coaching bot using generative AI and Natural Language Processing. The project is part of a larger study by CGU Professor C. Anderson Johnson and the Institute for Health Promotion & Disease Prevention Research. The project is also a partnership with the Community Translational Research Institute and funded through Medi-Cal contracts from the Inland Empire Health Plan and Jurupa Unified School District.
Supporting early-career therapists
CGU Professor Alexandra Auslander first reached out to Rodriguez about partnering on a project in AI and mental health. Rodriguez said that he grew the most as a therapist when his clinical supervisors provided “bug-in-the-ear” supervision during actual sessions with clients. Real-time feedback was more helpful to him than reporting to a supervisor a week later, which is usually how the process works. He suggested to Auslander that they study whether AI could offer a modified version of this kind of guidance.
Rodriguez pointed out that clinicians sometimes forget the building blocks of counseling: active listening, empathetic responding, open-ended questioning, using silence purposefully, and helping clients identify their core values and live in greater alignment with them. Motivational interviewing techniques like these help clients figure out the changes they want to make in their lives.
“There’s so much we’re supposed to do,” said Rodriguez. “Set the frame. Validate. Maintain a nonjudgmental stance. Show up with a warm, grounded presence. Keep an emotional focus. Teach new skills. Be mindful of systems of oppression. Then it’s also easy for early-career clinicians to lose sight of the bigger picture, to lose the forest for the trees.”
Recognizing emotions and motivations is the first step of healing. However, Rodriguez finds that many clinicians skip validation and jump straight to problem-solving or, worse, advice-giving. He hopes the AI bot can give therapists real-time recommendations for putting their validation skills into practice, which should in turn strengthen the therapeutic alliance. Ultimately, the hope is that as clinicians see improved outcomes in their clients, they might experience less burnout.
“The therapy belongs to the clinicians,” said Rodriguez. “The bot is simply offering recommendations. It’s still up to the clinicians to use their best judgment, guided by their instincts, training, and ethical standards.”
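The article does not describe the bot’s internals, but the in-session nudging Rodriguez describes can be pictured as a loop that watches each clinician turn and flags moments where validation was skipped. The sketch below is a minimal, hypothetical illustration: a keyword heuristic stands in for the project’s actual generative AI models, and the phrase lists and function names are invented for illustration.

```python
# Minimal, hypothetical sketch of an in-session coaching loop. A keyword
# heuristic stands in for the real generative-AI/NLP model; the phrase
# lists and names here are illustrative, not the actual bot's logic.

VALIDATING = ("that makes sense", "i hear you", "it sounds like", "understandable")
ADVISING = ("you should", "have you tried", "my advice", "just do")

def coach(clinician_turn: str) -> str | None:
    """Return a gentle reminder if the clinician jumps to advice
    without validating first; otherwise return None."""
    text = clinician_turn.lower()
    validated = any(phrase in text for phrase in VALIDATING)
    advising = any(phrase in text for phrase in ADVISING)
    if advising and not validated:
        return "Reminder: validate the client's emotion before problem-solving."
    return None

# Example turns from a mock session
for turn in [
    "It sounds like this week felt overwhelming.",        # validation -> no nudge
    "You should just make a schedule and stick to it.",   # advice-first -> nudge
]:
    print(coach(turn) or "(no nudge)")
```

Consistent with Rodriguez’s framing, the output is only a recommendation; the clinician decides whether and how to act on it.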
Training to decode and validate emotions
Another dimension Rodriguez and his collaborators plan to explore is whether AI can pick up on emotions and read between the lines of a client’s words. Rodriguez’s previous experience with AI made him curious about this possibility.
“Google’s AI analyzes video calls to estimate participant engagement by tracking factors like eye contact, body posture, and speech frequency,” said Rodriguez. “In my limited experience with the technology, the feedback it generated aligned closely with what I experienced as a participant in the meeting.”
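Neither Google’s exact method nor the project’s planned approach is detailed in the article, but engagement scoring of this kind is often a weighted combination of observed behavioral features. The sketch below is a generic, hypothetical illustration of that idea; the features and weights are invented.

```python
# Generic illustration of feature-based engagement scoring, in the spirit
# of the video analysis Rodriguez describes. The actual method is not
# detailed in the article; features and weights here are hypothetical.

def engagement_score(eye_contact: float, upright_posture: float,
                     speech_share: float) -> float:
    """Combine normalized behavioral features (each in [0, 1]) into a
    single 0-1 engagement estimate via a hypothetical weighted average."""
    weights = {"eye": 0.4, "posture": 0.3, "speech": 0.3}
    return (weights["eye"] * eye_contact
            + weights["posture"] * upright_posture
            + weights["speech"] * speech_share)

print(round(engagement_score(0.8, 0.6, 0.5), 2))  # prints 0.65
```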
Rodriguez is feeding the AI bot information about validation and motivational interviewing, drawing on materials that include his own articles, book chapters, training materials, and presentations. Once the AI developers finish programming the bot, Rodriguez will pose as a client to test it. After several rounds of testing, the team plans to field-test the bot with clinicians to see whether they find it useful.
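The article does not say how these materials are wired into the bot. One common pattern for grounding a generative model in source documents is retrieval-augmented generation, sketched below with scikit-learn; the excerpts are invented stand-ins for Rodriguez’s materials, and this is only one plausible approach, not the team’s confirmed pipeline.

```python
# Hypothetical sketch of grounding the bot's recommendations in source
# materials via retrieval. The documents below are invented stand-ins for
# excerpts from articles, chapters, and training materials.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Validation: acknowledge the client's emotion as understandable.",
    "Motivational interviewing: use open-ended questions to elicit change talk.",
    "Reflective listening: restate the client's meaning before responding.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(corpus)

def retrieve(query: str) -> str:
    """Return the excerpt most relevant to the current session moment;
    a generative model would then turn it into a concrete suggestion."""
    scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
    return corpus[scores.argmax()]

print(retrieve("client shared sadness and the clinician jumped to advice"))
```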
The project aligns with the research Rodriguez conducts with students in his Global Mental Health Lab at Pitzer, whose mission is “to increase access to effective mental health care in a way that is affordable, feasible, scalable, and culturally appropriate.” Rodriguez plans to include his student James “Oliver” Dean ’26 in the CGU-funded project.
Author: Bridgette Ramirez