Over 1,600 therapists shared how they’re integrating artificial intelligence into their practices, their hopes and reservations about this new technology, and what they wish legislators knew.
Few of us are strangers to disruptive technology. Just over five years ago, the mental health industry was upended by Covid and rebuilt online. Now we’re facing artificial intelligence, whose implications are so far-reaching it’s hard to imagine where they end. For many therapists, the AI boom is yet another clash between the worlds of tech and mental health, where the responsibility for maintaining integrity and quality of care often lands heavily on providers’ shoulders.
“Many clinicians struggle to reconcile their commitment to the deeply human relationships at the core of psychotherapy with new technologies, including AI,” says Elisabeth Morray, Vice President of Clinical at Alma. “I believe that AI-supported tools can effectively address certain burdensome tasks, including documentation, in a way that actually enhances the therapeutic alliance, but clinical oversight is essential.”
To find out where clinicians stand on AI, we conducted a survey of over 1,600 therapists. The results reveal diverse and sometimes opposing views, but one thing is certain: therapists are eager to play a bigger role. 79% believe it’s critical that they have a voice in designing the next generation of AI-supported tools.
“AI is not a passing trend; it’s the next step in technological evolution. As providers, we have a choice: ignore it and let others dictate its role in mental health care, or engage with it and shape its use to truly support clinicians and patients. Personally, I’d rather have a seat at the decision-making table.” —Anastasia Sfiroudis, LCSW
AI is quickly making its way into therapists’ workflows, with 36% of surveyed mental health providers using AI-supported tools with at least some of their clients.
69% of mental health providers using AI-supported tools reported an improvement in their ability to deliver care to their clients.
Providers using AI-supported tools also felt they had more time to focus on serving their clients instead of getting bogged down in administrative tasks: 45% indicated that more than half of their admin time could realistically be handled by AI without compromising quality of care. Among providers not currently using AI, that figure dropped sharply, with only 11% believing AI could take on half of their admin time.
“AI note-writing helps me spend less time drowning in paperwork and more time helping my clients. It keeps my notes organized, clear, and complete, so I don’t have to scramble to remember key details later. It also helps me track progress more easily, making sure treatment stays on point. Plus, it takes some of the mental load off, so I’m not running on empty by the end of the day.” —Anonymous Therapist
54% of therapists we surveyed indicated they’ve never used AI-supported tools with their clients, while an additional 10% noted they’ve used AI-supported tools in the past but don’t currently use them.
The potential for bias is a major concern: 62% of all therapists we surveyed believe that AI tools could reinforce biases in mental health care.
“AI models are only as good as the data they are trained on. Without diverse, representative data, these systems may reinforce biases and fail to provide equitable recommendations.” —Angelita Pritchett, LCSW
“Person-to-person interactions help us navigate interpersonal and intrapersonal growth. Talking to an AI bot is a one-sided interaction, which might provide some help personally, but lends itself to unrealistic expectations of others.” —Marshneil Lal, LCSW
Only 25% of therapists feel confident that their clients are comfortable with them using AI-supported tools in their practice.
“As a social worker, I stand behind a person's right to choice and autonomy. If a person decides to seek help from an AI chatbot, who am I to say that isn't allowed? I encourage regulators to help industry set standards for identifying when these tools are helping people versus when these tools are inhibiting an individual's progress, and/or potentially harming them.” —Sarah Shea, LCSW
We asked therapists to tell us how the use of AI in mental health care makes them feel, and the responses varied widely based on their own personal usage of AI in their practice. The majority of therapists who currently use AI tools feel hopeful about this new technology, while the vast majority of therapists not currently using AI tools expressed concern.
Despite the range of experiences and perceptions, 91% of therapists we surveyed believe that while therapy may be supplemented by AI-supported tools, the human-to-human relationship is core to driving positive outcomes in therapy.
64% of therapists believe there should be federal legislation regulating the use of AI in mental health care, and many would like the opportunity to communicate with regulators about putting guardrails in place.
“I would want regulators to understand how delicate a client and therapist relationship is and how one incorrect move could take a long time to repair.” —Kristin Oparaji, LPC
“I would tell them not to regulate it at all. This is an industry in its infancy, and any regulation will enforce the biases of the regulators. It is absolutely vital not to interfere with the direct market feedback from the actual consumers and providers.” —Anonymous Therapist
Without regulation, mental health care providers and organizations must take a vigilant, cautious approach when integrating AI, and establish common principles and guidelines to ensure that it’s used ethically and effectively.
Protecting the human-therapeutic relationship, creating transparency around when and how AI is involved, gaining prior consent, safeguarding patient data, taking steps to reduce bias, and tracking impact and outcomes are all necessary considerations.
Meanwhile, providers are entering new territory with their clients, some of whom will be vulnerable to AI-generated misinformation or to chatbots that claim to support mental health but are primarily designed to keep users engaged.
“As we integrate AI into the therapeutic space, we face the challenge of preserving the human connection at the heart of healing,” says Morray. “As clinicians, we have the opportunity to advocate for the healthy use of this technology, to ensure that it ultimately solves more problems than it creates.”
Alma members benefit from a supportive community of peers who help each other grow as clinicians, provide referrals, and share their experiences with training and certification.
Alma membership also includes access to a wide range of continuing education workshops and webinars, at no extra cost. Alma is accredited by the American Psychological Association, the Association of Social Work Boards, the National Board for Certified Counselors, and the New York State Board.
Alma surveyed 1,643 mental health providers across the country between February and March 2025. For multiple-choice questions, respondents could select up to three options. Respondents also indicated whether they wanted their quotes to remain anonymous.
We believe that when clinicians have the support they need, mental health care gets better for everyone.