Increasing Efficiency: AI for Integrative Mental Health Professionals
[Image: A robot stenographer]
As mental health professionals, we often find ourselves squeezed for time. Between 25+ client hours per week, consultation, documentation, accounting, marketing, and more, we're stretched thin. If you have a family, as I do, you know that adds even more pressure, and trying to maintain your own self-care practices on top of it all can feel daunting. So how do we work more efficiently and free up time for self-care and the work we love?
In this blog post, I’ll explore how AI can assist you in optimizing your practice, so that you can save time, avoid burnout, and spend more time doing what you love the most.
AI Documentation Tools
The easiest and most effective way to improve efficiency using AI is to use an AI notetaker (scribe) to help reduce your documentation burden. At Integrative Care Collective (ICC), we are sponsored by one such company, Perspectives Health.
Perspectives Health has a tool that can integrate directly into your Electronic Medical Record (EMR) system. What I love about this tool in particular is that it can handle intake assessments, such as biopsychosocials (BPS) or ASAM assessments.
Additionally, this tool will cover your standard SOAP/DAP format notes. Notably, it is HIPAA compliant and boasts SOC 2 Type II security compliance.
The way these kinds of tools work is that you turn on a recorder at the beginning of the session and turn it off at the end. The tool records and transcribes the session, then outputs a note that you can put into your EMR. Of course, always double-check the note before submitting it, as AI can make errors.
Clinical Consultation and Resource Creation
You may be surprised, but AI can be a really effective tool for getting ideas on how to plan for your sessions. Be sure to give it enough context so that it can really understand what the client is experiencing and give good feedback.
Caution: Never put protected health information (PHI) into a general-purpose AI system. These systems typically store your data, often for 30 days or longer, and consumer AI tools generally do not sign a Business Associate Agreement (BAA). That means entering client information into them is a HIPAA violation, since they end up storing data on behalf of a healthcare organization without the required safeguards.
Really any of the leading AI models should give similar results, so use the one that you prefer. I use ChatGPT, but many models will be similarly effective.
An example prompt might be:
“I would like some feedback on how to approach a couples therapy session. I am sensing that the husband is holding back some information. The wife does the majority of the talking, and it’s hard to get him to open up. For context, they’ve been together for 8 years, and the wife is worried that the husband may be cheating on her. The husband has been caught texting other women, but when prompted, he shuts down. How can I encourage him to open up?”
You’ll be surprised what kind of feedback you receive!
If you really want to get fancy, you can try using this Prompt Maker tool. Current models benefit from things like having a role (e.g., “You are an expert couples therapist with 20 years of experience”), breaking things down step-by-step, providing context, and knowing when to stop. This tool can help improve your prompts to get better results.
Resource Sheets
ChatGPT can also create resource documents for you. Again, use your clinical judgment to determine whether the result is actually a good document. I once had ChatGPT create an addiction-education handout for a client, and I did not like it. So, consider this your reminder not to become overly reliant on these tools. That said, they can be genuinely supportive at times!
If you would like to create a resource sheet, though, you can prompt the model similar to the above. An example might be:
“Create a one-page handout PDF that I can use to educate clients on interpersonal effectiveness skills. This is targeted toward a client that struggles with social anxiety and needs some practical advice.”
Feel free to use the Prompt Maker to enhance your prompt as well if you’d like.
Marketing and Content Generation
Another area where AI can really support you is in creating online content. Publishing blogs can be a real time sink, but it’s a great way to educate clients and grow your online presence. My suggestion would be to start a newsletter, add a sign-up form to your website, and then send out blog posts via the newsletter weekly, bi-weekly, or monthly. AI tools can really support you if you’re looking to grow in this way.
A simple way to get AI’s support in creating content is to have it draft blog posts for you. The writing quality is already quite good and keeps improving. If you have a specific niche, you can use it to create content tailored to that niche. As always, the more context you provide, the better. For example:
“Write me a 500-word blog post on healthy eating in addiction recovery. Act as an expert in nutrition and addiction. Use a professional tone and make this at an appropriate reading level for the general population. Focus on the following areas: Macronutrients, Restoring the body after addiction, and Avoiding Processed Foods.”
Be sure to use your professional judgment. Edit the content as necessary, make changes, and don’t be afraid to tell the model where it’s gone wrong. In my experience, AI-generated content is often either astonishingly good or astonishingly bad. Going back and forth can help, but sometimes you just can’t fix it, so use your judgment wisely.
Closing Thoughts
AI is rapidly changing the landscape of work, and it is no doubt affecting our profession. I hope that we can move away from fear and find ways to use it to benefit ourselves and our clients. I also hope you picked up on the common themes in this article: don’t over-rely on AI, double-check everything, and provide plenty of context for the best results.
We are also witnessing the rise of AI mental health support tools, such as AI chatbots being used for therapy. My take is not to worry about this too much. Sitting in grounded presence with clients is powerful and always will be, and there are plenty of people out there who need our help. If you have clients using AI tools to support their mental health, remind them that AI often produces overly validating responses, and watch for signs of AI psychosis, in which someone’s sense of reality becomes deeply distorted by an AI model reinforcing delusional beliefs.
Thank you for taking the time to read this article and please feel free to reach out to cole@iccpbc.com with any questions or comments. Additionally, if you’re a mental health provider in private or small group practice (1 location and/or 10 clinicians max) and you’re looking to join a free online community, we invite you to check out our membership page to learn more about us.
About the Author:
Cole Butler, LPCC, ADDC, MACP
Cole Butler, LPCC, ADDC, MACP is a Mental Health Therapist and Writer. He co-founded Integrative Care Collective in 2023 to support mental health providers who are passionate about integrative care and to foster community among them. You can learn more about him and connect with him on LinkedIn: https://www.linkedin.com/in/cole-butler/