Pritma Dhillon-Chattha, DNP, MHA, RN, is the cofounder of Lavender.

As artificial intelligence (AI) evolves, it’s becoming more widespread in mental healthcare.

Consider a 2023 survey of psychiatrists affiliated with the American Psychiatric Association, which revealed that 44% had used ChatGPT-3.5 and 33% had used ChatGPT-4 to “assist with answering clinical questions.” The research also found that 70% of psychiatrists surveyed “somewhat agreed/agree” that with AI tools, “documentation will be/is more efficient.” Another survey, conducted by PsychologyJobs.com, examined how psychologists view AI and uncovered that 25% are “currently using AI in their practice, with a further 20% considering it for the future.” The top three use cases in social psychology, the study found, were AI-driven chatbots “for client interactions (21%),” automated diagnostic tools “for advanced treatment (16%)” and natural language processing “for text analysis (16%).” However, in both surveys, some respondents expressed concern about the use of AI in their respective fields.

Concerns about AI in mental healthcare are valid, and it’s essential to discuss them and find solutions. AI is here to stay in mental healthcare, and avoiding it is not the right approach. Instead, mental health providers from all backgrounds—including psychiatrists, psychiatric nurse practitioners, clinical psychologists, therapists and counselors—should work together to understand and navigate the unique opportunities and challenges AI poses for the field.

Key Use Cases For AI In Mental Healthcare

When people think about leveraging AI in mental healthcare, therapy chatbots might first come to mind. While therapy chatbots might one day be able to supplement the work mental health providers do and fill the gap in mental healthcare on a broad scale, they currently pose dangers due to a lack of testing and regulation. In my view, AI should not be used to give people therapy (at least, not yet). Instead, I recommend that mental health providers focus on other AI use cases in the field.

There are opportunities to use AI in a manner that supplements, not replaces, the work that mental health providers do. First, there are AI notetakers, which are being used at various healthcare organizations to help providers document patient visits. AI notetakers let providers capture conversations more quickly and accurately, and they can strengthen providers’ connections with patients: instead of shifting focus to jot down notes, providers can be more present. Then there are AI-powered question-and-answer chatbots, which can help patients and providers speedily access information. Finally, AI can be used to perform aggregate data analysis, supporting providers’ clinical decision-making and surfacing insights that could indicate trends worth further exploration.

While these use cases have benefits, they also have drawbacks. For one, AI notetakers can pose privacy risks and can make some patients feel uncomfortable, so providing disclosures and obtaining patient consent is crucial before using one during an appointment. As for AI chatbots, they can, in theory, be used to retrieve and sift through medical information, but they can produce wrong or biased results. A safer use case is helping patients and providers access nonmedical information, such as a facility’s operating hours for patients and internal policies for providers (something my team uses AI for). And when it comes to leveraging AI for data analysis, it’s important that providers remain mindful of the possibility of erroneous or biased outputs and always double-check the work AI tools produce.

The Importance Of Responsible, Safe And Ethical AI Implementation

Mental health providers who incorporate AI into their work can streamline their tasks, lowering their administrative burden and saving time, which in turn gives them the opportunity to improve the patient journey. However, it’s important that the leaders of mental healthcare organizations implement AI in a responsible, safe and ethical manner.

Just as mental health providers practice evidence-based care, they should practice evidence-based AI implementation. At the foundational level, mental healthcare leaders should make sure that any AI solution they roll out to their teams is HIPAA-compliant and carefully review how it handles data storage. Moreover, I recommend that mental health leaders seek AI tools that are built with medical expertise.

After mental healthcare leaders decide which AI solution to implement in their organizations, they should get input from their lawyers and healthcare team members to craft AI usage policies and procedures. Once they implement a solution, they should extensively train their healthcare team members on how to use it. Finally, they should regularly evaluate any AI solution being used at their organization and update usage policies and procedures as needed.

Why It’s Paramount That Mental Health Providers Participate In The AI Conversation

As AI continues to become more commonplace, it’s paramount that mental health providers participate in the AI conversation. They should voice their thoughts and get involved in the development and implementation of AI solutions in mental healthcare.

There are many different areas of mental healthcare, and unfortunately, these different areas can be siloed. But moving forward, information-sharing and collaboration are essential. There are various ways the field can become more cohesive. For instance, national and state associations representing the different professions within mental healthcare can work together to produce webinars and conferences that bring providers together. They can also create working groups to help craft state and national policies around AI usage in behavioral health. Leaders of private practices can share their experiences with each other—the owner of, say, a psychiatry practice can meet with the owner of a counseling practice in the same city to trade notes on their experiences using AI at work. These are just a few ways providers can work together to navigate AI.

Ultimately, as mental health providers, it’s our professional obligation to work together and inform how AI will shape our industry. Through teamwork, we can improve both provider and patient journeys.
