By Nicole M. Arcuri-Sanders, PhD, LPC-S, BC-TMH

In an era of artificial intelligence (AI), many clients find mental health support through technology. While AI platforms can provide information about symptoms, experiences and diagnoses, they may not account for the complexities of clients’ lives. When clients share details of their life within AI platforms, associated risks related to privacy and confidentiality may arise.

Professional counselors can consider four questions to help them prepare preventative and intervention measures to safeguard the welfare of their clients who use AI.

  • Do people choose AI platforms for private advice due to stigma around mental health? 
    Many of my clients shared that, prior to seeking services, they had sought advice from AI to get information and support in private. Some clients found this easier than having to adjust their schedules to prioritize counseling, which might alert others to their current state. Many felt comfortable sharing secrets with AI because AI removed the element of human judgment and was available 24/7. Clients could use AI from their bed with smartphones while other household members were asleep. Some clients indicated that, even if the AI system advised seeking a mental health provider’s support, they did not feel confident sharing their information with another human.

  • Can AI complicate the issue of self-diagnosis and misdiagnosis for clients? 
    In one instance, before a client sought services, they had asked multiple AI systems, “Why do I feel sad all the time?”

    AI generated responses such as the following: “You are depressed”; “I’m not a mental health professional, but there could be various reasons why you might be feeling sad all the time”; and “You are experiencing underlying mental health conditions.” In some cases, AI connected the experience to life factors, stressors or genetic factors.

    Other clients brought AI terminology into our sessions and began to define themselves using AI’s terms. Without any formal assessment or interpretation, these clients had begun to label themselves with diagnoses and to expect treatment oriented around them. Common statements included the following: “I have depression”; “I have an anxiety disorder”; “I’m involved with a narcissistic partner”; and “My parent/partner/child is bipolar.”

    As the counselor, I had to work backward from the AI-generated diagnosis during intake, connecting experiences throughout the client’s life while also exploring experiences that may not be directly linked to the diagnosis they had shared. This often complicated the intake process because many clients felt they did not need to examine other aspects of their life since they already had their answer. I had to educate clients about the importance of gaining insight into the details of their lived experience to best support them.

  • Can AI fail to consider a person’s worldview or contextual factors?
    Clients shared that the only reason they came to counseling was that the AI platform had begun to misunderstand what they were trying to say. When revisiting the platform, they often had to start over and re-explain themselves. Ultimately, AI was not accounting for their worldview, lens, cultural experiences and the whole self-picture that mental health providers strive to gain during intake and continue to account for in progress notes.

    Misdiagnosis can have a negative impact on a client and their therapeutic progression. Not only can it affect clients’ identity, but it may cause people to seek treatment that does not align with their needs. In addition, some people might have preconceived notions of how they should be treated and become defensive if treatment does not align with an AI-determined diagnosis.

  • Can AI maintain the confidence of a counselor-client relationship?
    Many clients shared that, over time, they divulged to AI more details about their life experiences and symptoms, such as when feelings began, events surrounding symptom development and severity of symptoms. They also shared about other people in their life related to their experiences. Some used AI systems like a journal or shared immediate responses without opinions often offered by friends and family.

    AI systems learn from users to improve their capabilities. Therefore, the information one client shares with a system may be regenerated in different forms to support AI development. AI does not have the ethical responsibility counselors have to abide by confidentiality and privacy practices. If clients do not understand the gravity of such disclosures, especially when they are in a vulnerable state, they may share sensitive and closely guarded information about themselves and others on a public platform. Some clients were surprised to learn they were guaranteed greater confidentiality in session with me than when using AI.

AI and the Counseling Profession’s Future

Thinking about these questions and my clients’ experience with AI led me to ask my clients: “So, why counseling now? If AI is so amazing and can offer privacy and an abundance of information to help you, why are you now here with me for counseling?”

They often answered that AI was “missing something.” Clients sought human connection to go through the journey. One client remembered crying for hours with her phone. Despite having typed for what seemed like endless hours and receiving AI responses in return, she felt truly alone.

As counselors, we can offer empathy not only with our verbal language through intervention implementation but also with nonverbal validation. The space we create for clients can itself be therapeutic. Through community education, we can further combat the stigma of mental health treatment and educate the public about AI’s strengths and weaknesses, the value counseling relationships offer and how confidentiality within a therapeutic relationship differs from that on AI platforms.

So, what does this mean for counselors as AI continues to grow? Counselors should share knowledge of AI in relation to mental health services to help communities understand the complexity of treatment. They should highlight the intake process and explain how reaching the correct diagnosis takes time while truly accounting for the whole individual and their experiences. Counselors can help the public understand not only what mental health providers do but also what certain technologies cannot replace.

Nicole M. Arcuri-Sanders, PhD, LPC-S, BC-TMH, is a counselor educator and supervisor at Coastal Carolina University. She is licensed as a counselor and supervisor in numerous states and holds national credentials as a certified counselor, board-certified telemental health counselor and approved clinical supervisor.
