The American Counseling Association convened a panel of counseling experts representing academia, private practice, and students to form its AI Work Group. The work group drew on research-based and contextual evidence, the ACA Code of Ethics, and clinical knowledge and skill to develop the following recommendations. The goal is to prioritize client well-being, preferences, and values in the advent and application of AI while informing counselors, counselor educators, and clients about the use of AI today. The recommendations also highlight the additional research needed to inform counseling practice as AI becomes a more widely available and accepted part of mental health care.
While research into the role of artificial intelligence (AI) in assessment and diagnosis has expanded, more studies are needed to explore how AI can assist counselors in these domains. The following recommendations take an interdisciplinary approach to integrating AI into assessment and diagnosis processes, highlighting potential applications while encouraging caution. Counselors are urged to pursue additional research in this field, closely track AI advancements, and exercise prudence when leveraging AI support for assessment and diagnosis while remaining open to its possibilities.
Recommendation: AI may augment or help reimagine diagnostic frameworks
Recognize that AI may lead counselors to reconsider how they categorize mental health
disorders (Minerva & Giubilini, 2023). At present, counselors diagnose by observing, classifying,
and assessing, predominantly in accordance with their level of ability. Counselors may consult with
others to improve the validity of their diagnosis. AI may expand a counselor's knowledge base
by including speech-pattern analysis and other datasets, providing more markers correlating
with specific diagnoses. In this sense, AI would help fine-tune diagnosis, which may render
DSM-style categorical diagnoses incomplete or in need of substantial revision.
Recommendation: Carefully examine AI outputs for potential bias
Recognize that AI may reduce or increase the incidence of bias in diagnosis. Depending largely
on the dataset on which it is trained, an AI may offer a diagnosis that is more or less biased
than one from a human counselor. AI trained with representative datasets may offer a degree of
impartiality to diagnosis, serving as a valid source of information for the counselor to consider
before giving a diagnosis. Conversely, AI trained with non-representative datasets may
incorrectly diagnose, miss a diagnosis, or lead the counselor astray with inaccurate
information. These failure modes mirror Type I and Type II errors in statistics: diagnosing a
disorder that is not present (a false positive) or missing one that is (a false negative).
Counselors are encouraged to scrutinize AI outputs for possible bias (Fulmer et al., 2021).
Recommendation: Weighing AI's potential against ethical obligations is imperative
Integrating AI into assessment may help inform the diagnostic process, aiding counselors in
improving the accuracy, consistency, and objectivity of diagnoses, given the potential for
humans to be influenced by their own emotions and cognitive biases when conducting
assessment and issuing diagnoses (Featherston et al., 2020). Despite industry assurances, it is
imperative to check whether the AI-powered applications and software for clinical purposes
comply with HIPAA, local laws, and employer or clinical site regulations and policies. Data can
easily be reidentified in the digital era (Marks & Haupt, 2023). More research and legislative
efforts are needed to establish HIPAA-compliant LLMs and other AI-powered applications with
specific knowledge and considerations pertinent to the counseling profession. Practicing
counselors and counseling researchers are encouraged to actively participate in developing
these LLMs and other machine-learning models in light of their domain knowledge and practical
experience.
Recommendation: Ensure HIPAA compliance
Session notes help with assessment and diagnosis in counseling (Prieto & Scheel, 2002).
Despite the temptation to use AI tools for automated note-taking, client data should not be input
into non-HIPAA-compliant applications; doing so violates HIPAA and breaches client
confidentiality. Before adopting AI tools, counselors must practice due diligence and ensure
these tools comply with HIPAA and local laws and regulations. If using HIPAA-compliant AI tools
for automated note-taking, counselors must review the generated notes for accuracy and edit
them as needed to meet professional standards.
Recommendation: Develop ethical, accurate AI tools through diverse data sets and
collaboration
To develop AI tools that are ethical, just, accurate, and efficient in counseling, research should
incorporate inclusive and diverse data sets that reflect a broad spectrum of client experiences to
reduce bias and improve the generalizability of the algorithms. This effort should be
underpinned by interdisciplinary collaboration, bringing together counseling and computer
science researchers, counselors, and clients to integrate clinical expertise with AI development,
ensuring the tools are clinically relevant and grounded in therapeutic best practices. Rigorous
counseling research can help examine the role of AI tools in counseling, from targeted
prevention and early intervention to assessment and diagnosis. Their function can be compared
with traditional approaches to test their effectiveness, accuracy, efficiency, and client
satisfaction. Engaging clients in the research process to gather feedback on their experiences
with AI tools can also help enhance the development process and understand client perception,
paving the way for client-centered AI tools that are more aligned with client needs and
expectations.
Recommendation: Exercise caution and critical thinking
Counselors using AI to facilitate assessment and diagnosis are encouraged to exercise caution
and critical thinking and not to rely solely on the information an AI provides. In
accordance with the ACA Code of Ethics (Standard E.9.b.), counselors should “qualify any conclusions, diagnoses, or
recommendations made that are based on assessments or instruments with questionable
validity or reliability.”
Recommendation: Counselors should integrate AI insights with relational dynamics for
cultural responsiveness
As indicated in the ACA Code of Ethics (ACA, 2014), “Counselors use assessment as one
component of the counseling process, taking into account the clients’ personal and cultural
context” (Section E, Introduction). AI tools may be a useful aid for counselors
seeking to understand broad cultural considerations of clients. However, counselors must
ensure that they are attending to relational dynamics with clients when exploring cultural
formulations of presenting problems to promote individualized therapeutic relationships and
culturally responsive treatment planning.
Abd-Alrazaq, A., Alhuwail, D., Schneider, J., Toro, C. T., Ahmed, A., Alzubaidi, M., ... & Househ, M. (2022). The performance of artificial intelligence-driven technologies in diagnosing mental disorders: An umbrella review. npj Digital Medicine, 5(1), 87. https://doi.org/10.1038/s41746-022-00631-8
Featherston, R., Downie, L. E., Vogel, A. P., & Galvin, K. L. (2020). Decision making biases in the allied health professions: A systematic scoping review. PLoS One, 15(10), e0240716.
Fulmer, R., Davis, T., Costello, C., & Joerin, A. (2021). The ethics of psychological artificial intelligence: Clinical considerations. Counseling and Values, 66(2), 131–144.
Jarvis, G. E., Kirmayer, L. J., Gómez-Carrillo, A., Aggarwal, N. K., & Lewis-Fernández, R. (2020). Update on the Cultural Formulation Interview. Focus, 18(1), 40–46.
Kulkarni, P. A., & Singh, H. (2023). Artificial intelligence in clinical diagnosis: Opportunities, challenges, and hype. JAMA, 330(4), 317–318. https://doi.org/10.1001/jama.2023.11440
Lewis-Fernández, R., Aggarwal, N. K., & Kirmayer, L. J. (2020). The Cultural Formulation Interview: Progress to date and future directions. Transcultural Psychiatry, 57(4), 487–496.
Lewis-Fernández, R., Aggarwal, N. K., Lam, P. C., Galfalvy, H., Weiss, M. G., Kirmayer, L. J., ... & Vega-Dienstmaier, J. M. (2017). Feasibility, acceptability and clinical utility of the Cultural Formulation Interview: Mixed-methods results from the DSM-5 international field trial. The British Journal of Psychiatry, 210(4), 290–297.
Marks, M., & Haupt, C. E. (2023). AI chatbots, health privacy, and challenges to HIPAA compliance. JAMA.
Minerva, F., & Giubilini, A. (2023). Is AI the future of mental healthcare? Topoi, 42(3), 1–9. https://doi.org/10.1007/s11245-023-09932-3
Prieto, L. R., & Scheel, K. R. (2002). Using case documentation to strengthen counselor trainees' case conceptualization skills. Journal of Counseling & Development, 80(1), 11–21.
Sun, J., Dong, Q.-X., Wang, S.-W., Zheng, Y.-B., Liu, X.-X., Lu, T.-S., Yuan, K., Shi, J., Hu, B., Lu, L., & Han, Y. (2023). Artificial intelligence in psychiatry research, diagnosis, and therapy. Asian Journal of Psychiatry, 87. https://doi.org/10.1016/j.ajp.2023.103705
S. Kent Butler, PhD University of Central Florida | Russell Fulmer, PhD Husson University | Morgan Stohlman Kent State University |
Fallon Calandriello, PhD Northwestern University | Marcelle Giovannetti, EdD Messiah University- Mechanicsburg, PA | Olivia Uwamahoro Williams, PhD College of William and Mary |
Wendell Callahan, PhD University of San Diego | Marty Jencius, PhD Kent State University | Yusen Zhai, PhD UAB School of Education |
Lauren Epshteyn Northwestern University | Sidney Shaw, EdD Walden University | Chip Flater |
Dania Fakhro, PhD University of North Carolina, Charlotte |