AI in Behavioral Health Documentation: Ethical Considerations for Mental Health Clinicians

In Brief

Since OpenAI’s release of ChatGPT in November 2022, artificial intelligence has become unavoidable in nearly every sector of business and our lives, including behavioral health. This piece explores how mental health clinicians can draw on longstanding professional ethics – informed consent, data privacy and security, clinical integrity and accuracy, and equity and bias – to evaluate and responsibly adopt AI-powered clinical documentation tools.

Blueprint customer names and identifiable information have been changed for the writing of this piece. Customer quotes have been kept intact to maintain integrity.

Sarah, a Marriage and Family Therapist in Buffalo, New York, joined me for a late evening Zoom interview in the early weeks of March 2024. She looked exhausted and said as much – it had been a long day, week, and year – but her night was just beginning as she transitioned from patient care to documentation. Pleasant and instantly engaging, like most therapists, she started with a brief introduction: her passion and early interest in the profession, how serving in the helping professions was deeply meaningful, how the pandemic impacted the acuity of the clients she sees, and the challenges of running a private practice while caring for her family. The call focused on her experience using Blueprint’s offering, which allows mental health providers to record, transcribe, and summarize a therapy session using artificial intelligence (AI). The output includes a detailed description of the clinical session, along with suggested clinical documentation in progress note format for the therapist to review for accuracy and quality before adding it to the client’s electronic health record (EHR).

“It has streamlined my note taking process significantly,” she shared. “It has decreased the amount of time it takes me to write a really solid note that is reflective of the work I do with a client - and allows me more time to focus on the client.” Sarah also disclosed how the bleeding of work into home life had become a real issue for her and her family. “I don't have to go home and write notes anymore, my husband is like ‘oh I have a wife again.’” As I reflected this back, it sounded meaningful not only to her work but to her family. Sarah teared up, composed herself, and redirected the conversation back to patient care. “It lets me focus more on the session and not worry that I'm missing something or need to write a lot during sessions.” Appreciating the moment, I thanked Sarah for her candor, her time, and the important work she does for those in need.

While Sarah’s experience illustrates the transformative power that AI assistants hold in the mental health care space, AI remains an evolving technology, and therapists may have reservations about incorporating it into their practice. This is due in part to the ethical questions surrounding AI. But while the technology is new, it is worth considering whether we already have the ethics in place to guide us.

How We Got Here: Looking Back

In the wake of OpenAI’s release of ChatGPT in November 2022, the rise of artificial intelligence has been meteoric. Since then, AI has become unavoidable in every sector of business and our lives. As leaders seek to elevate the sophistication of business processes, improving the efficiency and effectiveness of how work gets done, AI software has become the go-to. In healthcare, AI has demonstrated its value across a growing number of use cases, including diagnostics, risk prediction, research, imaging, care personalization, and clinical decision support (Topol, 2019). Within the behavioral health sector, the automation and augmentation of tasks and services has been equally striking. A leading behavioral health use case, and one that has garnered ample attention in the broader AI healthcare conversation, is the application of AI to automate clinical documentation. Alongside the field’s growing interest in AI are equal measures of curiosity and concern about ethics and the need for guidance.

The History of Ethical and Moral Failures

The application of ethics in the field of medicine is far from new. Western biomedical ethicists have traced contemporary ethical principles – including “respect for persons, beneficence (doing good), nonmaleficence (not doing harm), and justice (treating people fairly),” “veracity” (truth telling), and “confidentiality” (the guarding of patients’ privacy) (Hoop et al., 2008) – from the roots of Hippocrates (i.e., the Hippocratic Oath), through Prussian philosophy in the 1700s and the drafting of the Nuremberg Code in 1947 (Charland, 2021), to the present. Today, ethical frameworks adopted by professional guilds, such as the American Psychological Association and the American Medical Association, echo such principles in their professional ethics codes.

Within psychiatry, historical examples of egregious experimentation (e.g., Milgram’s shock-based obedience experiments, Zimbardo’s simulated prison experiment), applications of inhumane “treatments” (e.g., lobotomies, fever induction, shock therapies), and a checkered history of dehumanizing and racist research practices (e.g., social isolation, segregation, asylums, trials assuming biological and race-based inferiority) continue to haunt the profession, despite the significant advancement and maturation of ethics in research and practice (NAMI, 2024; Brannigan, 2020; Gomory & Dunleavy, 2017). When considering the evolution of ethics in mental health, it is striking to observe how much has changed in our understanding of science and society. Yet the central tenets of the ethical principles underlying the profession’s codes of ethics have changed little.

APA’s Ethical Principles of Psychologists and Code of Conduct 

With close to 200,000 members, the American Psychological Association (APA) is the leading professional organization of psychologists in the United States and one of the largest professional guilds of its type in the world. On matters of ethics, the APA’s Ethical Principles of Psychologists and Code of Conduct (the Ethics Code) is the source of guidance for psychologists and other professionals alike. Prompted in part by psychologists’ role in World War II, the Ethics Code was first published in 1953 and has evolved over the decades as the needs of the profession have changed (APA, 2024). Since its first publication, the Ethics Code has been revised nine times, with the most recent update published in 2016 (APA, 2024). Widely viewed as influential in the development of other mental health, behavioral health, and counseling organizations’ codes of ethics, the APA’s Ethics Code is framed around five general ethical principles:

  • Beneficence and nonmaleficence
  • Fidelity and responsibility
  • Integrity
  • Justice
  • Respect for people's rights and dignity

In addition, the APA has drafted and amended ten ethical standards that serve as enforceable rules to protect consumers and the profession at large. These include:

  • Resolving Ethical Issues
  • Competence
  • Human Relations
  • Privacy and Confidentiality
  • Advertising and Other Public Statements
  • Record Keeping and Fees
  • Education and Training
  • Research and Publication
  • Assessment
  • Therapy 

What Is Missing in the Code

While the Ethics Code makes note of “in person, postal, telephone, Internet, and other electronic transmissions,” what is notably missing from the APA’s Ethics Code is any dedicated principle or standard specifically addressing the use of technology or AI in practice. But this is about to change.

Vaile Wright, Ph.D., Senior Director of Healthcare Innovation and Co-Chair of the APA’s Mental Health Technology Advisory Committee, shared, “The APA is revising our ethical guidelines. In the updated Ethics Code, we anticipate the inclusion of new language dedicated to the use of technology in practice.” As for when these will be available, “They're likely coming out mid-2025 with a period of public comments opening by mid-summer prior to final revisions and approval.”

And given the speed at which technology and AI are moving into the care delivery space, this new addition to the APA’s Ethics Code can’t come soon enough.

Where Do AI and New Tech Fit in Our Ethics Today?

When asked about the role of technology and AI in the context of clinical practice and documentation, Dr. Wright shared:

“AI tools must be developed safely, should be effective and managed responsibly. We believe such tools show tremendous promise in helping to improve workflow efficiency, addressing burnout, and reducing administrative burden. But there are multiple challenges. Not the least of which is that many AI tools are entering the marketplace at a pace faster than the research, which is key to evaluating whether they're effective - as well as moving at speeds faster than the professional and regulatory guidance. This was one of the driving forces behind launching the APA’s Mental Health Technology Advisory Committee, which focuses on the intersection of technology and clinical practice.”

Our Ethical Duty to Guide and Advise

Dr. David Cooper, Psy.D., Executive Director of Therapists in Tech and fellow APA Mental Health Technology Advisory Committee member, added to Dr. Wright’s assertion:

“It is important that the APA and other professional bodies provide guidance to technology companies and say - We are telling our members not to buy a solution unless it is HIPAA compliant, SOC 2, HITRUST or whatever. So, FYI, if you want to sell to our members, these are the questions they're going to be asking - because practitioners and psychologists alike need to have some baseline standards to help make informed decisions around appropriate technology.” 

AI Scribe Solutions and Benefits

While capabilities and features vary, all AI clinical documentation solutions broadly work the same way. Leveraging advanced technologies including machine learning (ML), large language models (LLMs), and natural language processing (NLP), the software first captures spoken or written content from clinical interactions, then transcribes and summarizes it into semi-structured clinical documentation. From there, the documentation can be edited and pasted or ported into an electronic record system. Dozens of such solutions exist in healthcare today. According to the American Academy of Family Physicians, such solutions reduce clinical documentation time by an estimated 72% on average (American Academy of Family Physicians, 2024). At Blueprint, internal data suggests that AI-generated progress notes reduce clinical documentation time by 5-10 hours per week for the typical outpatient behavioral health therapist, depending on caseload size and volume of weekly visits. Such compelling ROI continues to drive demand, adoption, and use of these AI solutions in the market today.
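
To make this workflow concrete, the minimal Python sketch below models the generic capture, transcribe, summarize, and review loop described above. The function names, note format, and data fields are illustrative assumptions for this article, not any vendor’s actual API; real products wrap proprietary speech-to-text and LLM services behind similar steps.

    # A minimal sketch of the generic AI-scribe pipeline described above.
    # All names here are hypothetical; real products wrap proprietary
    # speech-to-text and LLM services behind similar steps.
    from dataclasses import dataclass

    @dataclass
    class DraftNote:
        transcript: str                   # verbatim text of the session audio
        progress_note: str                # AI-suggested documentation (e.g., SOAP format)
        clinician_approved: bool = False  # must be True before entering the EHR

    def transcribe(audio_path: str) -> str:
        """Stand-in for a speech-to-text service call."""
        return "Client reported improved sleep and practiced cognitive reframing."

    def summarize_to_note(transcript: str) -> str:
        """Stand-in for an LLM call that drafts a progress note from the transcript."""
        return "Subjective: Client reports improved sleep.\nObjective: ..."

    def generate_draft(audio_path: str) -> DraftNote:
        # Capture -> transcribe -> summarize; the recording itself is then deleted.
        transcript = transcribe(audio_path)
        return DraftNote(transcript, summarize_to_note(transcript))

    def clinician_sign_off(note: DraftNote, edited_text: str) -> DraftNote:
        # The clinician remains the final reviewer: edits are applied and the
        # note is marked approved only after human review.
        note.progress_note = edited_text
        note.clinician_approved = True
        return note

The key design point, echoed throughout this article, is that nothing enters the legal record until the human review step completes.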

Beyond documenting the ethical practice of behavioral health services, clinical documentation serves many interests: it records patient care, addresses legal and compliance-related issues, supports key components of the revenue cycle and insurance billing, provides data for research, quality assurance, and improvement, facilitates care team communication and care coordination, and enables population and public health activities. As the systemic reliance on clinical documentation has grown, so too have the burdens experienced by frontline behavioral health providers. In parallel with the advent and adoption of electronic health and medical record systems (EHRs/EMRs), the cognitive and administrative demands on care providers to treat their patients while staying on top of electronic documentation have contributed significantly to record-high levels of reported ‘burnout’ and frustration in the clinical community (Shanafelt et al., 2016; Sinsky et al., 2016).

In part because of the lasting challenges posed by the pandemic, the behavioral health workforce has faced unprecedented demand for its services, intensifying concerns that providers won’t be able to meet the growing need for mental health and substance use treatment (National Council for Mental Wellbeing, 2023). This surge in demand has led to growing workloads and longer waitlists, compounded by a staggering 72% rise in the clinical severity and complexity of cases (National Council for Mental Wellbeing, 2023). In response to these escalating pressures, therapists increasingly find themselves overwhelmed, not just by the emotionally intense nature of the role, but also by the direct impact of their workload.

A recent survey by the American Medical Informatics Association (AMIA, 2024) found that nearly three-quarters of healthcare providers believe the time and effort required for clinical documentation significantly impedes patient care. Additionally, 77% of respondents reported finishing work later than desired or needing to complete tasks at home due to excessive documentation requirements. In a related time study examining task allocation across four ambulatory healthcare specialties, researchers found that providers spent less than 27% of their time on direct patient care, with excessive time spent on clinical documentation and interacting with the electronic health record (EHR) system (Sinsky et al., 2016). Concerning psychologists, a systematic review of 29 studies found that workload was one of the most common factors contributing to burnout among applied psychologists (McCormack et al., 2018). Further supporting these findings, recent survey data from the National Council for Mental Wellbeing found that of 750 behavioral health employees surveyed, 93% reported experiencing burnout, with over 62% reporting burnout severity of 8 or 9 on a scale from 1 to 10, where 1 is “no burnout” and 10 is “significantly burned out” (National Council for Mental Wellbeing, 2023).

In the same way that EHRs have replaced paper charts, AI scribe solutions are slowly replacing human medical scribes and documentation assistants in healthcare settings. In 2018, a study in JAMA Internal Medicine estimated that over 20,000 scribes were then employed within the healthcare sector, and predicted this number would grow to over 100,000 by 2020 (Mishra, Kiang & Grant, 2018). In their 12-month study evaluating the association of medical scribes with primary care physician workflow and patient experience, the researchers found that medical scribes were strongly associated with decreased physician documentation burden, improved work efficiency, and improved patient-provider interactions (i.e., patient satisfaction) (Mishra, Kiang & Grant, 2018). A multitude of similar studies have corroborated these findings, strengthening the value proposition of augmenting providers with scribes in healthcare (Gidwani et al., 2017; Shultz & Holmstrom, 2015; Danila et al., 2018; Koshy et al., 2010).

Since 2018, technology innovation in the documentation automation space has skyrocketed, resulting in a plethora of accurate, accessible, and affordable solutions and calling into question the future of human medical scribes and documentation assistants in healthcare. Given the state of behavioral health, it makes sense that the field would look to technology solutions aimed at reducing administrative burden, increasing the efficiency of workflows, and improving the overall day-to-day experience of delivering care to those who need it most.

How Best to Act Before the Code Exists

As the field awaits clearer guidance from federal, state, and professional organizations around the ethical use of AI and technology, experts contend that the questions at hand are less novel than most believe. Per Dr. Cooper,

“I think it was Alan Kay who said, ‘technology is anything that was invented after you were born, everything else is just stuff.’ There are all kinds of tools that we as psychologists use that we don't think about because we've already had the discussions. We've already thought through computers, email, telephone, texting - all of this ‘stuff.’ At the end of the day, it comes down to learning how to use a new tool – and in this case, it may be an AI scribe solution. It's all about learning, but guess what? Somebody had to learn email. Somebody had to learn to write using Word. Somebody had to learn how to appropriately use an EMR.” 

So, in the absence of clear guidance, what should the practice community do? To this question Dr. Cooper added, “the APA has a well-defined Ethics Code, complete with ethical principles and standards. The task for the field now becomes applying these as a framework to new technology.”   

Per the APA experts interviewed, the following four domains are what every provider should consider before bringing a new technology, such as AI documentation automation, into their practice:

Informed Consent:

  • When considering use of an AI documentation automation solution, it is important that behavioral health providers obtain informed consent. This includes three key components: providing information, weighing the risks and benefits, and obtaining and documenting oral or written consent.
  • When providing information to a patient about an AI scribe solution, the provider should offer enough information for the individual to make an informed decision about the tool’s use. This includes an overview of how the tool works (i.e., that the session will be recorded, a transcript will be created, a draft note will be generated for review and editing, the recording will be deleted, and what happens to the transcript afterward).
  • The benefits and risks of using such a tool should also be noted, such as allowing the provider to be more present during the therapy session and reducing the need to type or take notes, along with an overview of the associated potential risks.
  • Last, it is important that the clinician obtain and document oral or written consent within the legal record, most often the EHR. It is also important for behavioral health providers to be aware of any state- or license-specific regulations governing recording.
  • Best practices also include documenting consent at each session, even when nothing has changed; a minimal sketch of such a per-session consent record follows this list.
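
To illustrate, the sketch below shows one way a per-session consent record might be structured, following the components above. The field names are illustrative assumptions for this article, not a standard or any EHR’s actual schema.

    # A hypothetical per-session consent record for an AI scribe tool,
    # mirroring the informed consent components described above.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class AIScribeConsent:
        client_id: str
        session_date: date
        information_provided: bool     # client told how the tool works
        risks_benefits_reviewed: bool  # benefits and risks discussed
        consent_given: bool
        consent_mode: str              # "oral" or "written"
        documented_in_ehr: bool        # entry added to the legal record

    # Documented fresh at each session, regardless of any change in consent.
    todays_consent = AIScribeConsent(
        client_id="CLIENT-001",
        session_date=date.today(),
        information_provided=True,
        risks_benefits_reviewed=True,
        consent_given=True,
        consent_mode="oral",
        documented_in_ehr=True,
    )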

Data Privacy & Security:

  • Data privacy is a growing concern in today’s digital world - and when it comes to protected health information (PHI) and personally identifiable information (PII) in healthcare, it is critical that technology is safe, private, and secure. Documentation automation solutions, like other healthcare technologies, are powered by written and recorded content that includes both PHI and PII.
  • How the technology handles the capture, transmission, and storage of such sensitive data is of critical importance. Fortunately, most technology vendors, working in good faith, seek and achieve baseline security assurances, such as HIPAA (Health Insurance Portability and Accountability Act) compliance and SOC 2 (System and Organization Controls 2) attestation, the latter issued following rigorous third-party evaluation.
  • To achieve these attestations, a technology vendor must meet requisite compliance objectives, in addition to demonstrating that appropriate controls are in place to mitigate risks across the organization’s information systems.
  • Domains evaluated commonly include security, availability, processing integrity, confidentiality, and privacy (HIPAA Journal, 2023). Behavioral health providers should make sure that the technology they seek to use has obtained such certifications, and that the vendor has written policies committing to not sharing or selling consumer data.

Dr. Karen Fortuna, Ph.D., Assistant Professor of Psychiatry at Dartmouth College, Co-Founder of Collaborative Design for Recovery and Health, and fellow committee member of the APA’s Mental Health Technology Advisory Committee, referenced additional questions her team developed related to selecting technologies. “In a recent study, we conducted a qualitative analysis with individuals from across the country to identify what aspects of technology are most important to them. While the study focused on peer support specialists and consumers, we were able to define eight domains important to the selection and use of technology in care settings, which I believe are applicable here.”

Specifically relevant to privacy and data security, Dr. Fortuna’s study proposed the following considerations as key to technology selection (adapted below for the use case at hand):

  • Does this technology protect personal information?
  • There is a clear privacy policy for me to read.
  • There is a clear privacy policy for me to consent to.
  • I am aware if this consent is time limited.
  • I understand the privacy policy.
  • I know how data is shared.
  • If data is shared, I know who it is shared with.
  • If my data is shared, I know it is shared securely.
  • I am aware if I can opt out of data collection.
  • I am aware how I can opt out of data collection.
  • If data is shared with others, I am aware of the limits to what they can keep confidential.
  • I am aware that this technology will not be used to incriminate me (or my client).

(Adapted from Mbao et al., 2021)

With increasing demands on clinicians, it is imperative that the industry adopt the tools available to support them and enable the best care possible. While trust in emerging AI assistant tools should never be placed blindly, we can draw from existing, longstanding ethics to guide their application in a responsible, empowering manner.

Clinical Integrity & Accuracy:

  • As clinical documentation automation solutions grow in popularity, it is important for clinicians to remember that their licenses are on the line when it comes to the clinical integrity and accuracy of the documentation such solutions produce and of what ultimately ends up in the legal record.
  • Best practices include conducting a thorough review of the documentation outputs produced and making appropriate edits to ensure accurate notation. Not only is this critical to ethical practice, it is also important when it comes to fraud, waste, and abuse.
  • While the accuracy of documentation and billing information is always important, it is critical when interacting with insurance companies and health plans, given the legal ramifications that may stem from fraudulent billing.

Equity & Bias:

  • The topic of bias in large language models (LLMs) and AI algorithms has received notable attention as the rise of AI has lifted the visibility of such advanced technologies.
  • In healthcare, biases within big data can result in the inadvertent propagation of systemic health disparities, which have historically and disproportionately impacted underrepresented populations and perpetuated health inequities in access, care, and outcomes (Norori et al., 2021).
  • What is unique about AI documentation scribe tools is that the clinician is always the final reviewer, allowing clinical judgment to be applied and edits made before suggested clinical documentation is accepted.
  • As such, behavioral health providers need not be data scientists or technologists to evaluate documentation automation solutions; rather, they must ensure that the final documentation – whether produced by an AI scribe tool or independently written – is accurate, equitable, and free from biased language.

The Opportunity

The opportunity for technology to improve how care is both delivered and received is tremendous. The potential benefit of solutions like documentation automation can’t be overstated, especially given their ability to improve workflow efficiency, reduce documentation burden, alleviate provider ‘burnout,’ and increase overall documentation quality.

While the practice community awaits formal guidance from regulatory and professional organizations regarding the legal and ethical use of such tools in care, trust can be placed in the structures the clinical community has looked to and relied on for years, including our professional ethics codes. The magic and promise of shiny new technology can at times be distracting and daunting.

That said, solace can be taken in applying a discerning, thoughtful evaluation through the familiar lens of the ethical principles that remain true today: respect persons, do good and do no harm, treat all people fairly, tell the truth, and ensure and protect confidentiality and privacy. Perhaps the guidance has been here all along.

As for the less tangible benefits, a solution incorporating AI assistants can deepen presence in the therapeutic relationship, allow for more family dinners and bedtime stories with one’s kids, and improve a therapist’s confidence to do their life’s best work. While reflecting on these questions with me, Sarah, the Marriage and Family Therapist in Buffalo, New York, quietly shared: “It makes me feel like a better therapist.”

___________________________

Dr. Dylan Ross is the Head of Clinical at Blueprint, a leading therapist enablement technology company, and an organizational psychologist and independently licensed behavioral health clinician (LPCC, LMFT). Dr. Ross also chairs the American Psychological Association’s National Advisory Committee for Measurement-Based Care and the Mental and Behavioral Health Registry, where he and the committee promote the advancement, adoption, and implementation of standardized clinical measurement within the field of behavioral health.

The perspectives and views expressed in this article are those of the author and do not necessarily reflect the official policy or position of the American Psychological Association (APA). The APA does not endorse or take responsibility for the content, accuracy, or opinions presented herein.

Citations: 

American Academy of Family Physicians. (n.d.). Technologies that reduce documentation burden. Retrieved August 2, 2024, from https://www.aafp.org/family-physician/practice-and-career/administrative-simplification/doc-burden/technologies-doc-burden.html#5

American Medical Informatics Association. (2024, May). TrendBurden: Pulse Survey on Excessive Documentation Burden for Health Professionals. Presented at the 2024 Clinical Informatics Conference, Boston, MA.

Brannigan, A. (2020). The Use and Misuse of the Experimental Method in Social Psychology: A Critical Examination of Classical Research. Routledge.

Charland, L. C. (2021). A historical perspective. Psychiatric Ethics, 11.

Danila, M. I., Melnick, J. A., Curtis, J. R., Menachemi, N., & Saag, K. G. (2018). Use of scribes for documentation assistance in rheumatology and endocrinology clinics: Impact on clinic workflow and patient and physician satisfaction. JCR: Journal of Clinical Rheumatology, 24(3), 116-121.

Gidwani, R., Nguyen, C., Kofoed, A., Carragee, C., Rydel, T., Nelligan, I., ... & Lin, S. (2017). Impact of scribes on physician satisfaction, patient satisfaction, and charting efficiency: A randomized controlled trial. The Annals of Family Medicine, 15(5), 427-433.

Gomory, T., & Dunleavy, D. J. (2017). Madness: A critical history of ‘mental healthcare’ in the United States. In Routledge International Handbook of Critical Mental Health (pp. 117-125). Routledge.

Hoop, J. G., DiPasquale, T., Hernandez, J. M., & Roberts, L. W. (2008). Ethics and culture in mental healthcare. Ethics & Behavior, 18(4), 353-372.

Koshy, S., Feustel, P. J., Hong, M., & Kogan, B. A. (2010). Scribes in an ambulatory urology practice: Patient and physician satisfaction. The Journal of Urology, 184(1), 258-262.

Mbao, M., Zisman, Y., Gold, A., Myers, A., Walker, R., & Fortuna, K. L. (2021). Co-production development of a decision support tool for peers and service users to choose technologies to support recovery. Patient Experience Journal, 8(3), 45. 

McCormack, H. M., MacIntyre, T. E., O'Shea, D., Herring, M. P., & Campbell, M. J. (2018). The prevalence and cause(s) of burnout among applied psychologists: A systematic review. Frontiers in Psychology, 9, 1897.

Mishra, P., Kiang, J. C., & Grant, R. W. (2018). Association of medical scribes in primary care with physician workflow and patient experience. JAMA Internal Medicine, 178(11), 1467-1472.

National Council for Mental Wellbeing. (2023, March 16). Industry Workforce Shortages Survey. Prepared by The Harris Poll. Retrieved from: https://www.thenationalcouncil.org/wp-content/uploads/2023/04/Workforce-Shortage-Survey-Results-1.pdf?gfaction=event_send&category&action&label&entryid=0&nonce=e9b7fc240d

Norori, N., Hu, Q., Aellen, F. M., Faraci, F. D., & Tzovara, A. (2021). Addressing bias in big data and AI for healthcare: A call for open science. Patterns, 2(10).

OpenAI. (2022). ChatGPT. Retrieved from: https://openai.com/index/chatgpt/

Shanafelt, T. D., Dyrbye, L. N., Sinsky, C., Hasan, O., Satele, D., Sloan, J., & West, C. P. (2016). Relationship between clerical burden and characteristics of the electronic environment with physician burnout and professional satisfaction. Mayo Clinic Proceedings, 91(7), 836-848. https://doi.org/10.1016/j.mayocp.2016.05.007

Shultz, C. G., & Holmstrom, H. L. (2015). The use of medical scribes in healthcare settings: A systematic review and future directions. The Journal of the American Board of Family Medicine, 28(3), 371-381.

Sinsky, C., Colligan, L., Li, L., Prgomet, M., Reynolds, S., Goeders, L., Westbrook, J., Tutty, M., & Blike, G. (2016). Allocation of physician time in ambulatory practice: A time and motion study in 4 specialties. Annals of Internal Medicine, 165(11), 753-760.

Topol, E. J. (2019). Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again. Basic Books.
