The Effect of Electronic Health Record Notes on Racial Bias


Clinicians need to avoid negative terms in describing patients

Language used to describe patients in electronic health records may be perpetuating racial bias and other negative stereotypes in health care, a new study suggests.

Using machine learning techniques, researchers analyzed potentially stigmatizing language in more than 40,000 history and physical notes in the EHRs of 18,459 adult patients at an academic medical center in Chicago. They found that the odds of Black patients having one or more negative descriptors, such as “refused,” “(not) compliant” and “agitated,” applied to them were 2.54 times those of white patients, even after adjusting for sociodemographic and health characteristics.
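The article does not describe the study’s text-processing pipeline in detail, but a minimal sketch of the general approach, flagging a predefined list of negative descriptors in note text and estimating adjusted odds with logistic regression, might look like the following in Python. The term list, variable names, and simulated data below are illustrative assumptions, not the authors’ code or lexicon.

```python
import re
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative list of negative descriptors; a small subset of the kinds of
# terms the study examined, not the authors' actual lexicon.
NEGATIVE_TERMS = ["refused", "noncompliant", "not compliant", "agitated",
                  "combative", "uncooperative"]
PATTERN = re.compile(r"\b(" + "|".join(map(re.escape, NEGATIVE_TERMS)) + r")\b",
                     re.IGNORECASE)

def has_negative_descriptor(note_text: str) -> bool:
    """Flag a note containing any term from the illustrative list."""
    return bool(PATTERN.search(note_text))

print(has_negative_descriptor("Patient refused labs and was agitated."))  # True

# Simulated patient-level table standing in for the real EHR extract;
# the covariates and outcome here are randomly generated, not study data.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "black": rng.integers(0, 2, n),
    "public_insurance": rng.integers(0, 2, n),
    "married": rng.integers(0, 2, n),
})
log_odds = -2.0 + 0.9 * df["black"] + 0.4 * df["public_insurance"] - 0.2 * df["married"]
df["negative"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

# Logistic regression of the negative-descriptor flag on race, adjusting for
# other patient characteristics; exponentiated coefficients are adjusted odds
# ratios, the quantity the study reports (2.54 for Black vs. white patients).
fit = smf.logit("negative ~ black + public_insurance + married", data=df).fit(disp=0)
print(np.exp(fit.params).round(2))
```

In the study itself, the descriptor flag would be computed over each patient’s actual notes and regressed on real sociodemographic and health covariates; the simulation above only illustrates the mechanics.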

The differences were not solely race-based. Medicare and Medicaid beneficiaries, for example, were more likely to have negative descriptors applied to them than were patients with commercial or employer-based insurance, as were unmarried patients compared with married patients.

The findings are especially troubling, the researchers say, given that other studies have found that less than 20% of the text in inpatient progress notes is original, with most of the rest imported from prior documentation. As a result, “subsequent providers may read, be affected by, and perpetuate the negative descriptors, reinforcing stigma to other health care teams,” they write.

The researchers found that notes written for outpatient visits were less likely to include negative descriptors than notes written for inpatient encounters. They theorize this may result from inpatient settings being inherently more stressful, increasing the risk that doctors use stereotypes as “cognitive shortcuts.”

They also found that race-based differences in the use of negative descriptors narrowed after March 1, 2020. They theorize this was due to the stark racial differences in care and outcomes highlighted by the COVID-19 pandemic, as well as the pandemic’s overlap with “a historically defining moment of national response to racialized state violence” following the killings of George Floyd and other Black Americans.

“These social pressures may have sensitized providers to racism and increased empathy for the experiences of racially minoritized communities,” they note.

Fixing the problem, the authors say, will require medical institutions to better address the issue of implicit racial bias among providers. They cite the example of a physician’s use of the term “aggressive” as reflecting the physician’s personal bias regarding Black men. Once such a negative label becomes part of the patient’s record, “it potentially affects the perceptions and decisions of future providers regardless of whether future providers hold a bias about Black men being aggressive,” they note.

They also recommend that regulatory bodies, such as the Accreditation Council for Graduate Medical Education, develop specific recommendations for the use of non-stigmatizing, patient-centered language to prevent the transmission of bias.

The need is especially pressing, they say, in light of the OpenNotes policies medical institutions are adopting, which give patients full access to their EHRs, including chart notes. Without attention to the language used to describe patients, providers risk “harming the patient-provider relationship with downstream effects on patient satisfaction, trust, and even potential litigation.”

The study, “Negative Patient Descriptors: Documenting Racial Bias in the Electronic Health Record,” appears in the February 2022 issue of Health Affairs.

This article originally appeared on Medical Economics.