Racial bias is baked into algorithms doctors use to guide treatment

Doctors use algorithms to help guide medical care, such as deciding whether a patient should get breast cancer screening or undergo a cesarean section to give birth — but many of these algorithms have racial biases built in, according to a new study. 

These algorithms, which are adjusted in some way to account for patients’ race or ethnicity, may lead to worse care for Black patients — for instance, by delaying life-saving heart failure treatment or preventing them from donating or receiving kidneys, according to the research.

Many of these biased algorithms use outdated, incorrect or even nonexistent scientific rationale to adjust patients’ scores based on race, and “guide decisions in ways that may direct more attention or resources to white patients than to members of racial and ethnic minorities,” according to the report, published June 17 in the New England Journal of Medicine.

Race should not be ignored in medical research, as “doing so would blind us to the ways in which race and racism structure our society” and affect patients’ health, the authors noted. But algorithms must not confound the influence of patients’ race with, say, their socioeconomic status or access to primary health care, they wrote. Several algorithms included in the study adjusted for race based on correlational studies that found a link between race and a medical outcome, but did not tease out whether other factors better explained the differences between patients.

“Given their potential to perpetuate or even amplify race-based health inequities, [these algorithms] merit thorough scrutiny,” the authors wrote in the study. “If doctors and clinical educators rigorously analyze algorithms that include race correction, they can judge, with fresh eyes, whether the use of race or ethnicity is appropriate” for guiding patient care. 

The report included 13 examples of algorithms that adjust patient scores based on race, used to assess the risk of developing heart failure, breast cancer and kidney stones, among other ailments. These adjustments could harm Black and Latino patients in particular, and occasionally Asian patients as well. We’ve detailed four examples below: 

Heart failure 

The Get With The Guidelines–Heart Failure Risk Score, developed by the American Heart Association (AHA), predicts a patient’s risk of dying of heart failure in the hospital. A higher score indicates a higher risk of death, so doctors refer patients with higher scores to a cardiologist more quickly and may allocate resources to them sooner than to patients with lower scores. “Nonblack” patients automatically receive an extra three points on the scale, so Black patients may be less likely to receive appropriate treatment than white patients with the same symptoms and medical history, the authors noted. “The AHA does not provide a rationale for this adjustment,” they wrote. 
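To make the mechanism concrete, here is a minimal Python sketch of a points-based score with the adjustment described above. Only the three-point “nonblack” term reflects the score as reported; the other variables, thresholds and point values are hypothetical placeholders, not the published scoring tables.

```python
# Minimal sketch of a points-based risk score with the race adjustment
# described in the article. Only the three-point "nonblack" term comes
# from the reported GWTG-HF score; the other variables and point values
# here are hypothetical placeholders, not the published scoring tables.

def gwtg_hf_style_score(age: int, systolic_bp: int, is_black: bool) -> int:
    score = 0
    # Hypothetical clinical contributions (illustrative only).
    if age >= 80:
        score += 5
    if systolic_bp < 110:
        score += 4
    # The adjustment the article describes: "nonblack" patients
    # automatically receive an extra three points, raising their
    # apparent mortality risk relative to Black patients with
    # identical clinical findings.
    if not is_black:
        score += 3
    return score

# Two patients with identical clinical data differ by 3 points on race alone.
print(gwtg_hf_style_score(age=82, systolic_bp=100, is_black=True))   # 9
print(gwtg_hf_style_score(age=82, systolic_bp=100, is_black=False))  # 12
```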

Kidney donation 

The Kidney Donor Risk Index (KDRI) predicts whether a donated kidney is likely to fail once transplanted, taking the characteristics of the donor, including race, into account. The algorithm ranks kidneys from Black donors as higher risk than those from people of other races, though “the developers of the KDRI do not provide possible explanations for this difference,” the authors wrote. Studies now suggest that a specific genetic trait, known as the APOL1 genotype, may explain differences in kidney quality better than race, the authors wrote. 

Approximately 13% of Black people carry two copies of an APOL1 gene variant associated with increased risk of chronic kidney disease, according to a 2019 report in the journal Circulation. But again, the genotype, not the person’s race, best explains this increased risk. By screening for race rather than APOL1 genotype, the KDRI may reduce the number of available kidneys from Black donors, which often go to Black patients, the authors noted. 
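A short sketch can show the structural difference between a race-based and a genotype-based adjustment. The coefficients below are hypothetical placeholders, not the KDRI’s published values; the point is that a blanket race term raises the estimated risk for every Black donor, while a genotype term would apply only to the roughly 13% who carry the high-risk APOL1 variant.

```python
import math

# Minimal sketch of a multiplicative donor risk index in the style of
# the KDRI. The coefficients below are hypothetical placeholders, not
# the published model. The structure is what matters: a binary "Black
# donor" term multiplies the estimated risk of graft failure upward
# for every Black donor, whereas an APOL1 genotype term would apply
# only to confirmed carriers of the high-risk variant.

def donor_risk_index(age: int, is_black: bool) -> float:
    log_risk = 0.01 * (age - 40)         # hypothetical age effect
    if is_black:
        log_risk += 0.18                 # hypothetical race coefficient
    return math.exp(log_risk)

def donor_risk_index_apol1(age: int, high_risk_apol1: bool) -> float:
    log_risk = 0.01 * (age - 40)
    if high_risk_apol1:
        log_risk += 0.18                 # applied only to genotype carriers
    return math.exp(log_risk)

# The race-based version penalizes every Black donor; the genotype-based
# version penalizes only the ~13% who carry the high-risk variant.
print(donor_risk_index(age=40, is_black=True))                # ~1.20
print(donor_risk_index_apol1(age=40, high_risk_apol1=False))  # 1.0
```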

Breast cancer 

An online risk assessment tool from the National Cancer Institute may underestimate the risk of developing breast cancer for African American, Latino and Asian American women, compared with white women who report the same risk factors, even though Black women develop breast cancer at about the same rate as white women. These risk factors include a woman’s age, age at her first menstrual period, number of close relatives with breast cancer and number of benign breast biopsies; the tool is not used to evaluate women who carry a cancer-causing mutation in the BRCA1 or BRCA2 genes.
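The article does not describe the tool’s internal math. The sketch below assumes one common structure for such tools, in which a relative risk derived from questionnaire answers is multiplied by a race-specific baseline rate; all numbers are invented for illustration and are not the NCI tool’s published values.

```python
# Minimal sketch of how a race-stratified risk tool can return a lower
# absolute estimate for women who report identical risk factors. The
# baseline rates and relative-risk multiplier below are hypothetical,
# not the NCI tool's published values. The structure is the point:
# a shared relative risk is multiplied by a race-specific baseline
# rate, so a lower assumed baseline yields a lower final estimate.

# Hypothetical 5-year baseline risks by recorded race (illustrative).
BASELINE_5YR_RISK = {"white": 0.013, "black": 0.010, "asian": 0.009}

def five_year_risk(recorded_race: str, relative_risk: float) -> float:
    # Same questionnaire answers -> same relative risk, but the
    # race-specific baseline drags the absolute estimate down.
    return BASELINE_5YR_RISK[recorded_race] * relative_risk

rr = 1.8  # identical risk-factor profile for both women
print(five_year_risk("white", rr))  # 0.0234
print(five_year_risk("black", rr))  # 0.0180
```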

An inappropriately low score may deter women from seeking formal screening for the disease, which may increase the odds that they are diagnosed later in the course of the illness, when the prognosis is poorer, the authors wrote. Black women die from breast cancer at a rate about 40% higher than white women, according to the U.S. Centers for Disease Control and Prevention, and early screening and diagnosis are key to improved outcomes and long-term survival. 

Birth 

The Vaginal Birth After Cesarean (VBAC) algorithm assesses whether a person who previously underwent a cesarean delivery could safely attempt a vaginal birth or should be scheduled for another cesarean. The algorithm predicts that people who identify as African American or Hispanic face greater risks from vaginal birth than people of other races. This bias stems from a study that included factors such as “marital status and insurance type” as additional predictors of success; the algorithm incorporated only race, without accounting for these other factors, the authors noted.
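As a rough illustration, a minimal logistic-style sketch shows how a race term alone can lower a predicted success probability. The intercept and coefficients are hypothetical placeholders, not the published VBAC calculator, but the structure matches the criticism above: identical obstetric histories yield different predictions based on race.

```python
import math

# Minimal sketch of a logistic success predictor in the spirit of the
# VBAC calculator. All coefficients are hypothetical placeholders, not
# the published model. It shows the structural issue the authors flag:
# a negative coefficient on race alone lowers the predicted chance of a
# successful vaginal birth, which can steer clinicians toward
# scheduling a repeat cesarean.

def vbac_success_probability(age: int, prior_vaginal_birth: bool,
                             african_american_or_hispanic: bool) -> float:
    logit = 1.5                          # hypothetical intercept
    logit += -0.04 * (age - 30)          # hypothetical age effect
    if prior_vaginal_birth:
        logit += 0.9                     # hypothetical favorable history
    if african_american_or_hispanic:
        logit += -0.7                    # the race term the authors criticize
    return 1.0 / (1.0 + math.exp(-logit))

# Identical obstetric histories; the race term alone lowers the estimate.
print(vbac_success_probability(30, True, False))  # ~0.92
print(vbac_success_probability(30, True, True))   # ~0.85
```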


Compared with cesarean deliveries, vaginal births are associated with fewer surgical complications and faster recovery times, as well as fewer complications in later pregnancies, the authors noted. Nonwhite women have higher rates of cesarean sections than white women in the U.S., and the VBAC algorithm may perpetuate that trend, they wrote.

Originally published on Live Science.
