By Consumers For Quality Care, on December 18, 2019
An algorithm widely used by health care providers to flag patients who are at higher risk of severe illness and need special care has been shown to be biased against black patients. As Reuters reports, the algorithm was more likely to flag white patients for special treatment:
After examining the records of 6,079 black and 43,539 white patients through the lens of the software tasked with identifying those at highest risk of serious illness, researchers determined that the algorithm was more likely to flag white patients for extra medical attention than blacks who were just as sick.
The study, published in Science, found that the algorithm’s main problem lay in how it decided which patients were the “sickest”: it used health care expenditures as a proxy for health need. The flaw in this approach is that white patients were more likely to seek care early in an illness, while black patients were more likely to seek care only once they were already very sick, so equally sick black patients tended to generate lower costs.
This disparity led the algorithm to flag white patients for increased care more often. Being flagged can act as a fast track to better care programs.
In concrete terms, “the top 3% of patients in terms of algorithm risk score are auto-identified for enrollment in high-risk care management programs – this doesn’t guarantee they get in, but it’s a bit like a fast track,” lead study author Ziad Obermeyer explained.
Meanwhile, black patients incurred different types of costs that the algorithm did not weight properly, including costs from emergency room visits and dialysis.
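The core flaw described above is easy to see in miniature. The sketch below is a hypothetical illustration, not the vendor’s actual model: it assumes a toy risk score that is simply normalized annual spending, and shows how two equally sick patients receive different scores when one of them seeks care later and therefore generates lower costs.

```python
# Toy illustration of the cost-as-proxy flaw (hypothetical, not the
# real algorithm). The score depends only on spending, so a patient
# who delays seeking care looks "healthier" than an equally sick
# patient who sought care early.

def cost_based_risk_score(annual_cost, max_cost=50_000):
    """Hypothetical risk score: annual spending normalized to [0, 1]."""
    return min(annual_cost / max_cost, 1.0)

# Two hypothetical patients with the same underlying illness burden.
early_care_cost = 12_000  # sought care early; more billed visits
late_care_cost = 4_000    # sought care late; fewer billed visits

score_early = cost_based_risk_score(early_care_cost)
score_late = cost_based_risk_score(late_care_cost)

# Same sickness, different scores: only the early-care patient may
# clear a fixed enrollment threshold for the care program.
assert score_early > score_late
```

Because enrollment is driven by a cutoff on this score, any group that systematically generates lower costs for the same level of sickness is systematically under-enrolled, which is the pattern the researchers documented.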
The authors of the study approached the company that created the algorithm and encouraged it to make changes, which the company has been working to do.
“The manufacturer independently replicated our analyses on its national dataset of 3,695,943 commercially insured patients,” the researchers write. Since then, the company and the researchers have been working together to root out biases in the software.