Press "Enter" to skip to content

Hospital ‘risk scores’ prioritize white patients

When health risk prediction algorithms focus on cost rather than illness, racial bias can creep in, researchers found.


By Michael Price

Set foot in any major U.S. hospital, and you are entering a place where computers assist doctors almost as much as nurses do. Some algorithms, for example, scan millions of records to flag high-risk patients for follow-up treatment. The problem is that these programs—also used by insurance companies—disproportionately direct their specialized care to white patients, a new study finds. The good news is that a relatively simple tweak may correct this racial bias—if the companies behind the algorithms are willing to do so.

Hospitals and insurance companies use algorithms to assign “risk scores” to more than 200 million Americans every year. The scores—derived from electronic health records that track illnesses, hospitalizations, and other variables—flag some high-risk patients for special interventions. If, for example, an algorithm determines that your diabetes, hypertension, and chronic kidney disease together are putting your life in danger, your primary care doctor might put you on an intensive program to lower your blood sugar.

In the new study, Ziad Obermeyer, a health policy researcher at the University of California (UC), Berkeley, and colleagues examined the effectiveness of one such risk prediction program in a large research hospital. The team soon noticed that the Impact Pro program—manufactured by the health care company Optum in Eden Prairie, Minnesota—was giving many black patients “strangely low” risk scores, despite their deteriorating health conditions.

When the researchers searched for the source of the scores, they discovered that Impact Pro was using bills and insurance payouts as a proxy for a person’s overall health—a common tactic in both academic and commercial health algorithms, Obermeyer says. The problem with that, he notes, is that health care costs tend to be lower for black patients, regardless of their actual wellbeing. Compared with white patients, many black patients live farther from their hospitals, for example, making it harder for them to visit regularly. They also tend to have less flexible job schedules and more child care responsibilities.

As a result, black patients with the highest risk scores had higher numbers of serious chronic conditions than white patients with the same scores, including cancer and diabetes, the team reports today in Science. And compared with white patients with the same risk scores, black patients also had higher blood pressure and cholesterol levels, more severe diabetes, and worse kidney function.

But by simply tweaking the algorithm to predict the number of chronic illnesses that a patient will likely experience in a given year—rather than the cost of treating those illnesses—the researchers were able to reduce the racial disparity by 84%.
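In modeling terms, that fix amounts to changing the label the algorithm is trained on while keeping the same inputs. The sketch below is a hypothetical illustration of the idea, not Optum’s Impact Pro model: using made-up synthetic data and invented variable names, it fits one regression to next-year cost and another to next-year chronic-condition counts, then compares how often equally sick patients who face access barriers get flagged under each label.

```python
# Hypothetical sketch of the proxy-label problem, not Optum's Impact Pro model.
# Two regressions share the same features; only the training target differs:
# (a) next-year cost, (b) next-year count of active chronic conditions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 50_000

# Synthetic stand-ins for claims/EHR variables; all names are illustrative only.
illness = rng.gamma(shape=2.0, scale=1.0, size=n)       # latent severity
barrier = rng.binomial(1, 0.3, size=n)                   # access barrier (distance, schedule, ...)

diagnosed = illness + rng.normal(0, 0.2, size=n)         # recorded diagnoses track illness
prior_cost = 5_000 * illness * (1 - 0.4 * barrier) \
             + rng.normal(0, 500, size=n)                 # billing reflects illness AND access

X = np.column_stack([diagnosed, prior_cost])

illness_next = illness + rng.gamma(0.5, 1.0, size=n)
future_conditions = illness_next                          # label (b): health itself
future_cost = 5_000 * illness_next * (1 - 0.4 * barrier) \
              + rng.normal(0, 500, size=n)                # label (a): cost as a proxy

cost_model = LinearRegression().fit(X, future_cost)
health_model = LinearRegression().fit(X, future_conditions)

def flag_top(scores, frac=0.03):
    """Flag the top `frac` of patients by predicted risk for intensive follow-up."""
    return scores >= np.quantile(scores, 1 - frac)

# Compare flag rates among the sickest patients, with and without access barriers.
sickest = illness_next >= np.quantile(illness_next, 0.97)
for name, model in [("cost label", cost_model), ("illness label", health_model)]:
    flags = flag_top(model.predict(X))
    gap = (flags[sickest & (barrier == 0)].mean()
           - flags[sickest & (barrier == 1)].mean())
    print(f"{name}: flag-rate gap among the sickest patients = {gap:+.2f}")
```

Because billing in this toy setup reflects access as well as illness, the cost-trained model under-flags equally sick patients who face barriers; training on the health outcome itself largely closes that gap, which is the spirit of the reduction the researchers report.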

“It’s important that we understand the data the algorithms are trained on,” says Milena Gianfrancesco, an epidemiologist at UC San Francisco who wasn’t involved in the study. “An algorithm built and used blindly on [racial] disparities certainly has the potential to further racial biases in health care.”

Obermeyer stresses that although his team looked at a single commercial algorithm, the same problems are rife throughout the country. “This is an industrywide systematic error,” he says, one that is “putting healthier white patients further ahead in line.”

He adds that Optum deserves credit for its response, though. When the researchers sent their results to the company, he says, it replicated the findings and committed to correcting its model. The “algorithms that power these tools should be continually reviewed and refined, and supplemented by information such as socioeconomic data, to help clinicians make the best-informed care decisions for each patient,” an Optum spokesperson tells Science.

Still, industry standards are unlikely to change without new laws and regulations, says Ruha Benjamin, an associate professor of African American studies at Princeton University and author of the book Race After Technology: Abolitionist Tools for the New Jim Code. “Scholars and advocates have been raising the alarm about how automated decision systems reproduce and even deepen racial inequities,” she says. “Most tech development prioritizes speed and profit over the public good. That has to change.”


Source: Science Mag