
Racially biased medical algorithm prioritizes white patients over black patients

The algorithm was based on the faulty assumption that health care spending is a good proxy for wellbeing. But there seems to be a quick fix.

By Katherine J. Wu, NOVA Next

By focusing on spending over biological data, an algorithm that affects 100 million Americans underestimates black patients’ need for additional treatment. Image Credit: wilpunt, iStock

In the last decade or so, artificial intelligence has found its way into just about every technology-heavy sector of society. From music recommendation services to targeted advertising, machines around the world have learned how to pore through gobs of data, identify the patterns within, and spit out predictions—a strategy that, in theory, streamlines an otherwise arduous process.

There’s just one problem: Across a bevy of applications, artificial intelligence has a serious racial bias problem.

Today, researchers announce the latest example in a study published in the journal Science. Their findings show that a widely used medical algorithm, which predicts who might benefit from follow-up care, drastically underestimates the health needs of black patients—even when they’re sicker than their white counterparts.

Removing the racial bias in the algorithm, which is one of several used in hospitals and by insurance companies around the country, could more than double the number of black patients who are deemed eligible for additional medical support, the researchers found.

Optum, the health services company that sells the algorithm, is now working with the team behind the study to rectify the issue. But the problem is probably widespread among the many organizations that provide health care in the United States, where programs like these affect more than 200 million people, the researchers write.

“It’s truly inconceivable to me that anyone else’s algorithm doesn’t suffer from this,” study author Sendhil Mullainathan, a computational scientist at the University of Chicago Booth School of Business, told Carolyn Y. Johnson at The Washington Post. “I’m hopeful that this causes the entire industry to say, ‘Oh, my, we’ve got to fix this.’”

The algorithm in question, Impact Pro, is estimated to affect 100 million Americans. It’s designed to scan patients’ bills and insurance payouts, then assign “risk scores” that gauge how urgently each patient needs additional or specialized treatment. A big part of the algorithm’s strategy, the researchers found, rests on the assumption that people who spend less on health care are healthier.
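Optum hasn’t published the model’s internals, but the cost-as-proxy setup the researchers describe is easy to sketch. The toy example below is an illustration of the general approach, not the actual algorithm: the features (such as prior_year_cost) and numbers are invented, a model learns to predict future spending from claims data, and the predicted cost becomes the “risk score” used to flag patients for extra care.

```python
# Illustrative sketch only: a cost-as-proxy risk score, not Optum's actual code.
# The model learns to predict next year's spending from claims features, then
# ranks patients by predicted cost, so lower past spending means a lower "risk."
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented claims features: [prior_year_cost, num_er_visits, num_prescriptions]
X = np.array([
    [12000, 2, 14],
    [  300, 0,  1],
    [ 8500, 1,  9],
    [  150, 0,  0],
], dtype=float)
y = np.array([13500, 400, 9000, 200], dtype=float)  # label: next year's spending

cost_model = LinearRegression().fit(X, y)

# The "risk score" is just predicted future cost, expressed as a percentile rank.
predicted_cost = cost_model.predict(X)
risk_percentile = predicted_cost.argsort().argsort() / (len(predicted_cost) - 1)

# Patients above a cutoff get flagged for extra care, so two equally sick people
# who spend differently can land on opposite sides of the threshold.
flagged_for_extra_care = risk_percentile >= 0.75
print(flagged_for_extra_care)
```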

But many other studies show that this is simply untrue, study author Ziad Obermeyer, a health policy researcher at the University of California, Berkeley, told Michael Price at Science. Black patients, he explains, are less likely than white patients to purchase medical services for the same conditions, due in part to unequal access to care and a historical distrust of health providers.

None of this was accounted for in the algorithm. As a result, people with similar scores weren’t on level medical ground. Compared to white patients with similar “risk,” black patients suffered from more illnesses and conditions like cancer, diabetes, and high blood pressure. All told, the algorithm had missed nearly 50,000 chronic conditions in black patients—simply because they spent less on treatment.
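The study’s audit boils down to a simple comparison: hold the algorithm’s risk score fixed and ask whether black and white patients at that score are equally sick. A rough sketch of that check, using invented column names and numbers rather than the study’s data, might look like this:

```python
# Illustrative audit sketch: at equal algorithmic risk scores, do black and white
# patients carry the same burden of chronic illness? Column names and values are
# invented for illustration; they are not the study's data.
import pandas as pd

df = pd.DataFrame({
    "risk_score":           [0.9, 0.9, 0.5, 0.5, 0.2, 0.2],
    "race":                 ["black", "white", "black", "white", "black", "white"],
    "n_chronic_conditions": [5, 3, 4, 2, 2, 1],
})

# Bin patients by risk score, then compare average illness burden by race within
# each bin. An unbiased score would show similar averages in every bin.
df["risk_bin"] = pd.cut(df["risk_score"], bins=[0.0, 0.25, 0.5, 0.75, 1.0])
audit = df.groupby(["risk_bin", "race"], observed=True)["n_chronic_conditions"].mean()
print(audit)
```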

The system then helped perpetuate the cycle that had compromised its performance. By flagging patients who were already more likely to spend on health care—typically, wealthy white people—as candidates for additional treatment, it likely widened an existing racial disparity, the researchers say.

In their paper, the researchers also report a straightforward fix for the issue. They trained a new algorithm that scoured biological, rather than financial, data, and predicted future health conditions instead of spending. This reduced the number of missed chronic conditions in black patients to 8,000—an 84% drop.
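In code terms, the remedy the researchers describe amounts to swapping the training label: keep the claims features, but predict a measure of future health rather than future spending. The sketch below is again only illustrative, with an invented target (future_chronic_conditions) standing in for the biological data the team used:

```python
# Illustrative sketch of the label swap the researchers describe: keep the claims
# features, but train against a health outcome instead of spending. The target
# (future_chronic_conditions) and all numbers are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([
    [12000, 2, 14],
    [  300, 0,  1],
    [ 8500, 1,  9],
    [  150, 0,  0],
], dtype=float)

# Old label: next year's spending. New label: a measure of future health need.
future_chronic_conditions = np.array([6, 4, 5, 3], dtype=float)

health_model = LinearRegression().fit(X, future_chronic_conditions)
predicted_need = health_model.predict(X)

# Ranking on predicted health need rather than predicted cost keeps low-spending
# but sick patients from being pushed down the priority list.
print(predicted_need.argsort()[::-1])  # patient indices, from most to least need
```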


“Predictive algorithms that power these tools should be continually reviewed and refined, and supplemented by information such as socioeconomic data, to help clinicians make the best-informed care decisions for each patient,” Optum spokesman Tyler Mason told Johnson. “As we advise our customers, these tools should never be viewed as a substitute for a doctor’s expertise and knowledge of their patients’ individual needs.”

The trouble may be identifying these issues in the first place. Prediction algorithms can be both complex and obscure—and across fields, researchers caution that these programs are only as good as the data they’re fed.

“It’s important that we understand the data the algorithms are trained on,” Milena Gianfrancesco, an epidemiologist at the University of California, San Francisco who wasn’t involved in the study, told Price. “An algorithm built and used blindly on [racial] disparities certainly has the potential to further racial biases in health care.”

But, as experts have pointed out before, the problem perhaps goes back to us, the humans who built the machines and asked them to learn in our stead. We have a shot at catching our own biases before they’re passed on; the same isn’t (yet) true of algorithms. As Mullainathan told Shraddha Chakradhar at STAT News, “Concepts that we as humans tend to take synonymously—like care in dollars and care in biological terms—algorithms take them literally....But there’s a difference between them, especially for black patients versus white patients.”

