News Release

Lessons from computer science can make medical devices fair for all

Peer-Reviewed Publication

American Association for the Advancement of Science (AAAS)

Technology can be biased, and its design can disadvantage certain demographic groups. While efforts to address bias and promote fairness are growing rapidly across many technical disciplines, Achuta Kadambi argues that similar growth is not happening fast enough in medical engineering. "Although computer science companies terminate lucrative but biased facial recognition systems, biased medical devices continue to be sold as commercial products," writes Kadambi in a Perspective. Bias in medical devices results in undesirable performance variation across demographic groups and can strongly influence health inequality. For example, optical biosensors that use light to monitor vital signs such as blood oxygenation have been shown to perform differently on light versus dark skin. Because some of these measures bear on potentially serious medical prognoses, such a biased medical device could lead to disparate mortality outcomes for Black and dark-skinned patients.

The author outlines three ways medical devices can suffer bias: physical bias, computational bias, and interpretational bias. Unlike similar issues in computer science, however, bias in medical devices often remains unaddressed. According to Kadambi, bias is routinely researched and actively addressed in computer science; for example, Amazon recently banned law enforcement use of its facial recognition products until bias concerns within the software can be understood and fully resolved. Medical engineers could embrace the lessons learned in these areas to achieve fairness in medical devices and prevent health inequity.

"Achieving fairness in medical devices is a key piece of the puzzle, but a piece nonetheless," writes Kadambi, noting that achieving true fairness goes beyond device engineering alone. "Even if one manages to engineer a fair medical device, it could be used by a clinical provider who has conscious or subconscious bias. And even a fair medical device from an engineering perspective might be inaccessible to a range of demographic groups," he writes.

For reporters interested in trends, an October 2019 Science study revealed how a nationally deployed healthcare algorithm - one of the largest commercial tools used by health insurers to inform healthcare decisions for millions of people each year - exhibited significant racial bias in its predictions of the health risks of Black patients. The authors of the study, who showed that reformulating the algorithm led to an 84% reduction in racial bias, were working with the algorithm's developer to reduce bias at the time of the 2019 publication. https://science.sciencemag.org/cgi/doi/10.1126/science.aax2342

###


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.