by John R. Fischer, Senior Reporter | May 31, 2022
An AI program's inexplicable ability to predict a patient's race with 90% accuracy just by reading X-ray scans has got researchers at MIT and Harvard scratching their heads.
The findings raise concerns about racial bias and the potential for AI to diagnose a patient based on race rather than individual needs. "These findings demonstrate that the medical technology we use may be capturing information that we didn't realize was possible, or was unintended," Marzyeh Ghassemi, study coauthor and MIT assistant professor of electrical engineering and computer science, told HCB News. "This information can then be used by a high-capacity model in ways we don't realize. Not all personalization of care may be desirable, or appropriate, for models trained on large amounts of data from biased human processes."
The researchers trained a deep-learning model to identify race from X-ray scans of the chest, head, and spine of patients who self-reported racial information that was not included in the X-ray itself. In several tests, they assessed the model against variables that might explain its performance, such as anatomy, bone density, and image resolution. With bone density, for instance, they reasoned that since Black patients generally have higher bone mineral density, contrast differences between thicker and thinner regions of bone might be what let the model identify race. Yet even when they applied a filter to the images to suppress such detail, the model still predicted race accurately.
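The ablation logic described above can be illustrated with a minimal sketch. This is not the authors' pipeline: the data here is synthetic, the classifier is a simple logistic regression rather than a deep network, and the Gaussian blur merely stands in for the kinds of image filters the researchers applied. The point is the experimental shape: train on labeled images, then re-test on filtered copies to see whether degrading image detail removes the predictive signal.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_patches(n, label):
    # Hypothetical data: each group's 16x16 "image" patches carry a
    # subtle brightness offset, standing in for whatever signal the
    # real model exploited in actual X-rays.
    base = rng.normal(0.0, 1.0, size=(n, 16, 16))
    return base + 0.5 * label, np.full(n, label)

X0, y0 = make_patches(200, 0)
X1, y1 = make_patches(200, 1)
X = np.concatenate([X0, X1]).reshape(400, -1)
y = np.concatenate([y0, y1])

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
acc = clf.score(Xte, yte)

# Ablation: blur the held-out images (analogous to the filters the
# researchers applied) and check whether accuracy survives.
Xte_blur = np.stack([
    gaussian_filter(x.reshape(16, 16), sigma=2).ravel() for x in Xte
])
acc_blur = clf.score(Xte_blur, yte)
print(f"accuracy: {acc:.2f}, after blur: {acc_blur:.2f}")
```

In this toy setup the signal is a global offset, which blurring preserves, so accuracy survives the filter; in the study, accuracy likewise survived filtering, which is what made the result so hard to explain.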
While the scientists are still not sure how the model does this, Ghassemi theorizes that X-ray and CT scanners may detect the higher melanin (the pigment that determines skin color) content of darker skin and embed this information in digital images. She told the Boston Globe that human radiologists may simply never have noticed it, but stresses that more research needs to be done.
"There is a lot of good research showing racial bias in diagnosis and treatment plans without any machine learning models - this is an issue in any system with human judgements," she told HCB News. "By engaging with technology, we want to make sure that this isn't made worse. Systemic audits and robust regulatory guidance, coupled with better support and training of clinical staff are all potential ways to address this risk."
The findings were published in The Lancet Digital Health.