MIT and Harvard scientists have found that AI can identify race through X-rays — and no one knows how.

A doctor can’t tell whether someone is Black, Asian, or white just by looking at their X-rays. But a computer can, according to a surprising new paper by an international team of scientists, including researchers at MIT and Harvard Medical School.

The study found that an AI program trained to read X-rays and CT scans can predict a person’s race with up to 90 percent accuracy. But no one told the software what features to look for, and the scientists who conducted the study say they don’t know how the computer detects race.

“When my graduate students showed me some of the results that were in this paper, I actually thought it must be a mistake,” said Marzyeh Ghassemi, an MIT associate professor of electrical engineering and computer science and co-author of the paper, which was published Wednesday in the medical journal Lancet Digital Health. “I honestly thought my students were crazy when they told me.”

At a time when AI software is increasingly used to help doctors make diagnostic decisions, the research raises the troubling possibility that AI-based diagnostic systems could unintentionally generate racially biased results. For example, an AI with access to X-rays might automatically recommend a particular course of treatment for all Black patients, whether or not it is best for a specific person. Meanwhile, the patient’s human physician wouldn’t know that the AI had based its diagnosis on racial data.

The research effort was born when the scientists noticed that an AI program for examining chest X-rays was more likely to miss signs of illness in Black patients. “We asked ourselves, how can that be if a computer can’t tell a person’s race?” said Leo Anthony Celi, another co-author and an associate professor at Harvard Medical School.

The research team, which included scientists from the United States, Canada, Australia, and Taiwan, first trained an AI system using standard datasets of X-rays and CT scans, with each image labeled with the patient’s race. The images came from different parts of the body, including the chest, hand, and spine. The diagnostic images examined by the computer contained no obvious markers of race, such as skin color or hair texture.
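The paper’s actual code isn’t reproduced in the article, but the training setup it describes (a standard image classifier fine-tuned on radiographs labeled with each patient’s race) can be sketched roughly as follows. The folder layout, label set, backbone, and hyperparameters here are illustrative assumptions, not the authors’ pipeline.

```python
# A minimal sketch of the kind of supervised training described above.
# Everything concrete (the xrays/train layout, the ResNet-34 backbone,
# the hyperparameters) is an illustrative assumption, not the paper's code.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical layout: xrays/train/<race_label>/*.png
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # radiographs are single-channel
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("xrays/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Pretrained backbone, re-headed for however many race labels the data has.
model = models.resnet34(weights=models.ResNet34_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```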

Once the program had been shown a large number of race-labeled images, it was shown different sets of unlabeled images. The software was able to identify the race of the people in those images with remarkable accuracy, often well above 90 percent. Even when it analyzed images from people of the same size, age, or gender, the AI accurately distinguished between Black and white patients.
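The evaluation step described here (withholding the race labels at test time and scoring the model’s predictions against them) might look like the following continuation of the sketch above, where `model` and `preprocess` carry over from the training snippet and `xrays/test` is again a hypothetical stand-in.

```python
# Continuing the sketch: the model sees only pixels at test time;
# the held-out labels are used solely to score its predictions.
import torch
from torch.utils.data import DataLoader
from torchvision import datasets

test_set = datasets.ImageFolder("xrays/test", transform=preprocess)
test_loader = DataLoader(test_set, batch_size=32)

model.eval()
correct = total = 0
with torch.no_grad():
    for images, labels in test_loader:
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.numel()

print(f"Held-out race-prediction accuracy: {correct / total:.1%}")
```

A matched comparison like the one the article mentions would simply repeat this scoring on a test subset filtered to patients of the same size, age, or gender.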

But how? Ghassemi and her colleagues remain baffled, though they suspect it has something to do with melanin, the pigment that determines skin color. Perhaps X-rays and CT scanners detect the higher melanin content of darker skin and embed that information in the digital image in a way human observers have never noticed. It will take a lot more research to be sure.

Could the test results be evidence of innate differences between people of different races? Alan Goodman, a professor of biological anthropology at Hampshire College and co-author of the book “Racism Not Race,” doesn’t think so. Goodman expressed skepticism about the paper’s conclusions and said he doubted other researchers would be able to reproduce the results. But even if they did, he thought it was all about geography, not race.

Goodman said geneticists have found no evidence of substantial racial differences in the human genome. But they have found significant differences between people based on where their ancestors lived. “Instead of using race, if they look at someone’s geographic coordinates, will the machine do just as well?” Goodman asked. “My sense is that the machine will do just as well.”

In other words, an AI might be able to determine from X-rays that one person’s ancestors came from northern Europe, another’s from central Africa, and a third’s from Japan. “You call this race. I call this geographic variation,” Goodman said. (Even so, he admitted it’s unclear how the AI could detect this geographic variation from X-rays alone.)

In any case, Celi said clinicians should be reluctant to use AI diagnostic tools that might automatically generate biased results.

“We need to pause,” he said. “We can’t rush to bring algorithms into hospitals and clinics until we’re sure they’re not making racist or gender-biased decisions.”


Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him on Twitter.