• Another Biden achievement reached today:
Gasoline prices at the pump reached a new milestone on Tuesday. For the first time ever, all 50 states recorded gas prices above $4 a gallon, after prices in the three holdout states — Georgia, Kansas, and Oklahoma — increased overnight, according to AAA data.
• Long article in the New York Times today about rising second thoughts on the use of anti-psychotic medications. I have no opinion on the medical issues involved, but I am struck by how the Times describes the matter in the tweet promoting the story:
“Non-consensus realities”? I thought we were supposed to follow “the consensus” of “the science” on all matters.
AI can tell your race from an X-ray image — and scientists can’t figure out how
A new study by an international team of scientists from Canada, the U.S., Australia and Taiwan reports that artificial intelligence used to read X-rays and CT scans can predict a person’s race with 90 per cent accuracy — and humans can’t. The scientists, including those from Massachusetts Institute of Technology and Harvard Medical School, have no idea how the program does it. . .
The study began after scientists noticed that an AI program for examining chest X-rays was more likely to miss signs of illness in Black patients. “We asked ourselves, how can that be if computers cannot tell the race of a person?” study co-author Leo Anthony Celi, an associate professor at Harvard Medical School, told the Boston Globe.
Researchers taught the AI program by showing it large numbers of race-labelled images of different parts of the body, including the chest, hand and spine — with no obvious markers of race, such as skin colour or hair texture — and then sets of unlabelled images. The program identified the race in the unmarked images with more than 90 per cent accuracy, and could differentiate Black patients from white even when images were from people of the same size, age or gender.
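For readers curious about the protocol described above, here is a minimal, purely illustrative sketch of that train-then-test setup: fit a classifier on label-annotated examples, then measure its accuracy on examples whose labels are withheld. The "images" here are stand-in feature vectors with a small, hypothetical per-group shift (analogous to signal in the scans that humans cannot see); the actual study used deep networks on chest, hand and spine radiographs, and nothing below reflects its real data or model.

```python
import random

random.seed(0)

def make_example(group):
    # Hypothetical: each group's features differ by a subtle shift,
    # standing in for whatever signal the real model picks up.
    base = 1.0 if group == "A" else 1.2
    return [random.gauss(base, 0.1) for _ in range(8)], group

# Labelled training set and a held-out test set.
train = [make_example(g) for g in ("A", "B") * 200]
test = [make_example(g) for g in ("A", "B") * 50]

# "Training": compute a per-group centroid from the labelled examples.
centroids = {}
for g in ("A", "B"):
    feats = [x for x, lab in train if lab == g]
    centroids[g] = [sum(col) / len(feats) for col in zip(*feats)]

def predict(x):
    # Nearest-centroid rule: assign the group whose centroid is closest.
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda g: dist(centroids[g]))

# "Testing": score predictions on examples whose labels were withheld.
correct = sum(predict(x) == lab for x, lab in test)
accuracy = correct / len(test)
print(f"accuracy: {accuracy:.0%}")
```

The point of the sketch is only the shape of the experiment, not its difficulty: with labelled data and a consistent (even invisible) group signal, a classifier can score well on unlabelled examples without anyone knowing which features it is using.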
The discovery could assist medical staff in some ways, but it also raises the prospect that AI-based diagnostic systems might unintentionally generate racially biased results, such as automatically recommending a particular treatment for Black patients whether or not it is appropriate for the individual person, the newspaper reports. Additionally, that person’s own doctor would be unaware that the AI based its diagnosis on racial data.