AI Medical Tools Provide Worse Treatment for Women and Underrepresented Groups

Historically, most clinical trials and scientific studies have primarily focused on white men as subjects, leading to a significant underrepresentation of women and people of color in medical research. You’ll never guess what has happened as a result of feeding all of that data into AI models. It turns out, as the Financial Times calls out in a recent report, that AI tools used by doctors and medical professionals are producing worse health outcomes for the people who have historically been underrepresented and ignored.

The report points to a recent paper from researchers at the Massachusetts Institute of Technology, which found that large language models including OpenAI’s GPT-4 and Meta’s Llama 3 were “more likely to erroneously reduce care for female patients,” and that women were told more often than men to “self-manage at home,” ultimately receiving less care in a clinical setting. That’s bad, obviously, but one could argue that those models are general purpose and not designed to be used in a medical setting. Unfortunately, a healthcare-centric LLM called Palmyra-Med was also studied and suffered from some of the same biases, per the paper. An analysis of Google’s LLM Gemma (not its flagship Gemini) conducted by the London School of Economics similarly found that the model produced outcomes in which “women’s needs [were] downplayed” compared to men’s.

A previous study found that models similarly had issues with offering the same levels of compassion to people of color dealing with mental health matters as they would to their white counterparts. A paper published last year in The Lancet found that OpenAI’s GPT-4 model would regularly “stereotype certain races, ethnicities, and genders,” making diagnoses and recommendations that were more driven by demographic identifiers than by symptoms or conditions. “Assessment and plans created by the model showed significant association between demographic attributes and recommendations for more expensive procedures as well as differences in patient perception,” the paper concluded.

That creates a pretty obvious problem, especially as companies like Google, Meta, and OpenAI race to get their tools into hospitals and medical facilities. It represents a huge and profitable market—but also one where misinformation carries serious consequences. Earlier this year, Google’s healthcare AI model Med-Gemini made headlines for making up a body part. That kind of error should be easy for a healthcare worker to spot. But biases are subtler and often unconscious. Will a doctor know enough to question whether an AI model is perpetuating a longstanding medical stereotype about a person? No one should have to find that out the hard way.
