AI tools used by English councils downplay women’s health issues, study finds

A recent study by the London School of Economics (LSE) has found that AI tools used by more than half of England's councils may be introducing gender bias into care decisions. The research focused on Google's AI model Gemma, which councils use to generate and summarize case notes. When Gemma was given identical case notes, it used terms such as "disabled," "unable," and "complex" significantly more often in descriptions of men than of women. This suggests the tool may be downplaying women's physical and mental health issues, potentially skewing care decisions against them.

The findings raise concerns that AI tools could perpetuate gender bias in the provision of public services. The researchers call for greater oversight and accountability in the deployment of such systems to ensure they do not exacerbate existing inequalities.