LP020 Unveiling gender bias in medical AI: underrepresentation of women in regional anesthesia depictions
  1. Laurens Minsart1,
  2. Mia Gisselbaek2,
  3. Mélanie Suppan2,
  4. Ekin Köselerli3,
  5. Basak Ceyda Meco3,4,
  6. Odmara L Barreto Chang5,
  7. Joana Berger-Estilita6,7 and
  8. Sarah Saxena8
  1. Department of Anaesthesiology, University Hospital of Antwerp, Antwerpen, Belgium
  2. Division of Anesthesiology, Department of Anesthesiology, Clinical Pharmacology, Intensive Care and Emergency Medicine, Geneva University Hospitals and Faculty of Medicine, Geneva, Switzerland
  3. Department of Anaesthesiology and ICU, University of Ankara School of Medicine, Ankara, Turkey
  4. Ankara University Brain Research Center, University of Ankara School of Medicine, Ankara, Turkey
  5. Department of Anesthesia and Perioperative Care, University of California San Francisco, San Francisco, California, USA
  6. Institute for Medical Education, University of Bern, Bern, Switzerland
  7. CINTESIS@RISE, Centre for Health Technology and Services Research, Faculty of Medicine, University of Porto, Porto, Portugal
  8. Anaesthesiology, Université Libre de Bruxelles (ULB), Brussels, Belgium

Abstract


Background and Aims Artificial intelligence (AI) is being integrated into anaesthesiology to enhance patient safety, improve efficiency, and streamline practice. Role models are essential for inspiring leadership ambitions and empowering younger generations, yet the medical field continues to struggle with the representation of women and minorities. This study aims to evaluate whether AI-generated images reflect the demographic, racial and ethnic diversity observed in the anaesthesia workforce and to identify inherent social biases in these images.

Methods This post-hoc study compared real-world membership gender data from the European Society of Regional Anaesthesia and Pain Therapy (ESRA) with AI-generated images of regional anaesthesiologists. The initial cross-sectional analysis was conducted from January to February 2024; three independent reviewers assessed each image and categorized it by apparent gender (male/female).
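The abstract does not describe the image-generation pipeline in detail; the sketch below shows one way such images could be produced programmatically via the OpenAI Images API. This is an assumption about tooling (the study cites ChatGPT DALL-E 2 and Midjourney, which may have been used through their interactive interfaces), and the prompt is hypothetical, as the study's exact prompts are not reported here.

```python
# Illustrative only: assumes the OpenAI Images API rather than the ChatGPT
# interface; the prompt below is hypothetical, not the study's wording.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = "A regional anaesthesiologist performing an ultrasound-guided nerve block"

response = client.images.generate(
    model="dall-e-2",
    prompt=PROMPT,
    n=1,
    size="1024x1024",
)
print(response.data[0].url)  # URL of the generated image
```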

Results According to 2023 ESRA membership data, 50% of members identified as male, while the remaining 50% identified as another gender or chose not to disclose their gender. However, ChatGPT DALL-E 2 and Midjourney depicted regional anaesthesiologists as male in 97% and 99% of generated images, respectively, a significant discrepancy (P<0.001).
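The abstract reports percentages and a P value but not the raw image counts or the exact statistical test used. The following is a minimal sketch of how such a discrepancy could be checked against the 50% ESRA male share with an exact binomial test, assuming hypothetical counts of 100 images per model.

```python
# Hypothetical counts for illustration only: the abstract gives 97% and 99%
# male depictions but not raw totals, and the exact test is an assumption.
from scipy.stats import binomtest

ESRA_MALE_SHARE = 0.5  # 2023 ESRA membership: 50% identified as male

for model, n_male, n_total in [("ChatGPT DALL-E 2", 97, 100),
                               ("Midjourney", 99, 100)]:
    result = binomtest(n_male, n_total, p=ESRA_MALE_SHARE,
                       alternative="two-sided")
    print(f"{model}: {n_male}/{n_total} male images, p = {result.pvalue:.1e}")
```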

Conclusions Current AI text-to-image models exhibit a gender bias in their depiction of regional anaesthesia (RA) practitioners, misrepresenting the actual gender distribution in the field. This bias could perpetuate skewed perceptions of gender roles in RA. The findings emphasize the need for changes to AI training datasets and greater support for minority RA role models. More broadly, fostering inclusive mentorship and leadership, reducing institutional barriers to representation, and implementing gender equality policies can help recognize and nurture talent regardless of gender.

  • artificial intelligence (AI)
  • bias
  • gender equity
