Abstract
Ethics committee approval has been granted; the approval document accompanies this abstract submission as a PDF file.
Background and Aims Artificial intelligence (AI) is increasingly being integrated into anaesthesiology to enhance patient safety, improve efficiency, and streamline practice. Role models are essential for inspiring leadership ambitions and empowering younger generations, yet medicine continues to struggle with the representation of women and minorities. This study aims to evaluate whether AI-generated images reflect the gender, racial, and ethnic diversity observed in the anaesthesia workforce and to identify inherent social biases in these images.
Methods This post-hoc study compared real-world ESRA membership gender data with AI-generated images of regional anaesthesiologists. The initial cross-sectional analysis was conducted from January to February 2024, during which three independent reviewers assessed each image and categorised it by apparent gender (male/female).
Results According to the 2023 ESRA membership gender data, 50% of members identified as male, while the remaining 50% identified as another gender or chose not to disclose their gender. However, images generated by ChatGPT DALL-E 2 and Midjourney depicted regional anaesthesiologists as male in 97% and 99% of cases, respectively, indicating a significant discrepancy (P<0.001).
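The reported discrepancy can be illustrated with an exact binomial test comparing the observed proportion of male depictions against the 50% male share in the membership data. The counts below are hypothetical (the abstract does not state the number of images generated); a denominator of 100 images with 97 male depictions is assumed purely for illustration:

```python
from math import comb

def binom_two_sided_p(k: int, n: int, p: float = 0.5) -> float:
    """Two-sided exact binomial test: sum the probabilities of all
    outcomes that are no more likely than the observed count k."""
    pmf = [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]
    return sum(x for x in pmf if x <= pmf[k] * (1 + 1e-9))

# Hypothetical illustration: 97 of 100 generated images classified as
# male, tested against the 50/50 split in the ESRA membership data.
p_value = binom_two_sided_p(97, 100)
print(p_value < 0.001)  # an imbalance this extreme is significant at P<0.001
```

With any plausible sample size in this range, a 97% male proportion against an expected 50% yields a vanishingly small p-value, consistent with the significance level reported above.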
Conclusions Current AI text-to-image models exhibit a gender bias in their depiction of regional anaesthesia (RA), misrepresenting the actual gender distribution in the field. This bias could perpetuate skewed perceptions of gender roles in RA. The findings emphasise the need for changes in AI training datasets and for greater support of minority RA role models. More broadly, fostering inclusive mentorship and leadership, reducing barriers to institutional representation, and implementing gender equality policies can help recognise and nurture talent regardless of gender.