Carnegie Mellon University

Exploring Gender Bias in Job Descriptions in STEM Fields: Implications for Female Representation in Science, Technology, Engineering and Mathematics

Poster posted on 2024-06-28, 18:11, authored by Malika Dikshit

Gender inequality has historically been prevalent in academia, especially within the fields of Science, Technology, Engineering and Mathematics (STEM). One potential contributing factor to this inequity is the reluctance of qualified female candidates to apply for positions at higher education institutions, which narrows the applicant pool for STEM positions. Addressing this issue requires expanding the candidate pool by identifying and removing barriers and biases that may dissuade potential applicants early in the job application process. Academic job advertisements often serve as the initial point of contact for prospective candidates. Given the significance of this first interaction, previous social science research has shown that the language employed in job advertisements can shape perceptions and influence candidates' decisions to apply (e.g., Feldman et al., 2006; Lievens & Chapman, 2010). In fields such as academic STEM, where gender disparities are pronounced, analyzing the language used in job descriptions is therefore crucial.

In this thesis, we propose to examine gender bias in academic job descriptions in STEM fields. We go a step further than previous studies, which merely identify individual words as ‘masculine-coded’ or ‘feminine-coded’, and delve into the contextual language used in academic job advertisements. We design a novel approach to detect gender bias in job descriptions using Natural Language Processing (NLP) techniques. We propose three broad groups for exploring gender bias in job descriptions: agentic, balanced, and communal language. We cluster similar information in job descriptions into these three groups using contrastive learning and various clustering techniques. We then examine the three clusters in detail and analyze the extent of bias within each. Our findings reveal patterns of gender bias across the clusters, shedding light on the complexities of language and gender perception in academic job postings within STEM disciplines. This research also contributes to the field of gender bias detection by providing a novel approach and methodology for categorizing gender bias in job descriptions, which can support more effective and targeted job advertisements that appeal equally across all genders.
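To make the clustering step concrete, the sketch below shows one minimal way to group job-description sentences into three clusters. It is an illustration under simplifying assumptions, not the thesis pipeline: a pre-trained sentence encoder and k-means stand in for the contrastive learning and clustering techniques described above, and the model name, sample sentences, and cluster count are illustrative.

```python
# Minimal sketch: embed job-description sentences and partition them into
# three clusters, intended to correspond to agentic, balanced, and communal
# language. The encoder and example sentences are illustrative stand-ins,
# not the models or data used in the thesis.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

# Hypothetical sentences resembling job-ad language.
sentences = [
    "We seek a competitive, driven researcher to lead an ambitious agenda.",  # agentic-leaning
    "The successful candidate will collaborate with a supportive team.",      # communal-leaning
    "Applicants should hold a PhD in a relevant STEM discipline.",            # neutral / balanced
]

# Off-the-shelf sentence encoder (the thesis learns representations with
# contrastive learning; a pre-trained encoder is used here for brevity).
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(sentences)

# Group the sentence embeddings into three clusters.
kmeans = KMeans(n_clusters=3, random_state=0, n_init=10)
labels = kmeans.fit_predict(embeddings)

for sentence, label in zip(sentences, labels):
    print(label, sentence)
```

In practice, the resulting clusters would still need to be inspected and mapped to the agentic, balanced, and communal categories before analyzing the extent of bias within each.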

History

Date

2024-04-30

Academic Program

  • Information Systems

Advisor(s)

Houda Bouamor
