Recruiting diverse candidates and promoting equal opportunities are two pressing issues in higher education today. This is true not only for students and learners, but also for higher education institutions themselves. As a result, institutions around the world have taken steps - some big, some small - to advance the agenda of social equity in academia.
Most higher education institutions I know take diversity and equity in the workforce seriously, and many already have proactive measures and institutional processes in place. Some, however, are so eager to solve the problems quickly that they end up with responses that may produce unintended consequences or even perpetuate biases in different contexts. One such response is the use of linguistic software.
Screening Software and the Notion of Neutral Language
In recent years, there has been a surge in the use of software that screens job ads for biased language. The goal of this type of software is to increase diversity and reduce discrimination against minority groups by identifying and removing biased language from job ads. While this seems like a well-intentioned response to a significant problem, there are a number of reasons why these screening tools are not an effective solution.
One fundamental problem is that language is never innocent. Words are fundamental to shaping our understandings, and every word and phrase carries meaning and cultural connotations that can vary widely depending on the context in which they are used.
The screening software I have seen relies on a set of pre-determined rules that are designed to identify and remove certain words or phrases that are deemed to be biased or discriminatory. However, these rules are often based on assumptions about what is and isn't appropriate language, and may not take into account the complexities of different cultural contexts.
For example, the word "assertive" may be seen as a positive trait in some cultures, while in others it may be viewed as aggressive or even confrontational. Similarly, phrases like "team player" or "fast-paced environment" can have different meanings depending on the industry or company culture. Are "research excellence" and "thought leadership" unfortunate and negatively biased terms in academia? While some would say yes, others would caution that rewriting them can come at the expense of clarity.
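To see why such tools struggle with context, consider a minimal sketch of the rule-based approach described above. This is not the code of any actual product; the word list and suggested replacements are hypothetical, chosen only to illustrate how a fixed lookup flags terms regardless of the cultural or industry context in which they appear.

```python
# Hypothetical word list for illustration only; real tools ship much
# larger, proprietary lists, but the matching logic is similarly blunt.
FLAGGED_TERMS = {
    "assertive": "confident",
    "fast-paced": "dynamic",
    "ninja": "expert",
}

def screen_ad(text: str) -> list[str]:
    """Return flagged terms found in a job ad.

    Note: the check is a plain substring match on lowercased text,
    so it cannot distinguish contexts where a term is appropriate
    from contexts where it is not.
    """
    lowered = text.lower()
    return [term for term in FLAGGED_TERMS if term in lowered]

ad = "We seek an assertive team player for a fast-paced environment."
print(screen_ad(ad))  # → ['assertive', 'fast-paced']
```

The sketch makes the limitation concrete: the same rule fires whether "assertive" appears in a culture where it reads as positive or one where it reads as confrontational, because the rule sees only the string, not the context.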
The problem with the screening tools I have seen is that they assume that there is such a thing as neutral language. But take a moment and think of "they" as a singular pronoun: while "they" is sometimes used as a gender-neutral singular pronoun in English, in other instances it can be ambiguous and lead to confusion, especially when it is not clear whether it refers to one person or multiple people. Additionally, and perhaps slightly less importantly, it is confusing for speakers of languages that have gendered pronouns, such as Spanish, German or French.
Research on Job Ads
A recent study in Management Science, published in February and picked up by MIT Sloan School of Management a couple of weeks ago, further shows that the effect of modifying language in job ads in order to be more accommodating towards a broader candidate pool from diverse backgrounds is largely insignificant. While careful wording may seem like a simple and straightforward solution to the problem of bias in job ads, evidence suggests that these screening tools and corrective linguistic measures have limited effectiveness.
In their study titled "The Gendering of Job Postings in the Online Recruitment Process", Emilio J. Castilla and Hye Jin Rho investigated whether gender-neutral job postings created by employers result in equal numbers of applications from women and men. The authors found that there were minimal effects for both the gendering of job postings and the gender of the job poster.
“Our analyses ultimately show negligible effects for both the gendering of job postings or of the job poster, and we therefore conclude that, in practice, employers’ efforts to simply tweak the language of recruitment messages do not matter much for gender equality and diversity.”
- Castilla & Rho (2023), https://pubsonline.informs.org/doi/abs/10.1287/mnsc.2023.4674
Using screening software to remove biased language from job ads seems like a band-aid solution that doesn't address the root causes of discrimination. To create workplaces that truly work for everyone, we need bold strategic decision-making and elaborate, actionable responses that can actually provide the progress we need.
Final Thoughts
The use of software that screens job ads for biased language may seem like a well-intentioned solution, but the reality is that it is not an effective one.
Organizations - and the people inhabiting them - need to tackle the underlying issues to eradicate bias and discrimination in the workplace. We need to confront the deeper structural barriers that prevent minority groups from advancing in the workforce.
In doing so, we can create a more equitable workplace that benefits everyone.
REFERENCES
Castilla, E. J., & Rho, H. J. (2023). "The Gendering of Job Postings in the Online Recruitment Process". Management Science 0(0). (Published Online: February 27, 2023)
Mangelsdorf, Martha (2023). “Gendered Language in Job Postings Has Little Effect on Applicant Behavior, New Research Finds”. MIT Sloan School of Management. (Published Online: March 24, 2023)