

October 23, 2017

Unconscious Bias in Recruiting

Posted by Tom Leinster

All of us who work in maths, physics or computer science departments know about the dramatic gender imbalance in our subjects. Many departments and universities have been working hard to make their recruitment processes more inclusive towards under-represented groups — not only for the excellent altruistic reason that it makes the world a better place, but also for the selfish reason that we don’t want to miss out on getting the best people.

There’s research (as well as anecdotal evidence) showing that the wording of job ads can make a big difference to who applies. In particular, it can significantly influence the gender profile of applicants.

My head of department Iain Gordon just pointed out a website by Kat Matfield where you can paste in your ad and get an automatic assessment of the language used. The site matches the ad against lists of “masculine-coded” and “feminine-coded” words and gives you a summary. The first link above is to the academic paper behind the website.
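
For the curious, here is a minimal sketch (in Python) of the kind of list-matching the site does, with short hypothetical word lists standing in for the much longer ones the site takes from the research linked above:

import re
from collections import Counter

# Hypothetical, heavily abbreviated word lists; the real site uses much
# longer lists drawn from the research linked above.
MASCULINE_CODED = {"competitive", "confident", "decisive", "independent", "leader"}
FEMININE_CODED = {"collaborative", "committed", "supportive", "interpersonal", "community"}

def assess_ad(text):
    """Count masculine- and feminine-coded words in a job ad and return
    a crude verdict, in the spirit of the site's summary."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    masculine = {w: words[w] for w in MASCULINE_CODED if words[w]}
    feminine = {w: words[w] for w in FEMININE_CODED if words[w]}
    m, f = sum(masculine.values()), sum(feminine.values())
    verdict = "masculine-coded" if m > f else "feminine-coded" if f > m else "neutral"
    return {"masculine": masculine, "feminine": feminine, "verdict": verdict}

print(assess_ad("We seek a confident, independent leader for a collaborative project."))

Pasting in a real ad amounts to little more than this kind of counting, which is why the results deserve a human sanity check.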

For example, we at Edinburgh are currently advertising a two-year postdoctoral fellowship in any area of mathematics. Try pasting the ad into the site and see what happens!

Posted at October 23, 2017 1:22 PM UTC


5 Comments & 0 Trackbacks

Re: Unconscious Bias in Recruiting

This is an interesting idea, but the application seems to be pretty unsubtle, simply counting words that are flagged as “masculine” or “feminine”. For example, the ad I ran through it flagged the word “committee” as “feminine,” but this word only comes up in the phrase explaining that applications should be sent to the (male) search committee chair. Similarly, “confidential” is flagged as “masculine,” though in this case it refers only to the confidentiality of the recommendation letters.

Posted by: Greg on October 24, 2017 4:55 AM

Re: Unconscious Bias in Recruiting

Yeah, undoubtedly it’s crude, and the site and its documentation are pretty clear about the limitations. It is at least backed up by some peer-reviewed research, but whatever the lists of words may be, there’s a limit to how nuanced a list-matching algorithm like this can be.

So, it’s a first attempt. But I’m interested in seeing how good this kind of thing can get. I gather that there have been real advances in automated semantic analysis recently, and it’s entirely plausible to me that one could program a computer to predict — with decent accuracy — the gender skew that a job ad is likely to provoke.
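
As a rough illustration (not anything the site does), here is what the simplest version of such a predictor might look like in Python, assuming a hypothetical labelled dataset of past ads together with the applicant gender split each one attracted; all of the texts and numbers below are placeholders:

# Sketch only: "ads" and "female_fraction" stand in for a hypothetical
# dataset of past job-ad texts and the observed share of female applicants.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

ads = [
    "Seeking a competitive, driven individual to lead our team.",   # placeholder text
    "Join a supportive, collaborative research community.",         # placeholder text
]
female_fraction = [0.12, 0.34]  # placeholder outcomes, not real data

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge())
model.fit(ads, female_fraction)

# Estimate the likely skew of a draft ad before publishing it.
print(model.predict(["We are advertising a two-year postdoctoral fellowship."]))

In practice this would need many ads and real applicant statistics, and presumably modern semantic models rather than bag-of-words features, but the shape of the problem is the same: text in, predicted skew out.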

Aside from all the obvious reasons, I have a kind of personal interest in this. When I applied for the position I currently hold, the application round was viewed as a kind of disaster, because it attracted an incredibly male-dominated set of applicants (I mean, markedly more so than is normal for mathematics, which is saying something). Afterwards, they realized that the wording of the ad was of the kind that apparently tends to cause that. They changed it, and in subsequent application rounds for the same fellowship got a better balance.

Posted by: Tom Leinster on October 25, 2017 1:13 AM

Re: Unconscious Bias in Recruiting

I could feel the difference in tone compared to the usual job ads of this sort, and I felt it was a positive change (I cannot overemphasise how much job applications freak me out). The grammatical error aside, the more independent attempts there are to write like this, and less like clinical, corporate and, dare I say it, dog-eat-dog screeds (where “we’ll keep your resume if anything else comes up” is a clear sop), the better. (Independent, so that people don’t fall into the trap of new clichés; it would also help to have some forked versions of that website around, with variations on the word list, other models, etc.)

Posted by: David Roberts on October 25, 2017 8:21 AM

Re: Unconscious Bias in Recruiting

Please, please, please, let’s not turn this into one of those threads. We all have already read enough oversimplifications, admittedly from both sides, on the rest of the Internet.

Posted by: Anonymous Coward on October 25, 2017 5:21 PM

Re: Unconscious Bias in Recruiting

Thanks; I agree. I just deleted the comment you’re referring to, which had nothing to do with unconscious bias in recruiting.

Please, let’s keep this on topic. There’s abundant documented evidence of the influence of unconscious bias in job recruitment, and it’s a problem with particular resonance for those of us involved in advertising, recruiting and hiring in subjects that are very male-dominated. I’m frankly not interested in hosting a debate on whether unconscious bias exists, or in providing a forum for rants about men vs. women. But I’d welcome thoughtful comments on the actual topic of this post.

Posted by: Tom Leinster on October 25, 2017 7:13 PM
