AI recruitment tools are “automated pseudoscience”, say Cambridge researchers
<p>AI is set to usher in a new era across a huge range of industries. Everything from art to medicine is being overhauled by machine learning.</p>
<p>But researchers from the University of Cambridge have published a paper in <a href="https://link.springer.com/journal/13347" target="_blank" rel="noopener"><em>Philosophy & Technology</em></a> calling out AI tools used to recruit job candidates and boost workplace diversity – going so far as to label them “automated pseudoscience”.</p>
<p>“We are concerned that some vendors are wrapping ‘snake oil’ products in a shiny package and selling them to unsuspecting customers,” said co-author Dr Eleanor Drage, a researcher in AI ethics.</p>
<p>“By claiming that racism, sexism and other forms of discrimination can be stripped away from the hiring process using artificial intelligence, these companies reduce race and gender down to insignificant data points, rather than systems of power that shape how we move through the world.”</p>
<p>Recent years have seen the emergence of AI tools marketed as an answer to a lack of diversity in the workforce. These range from chatbots and resume scrapers used to line up prospective candidates, through to analysis software for video interviews.</p>
<p>Those behind the technology claim it cancels out human biases against gender and ethnicity during recruitment, instead using algorithms that read vocabulary, speech patterns, and even facial micro-expressions to assess huge pools of job applicants for the right personality type and ‘culture fit’.</p>
<p>But AI isn’t very good at removing human biases. To train a machine-learning algorithm, you first have to feed it lots and lots of past data – and that data carries the biases of past decisions. In fields where more men were traditionally hired, for example, AI tools have discounted women altogether. <a href="https://www.theguardian.com/technology/2018/oct/10/amazon-hiring-ai-gender-bias-recruiting-engine" target="_blank" rel="noopener">In a system created by Amazon</a>, resumes were penalised if they included the word ‘women’s’ – as in “women’s debating team” – and graduates of two all-women’s colleges were downgraded. Similar problems occur with race.</p>
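<p>To make that mechanism concrete, here is a minimal sketch in Python. The resumes, hiring labels, and model are all invented for illustration – this is not Amazon’s system – but it shows how a classifier trained to imitate past decisions absorbs the correlations those decisions contain:</p>
<pre><code># A toy text classifier trained on (hypothetical) past hiring outcomes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented historical data: the successful resumes happen not to mention
# "women's", so the word correlates with rejection.
resumes = [
    "captain of chess club, software engineering internship",
    "software engineering internship, hackathon winner",
    "captain of women's debating team, software engineering internship",
    "women's coding society president, hackathon winner",
]
hired = [1, 1, 0, 0]  # the past (biased) outcomes the model learns to imitate

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token "women" is negative: the bias has been
# encoded, not removed.
idx = vectorizer.vocabulary_["women"]
print(model.coef_[0][idx])  # &lt; 0
</code></pre>
<p>Nothing in this code mentions gender as a category; the penalty emerges purely from the training data, which is why scrubbing explicit attributes does not de-bias a model.</p>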
<div class="newsletter-box">
<div id="wpcf7-f6-p218666-o1" class="wpcf7" dir="ltr" lang="en-US" role="form">
<form class="wpcf7-form mailchimp-ext-0.5.62 resetting spai-bg-prepared" action="/technology/ai-recruitment-tools-diversity-cambridge-automated-pseudoscience/#wpcf7-f6-p218666-o1" method="post" novalidate="novalidate" data-status="resetting">
<p style="display: none !important;"><span class="wpcf7-form-control-wrap referer-page"><input class="wpcf7-form-control wpcf7-text referer-page" name="referer-page" type="hidden" value="https://cosmosmagazine.com/technology/" data-value="https://cosmosmagazine.com/technology/" aria-invalid="false" /></span></p>
<p><!-- Chimpmail extension by Renzo Johnson --></form>
</div>
</div>
<p>The Cambridge researchers suggest that even if you remove ‘gender’ or ‘race’ as distinct categories, the use of AI may ultimately increase uniformity in the workforce. This is because the technology is calibrated to search for the employer’s fantasy ‘ideal candidate’, which is likely based on demographically exclusive past results.</p>
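<p>The dynamic is easy to illustrate with a hypothetical similarity-based screener. In the sketch below the feature vectors are invented and no ‘gender’ or ‘race’ field appears anywhere – yet ranking applicants by resemblance to a prototype averaged from past hires still rewards whoever looks most like the existing workforce:</p>
<pre><code>import numpy as np

# Invented feature vectors for previous (demographically homogeneous) hires,
# e.g. hobby, vocabulary and background signals scraped from resumes.
past_hires = np.array([
    [0.9, 0.1, 0.8],
    [0.8, 0.2, 0.9],
])
ideal = past_hires.mean(axis=0)  # the employer's 'ideal candidate' prototype

applicants = {
    "similar_to_past_hires":   np.array([0.85, 0.15, 0.85]),
    "different_but_qualified": np.array([0.20, 0.90, 0.30]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means 'identical direction' in feature space."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

for name, features in applicants.items():
    print(name, round(cosine(features, ideal), 3))
# similar_to_past_hires   ~1.0  -> advances
# different_but_qualified ~0.48 -> filtered out
</code></pre>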
<p>The researchers went a step further, working with a team of Cambridge computer science undergraduates to build an AI tool modelled on the technology. You can check it out <a href="https://personal-ambiguator-frontend.vercel.app/" target="_blank" rel="noopener">here</a>.</p>
<p>The tool demonstrates how arbitrary changes in facial expression, clothing, lighting and background can give radically different personality readings – and so could make the difference between rejection and progression.</p>
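<p>A toy model shows how fragile such readings can be. In this hypothetical sketch, random weights stand in for a trained video-interview scorer; changing nothing about the applicant but the lighting shifts the ‘personality’ score:</p>
<pre><code>import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=64 * 64)  # stand-in for a trained model's parameters

def personality_score(image):
    """Toy scorer: a weighted sum over raw pixel values."""
    return float(image.flatten() @ weights)

face = rng.random((64, 64))               # the same applicant...
brighter = np.clip(face + 0.2, 0.0, 1.0)  # ...under brighter lighting

print(personality_score(face))      # one 'personality' reading
print(personality_score(brighter))  # a noticeably different one
</code></pre>
<p>A real system is far more complex, but the point stands: if the score depends on pixel-level conditions, it depends on lighting, background and camera as much as on the person.</p>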
<p>“While companies may not be acting in bad faith, there is little accountability for how these products are built or tested,” said Drage.</p>
<p>“As such, this technology, and the way it is marketed, could end up as dangerous sources of misinformation about how recruitment can be ‘de-biased’ and made fairer.”</p>
<p>The researchers suggest that these programs are a dangerous example of ‘technosolutionism’: turning to technology to provide quick fixes for deep-rooted discrimination issues that require investment and changes to company culture.</p>
<p>“Industry practitioners developing hiring AI technologies must shift from trying to correct individualized instances of ‘bias’ to considering the broader inequalities that shape recruitment processes,” <a href="https://link.springer.com/article/10.1007/s13347-022-00543-1" target="_blank" rel="noopener">the team write in their paper</a>.</p>
<p>“This requires abandoning the ‘veneer of objectivity’ that is grafted onto AI systems, so that technologists can better understand their implication — and that of the corporations within which they work — in the hiring process.”</p>
<p><em>Written by Jacinta Bowler. Republished with permission of <a href="https://cosmosmagazine.com/technology/ai-recruitment-tools-diversity-cambridge-automated-pseudoscience/" target="_blank" rel="noopener">Cosmos Magazine</a>.</em></p>
<p><em>Image: Cambridge University</em></p>