Monday, 21 October 2019

Can AI eliminate discrimination in job recruitment?

Might advanced technology help stop the problem of recruiters discriminating unfairly against job applicants?

Russell Cavanagh writes about age discrimination and other issues affecting the employment market.

Despite Economic Survivor still being a young blog, there are plenty of articles here confirming what many readers (well, both of you) likely know already - that no matter what laws exist, age discrimination remains rife in the realm of employment.

Office decoration

I was once on a job recruitment panel that chose the young female candidate over the more experienced older man. In retrospect, it seemed a wee bit like choosing how to decorate the office best.

Although I wasn't entirely happy with my then manager's final say, I was still a decade or two away from standing in the same shoes the older guy wore that day. My younger self therefore saw no great issue.

But in truth, despite my thinking at the time that my boss was more interested in improving the office feng shui, our choice proved very good at her job.

And yet she likely wouldn't have got it had automation been available in the recruitment process way back around 2003. Here's why ...

Automating the process

Automation is used regularly in recruitment these days, particularly in the initial stages of sifting CVs.

Savvy job applicants therefore know to make the "key skills" claimed on their CVs correspond with the specifications stated in the vacancy ad. Any recruitment professional will confirm that this boosts a candidate's chances of success.

The trick is to get noticed and selected. This works because recruitment software uses algorithms that find specified keywords in order to sift and rank submissions.
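That sifting step can be pictured with a minimal sketch. The keywords, CV snippets, and scoring rule below are all assumptions for illustration, not how any real applicant tracking system works:

```python
# A toy sketch of keyword-based CV sifting: score each CV by how many
# of the advertised "key skills" it mentions, then rank candidates.
# All names and texts here are made up for illustration.
job_keywords = {"python", "sql", "project management"}  # from the vacancy ad

cvs = {
    "candidate_1": "Experienced in Python and SQL reporting.",
    "candidate_2": "Strong project management background; knows Python and SQL.",
    "candidate_3": "Retail experience, customer service.",
}

def keyword_score(cv_text: str, keywords: set) -> int:
    """Count how many advertised keywords appear in the CV text."""
    text = cv_text.lower()
    return sum(1 for kw in keywords if kw in text)

# Rank CVs from most to fewest keyword matches.
ranked = sorted(cvs, key=lambda name: keyword_score(cvs[name], job_keywords),
                reverse=True)
print(ranked)  # ['candidate_2', 'candidate_1', 'candidate_3']
```

Crude as it is, this is why mirroring the ad's wording helps: a CV that never uses the advertised phrases scores zero, however capable the person behind it.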

If we can trust the tech then we may also rely on the decisions it makes:

Mathias Linnemann
05 October 2019

"Algorithms are great for removing bias in recruitment, because they don’t take into account variables such as age, name, race, or gender. Algorithms only look at people’s competencies to find the right person for the particular job. In that sense, biases are not even an option."

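The "blind screening" idea in the quote above can be sketched very simply. The field names below are hypothetical, chosen only to make the point:

```python
# A toy sketch of blind screening: strip demographic fields from an
# applicant record before it ever reaches the ranking step, so the
# scorer only sees competencies. Field names are hypothetical.
applicant = {
    "name": "Jane Doe",
    "age": 54,
    "gender": "F",
    "skills": ["python", "sql", "project management"],
    "years_experience": 20,
}

DEMOGRAPHIC_FIELDS = {"name", "age", "gender", "race"}

def anonymise(record: dict) -> dict:
    """Return a copy of the record with demographic fields removed."""
    return {k: v for k, v in record.items() if k not in DEMOGRAPHIC_FIELDS}

print(anonymise(applicant))
# {'skills': ['python', 'sql', 'project management'], 'years_experience': 20}
```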

Here's where it gets more tricky

The term "AI" usually stands for "artificial intelligence", with sci-fi connotations often presented as taking place either now or in the near future.

But existing AI is really just "automated intelligence" that manages relatively simple functions, albeit rather a lot of them at the same time, in the form of subroutines processing "what if" commands.

The final result, in the case of recruitment, aims to be a list of suitably qualified and experienced candidates.

Nonetheless, Natalia Kukushkina of the tech and human resources website wrote in a company blog post that "there are certain hidden rocks to keep in mind when implementing AI" and that more advanced software can cause unforeseen problems:
Natalia Kukushkina
25 July 2019

"AI learns on past data – and past data consists of a set of human behavior. So if an AI model sees that you’ve said no to 95% of black men in the past, it may as well learn that such candidates should not be considered at all.

"To fix the issue, you will need a Data Scientist to remove all the patterns of bias and make sure AI will learn from non-biased data set."

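The mechanism Kukushkina describes can be shown with a toy model. The data below is invented to illustrate the trap, and the "model" is just a historical hire rate per group, far simpler than any real system:

```python
from collections import defaultdict

# A toy illustration of a model learning bias from past hiring data.
# Hypothetical history in which group B candidates were always rejected.
past_decisions = [
    ("A", "hired"), ("A", "hired"), ("A", "rejected"),
    ("B", "rejected"), ("B", "rejected"), ("B", "rejected"),
]

# "Training": record each group's historical hire rate.
counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
for group, outcome in past_decisions:
    counts[group][1] += 1
    if outcome == "hired":
        counts[group][0] += 1

def score(group: str) -> float:
    """Score a new candidate purely from their group's past hire rate."""
    hired, total = counts[group]
    return hired / total

# The model now rates every group-B candidate at zero, regardless of
# competence -- the bias was baked into the training data.
print(score("B"))  # 0.0
```

This is the "hidden rock": nothing in the code mentions race or gender, yet the pattern in the history does the discriminating, which is why the biased patterns have to be scrubbed from the training data first.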

Eh, hang on a minute. "A Data Scientist" can "fix the issue"?


If you like my work here and want it to continue, please consider a one-off donation by card or PayPal. Many thanks!