AI has the potential to revolutionise employers' ability to hire a diverse workforce.
Computers do not discriminate based on race, gender, sexual orientation or age, so why not let an intelligent algorithm make hiring decisions for you?
The logic appears sound: a computer can be more impartial than any person, looking only at the relevant data signals to select the most qualified people from a heap of applications, free from human bias or prejudice.
However, issues appear once you consider that an algorithm is vulnerable to inheriting the biases of its creators. If that happens, it can be argued that AI will simply reinforce the status quo rather than redefine it.
According to a study by researchers at Princeton University and the University of Bath, AI learns human bias much the same way a child might. The study focused on a technique called word embeddings, which, put simply, lets an AI learn which words are commonly associated with one another and use those associations to build a definition with more context and accuracy.
The problem is that the social usage of words typically carries some bias. As the study found, "female" and "woman" were "closely associated with arts and humanities occupations and with the home," while "male" and "man" were "closer to math and engineering professions."
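To make the mechanism concrete, here is a minimal sketch of how such an association can be measured in an embedding space. The word vectors below are invented toy values purely for illustration (real embeddings like word2vec or GloVe have hundreds of dimensions learned from text); the point is that cosine similarity between learned vectors is how "closeness" between words is typically quantified.

```python
import math

# Toy 3-dimensional word vectors; values are hypothetical, chosen only
# to illustrate the kind of association the study describes.
vectors = {
    "woman":    [0.9, 0.1, 0.3],
    "man":      [0.1, 0.9, 0.3],
    "arts":     [0.8, 0.2, 0.1],
    "engineer": [0.2, 0.8, 0.1],
}

def cosine(a, b):
    """Cosine similarity: how closely two word vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# If the training text mentioned "woman" alongside arts occupations more
# often than "man", the learned vectors encode that association:
print(cosine(vectors["woman"], vectors["arts"]))  # higher similarity
print(cosine(vectors["man"], vectors["arts"]))    # lower similarity
```

An AI that ranks CVs on top of such vectors inherits those distances: nothing in the maths distinguishes a useful association from a prejudiced one.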
So how can we ensure that AI is used effectively as a tool to aid diversity rather than hinder it? That depends on what the AI is being used for, and on how it learned to do it.
Human oversight is still necessary to ensure the AI isn’t simply replicating existing biases or introducing new ones based on the data it is fed.
Recruiting AI software can be tested for bias by using it to rank and grade candidates, and then assessing the demographic breakdown of those rankings. The great thing about testing software in this way is that if the AI does highlight a bias in your recruiting, you have a concrete opportunity to act on it.
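One simple form such a check can take is comparing selection rates across groups at a given score threshold. The sketch below uses invented candidate data and an assumed 0.6 cut-off, and applies the "four-fifths" rule of thumb (flag if any group's selection rate falls below 80% of the highest group's); the groups, scores, and threshold are all hypothetical.

```python
from collections import Counter

# Hypothetical output of an AI screening tool: each candidate gets a
# score, and belongs to a demographic group "A" or "B" (invented data).
candidates = [
    {"group": "A", "score": 0.91}, {"group": "A", "score": 0.84},
    {"group": "A", "score": 0.78}, {"group": "B", "score": 0.75},
    {"group": "A", "score": 0.70}, {"group": "B", "score": 0.55},
    {"group": "B", "score": 0.52}, {"group": "B", "score": 0.40},
]

def selection_rates(candidates, threshold):
    """Share of each group that the tool would advance past `threshold`."""
    totals, selected = Counter(), Counter()
    for c in candidates:
        totals[c["group"]] += 1
        if c["score"] >= threshold:
            selected[c["group"]] += 1
    return {g: selected[g] / totals[g] for g in totals}

rates = selection_rates(candidates, threshold=0.6)

# Four-fifths rule of thumb: flag the tool for review if the lowest
# group's selection rate is under 80% of the highest group's.
flagged = min(rates.values()) / max(rates.values()) < 0.8
print(rates, flagged)
```

With this toy data, group A advances at a much higher rate than group B and the check flags the tool, which is exactly the kind of signal that should trigger the human review described above.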
Aided by AI, employers can use human judgment and expertise to decide how to address any biases and improve hiring processes. Teaching AI to do as we say, not as we do.