A new survey from Roy Morgan has revealed that 57% of Australians believe that artificial intelligence (AI) causes more problems than it solves, with women making up the bulk of those sceptics. However, when it comes to AI being used for recruitment, the numbers tell a very different story.
The data from Roy Morgan (which came from a survey of almost 1,500 Australians) showed that 67% of people who believe AI causes more problems than it solves were women. The data also showed that older Australians and those from rural areas were more likely to be sceptical of AI.
PwC’s 2021 AI Predictions Survey also found that Australia is one of the most nervous nations when it comes to AI adoption.
And none of this is particularly surprising. While AI is nothing new, it’s exploded in the mainstream over the past year. Adoption of generative AI in particular has skyrocketed, while laws, regulations and education have struggled to keep pace.
Of course, scepticism is rife when the critical safeguards and considerations in the space still need to be implemented, and in some cases, legislated.
That being said, once you dive deeper and ask more specific questions, you may find that there is a little more trust in AI than what you’d expect — especially when it comes to bias in the workforce.
Human bias vs AI bias in recruitment
Bias in AI is rampant. And this is largely because much of it has been trained on data that is itself biased.
It’s a real problem that has to be solved, particularly within businesses. IBM’s 2022 Global AI Adoption Index revealed that 74% of organisations using AI aren’t reducing unintended bias.
But what about human bias? We all have biases, whether intentional or not. We already know that bias has a huge impact on the workforce when it comes to things like equal pay, promotions and even hiring.
Those of us who have been on the receiving end — particularly women and people of colour — are acutely aware of this. And it seems it’s even having an impact on how some of us view AI.
A recent white paper titled Does Artificial Intelligence Help or Hurt Gender Diversity? found that women were 30% more likely than men to complete an application if AI was involved in the recruitment process.
“Complementary evidence from two surveys suggests that the gender treatment effect is driven by applicants’ perceptions of the relative bias they experience from AI vs. human evaluators,” the white paper says.
It also found that human evaluators tend to score women “substantially lower” than men when gender-revealing names are shown with applications.
“We find that adding AI to recruitment increases the representation of women at the 50th percentile of evaluated applicants by 30%.”
This is particularly damning when you consider the above data that suggests the propensity for Australians, especially women, to be wary of AI.
How bad must bias be in the workforce — particularly in recruitment — if a demographic inclined to be sceptical of AI is nonetheless more willing to trust it than human evaluators?
This leaves me with two thoughts — that equality in the workplace continues to be a sad state of affairs, and that when it comes to asking for perspectives on AI, you need to be asking the right questions.