Friday, 15 Nov 2024

Robot recruiters: can bias be banished from AI hiring?


Michael Scott, the protagonist of the US version of The Office, is using an AI recruiter to hire a receptionist.

Guardian Australia applies.

The text-based system asks applicants five questions that delve into how they responded to past work situations, including dealing with difficult colleagues and juggling competing work demands.

The process would typically create a shortlist an employer can follow up on, with insights on personality markers including humility, extraversion and conscientiousness.

For customer service roles, it is designed to help an employer know whether someone is amiable. For a manual role, an employer might want to know whether an applicant will turn up on time.

The selling points of AI hiring are clear: it can automate costly and time-consuming processes for businesses and government agencies, especially in large recruitment drives for non-managerial roles.

Sapia, the company behind the chat-based tool being demonstrated, is not the only AI company claiming its technology will reduce bias in the hiring process. A host of companies around Australia are offering AI-augmented recruitment tools, including not just chat-based models but also one-way video interviews, automated reference checks, social media analysers and more.

In 2022 a survey of Australian public sector agencies found at least a quarter had used AI-assisted tech in recruitment that year. Separate research from the Diversity Council of Australia and Monash University suggests that a third of Australian organisations are using it at some point in the hiring process.

Applicants, though, are often not aware that they will be subjected to an automated process, or on what basis they will be assessed within that.

Research out of the US in 2020 demonstrated that facial-analysis technology created by Microsoft and IBM, among others, performed better on lighter-skinned subjects and men, with darker-skinned women most often misgendered by the programs.

Natalie Sheard, a lawyer and PhD candidate at La Trobe University whose doctorate examines the regulation of and discrimination in AI-based hiring systems, says this lack of transparency is a huge problem for equity.

Australia has no laws specifically governing AI recruitment tools. While the Department of Industry has developed an AI ethics framework, which includes principles of transparency, explainability, accountability and privacy, the code is voluntary.

Sapia's chief executive, Barb Hyman, says client feedback and independent research show that the broader community is comfortable with recruiters using AI.

In the Sapia demonstration, the AI quickly generates brief notes of personality feedback at the end of the application for the interviewee.

This is based on how someone rates on various markers, including conscientiousness and agreeableness, which the AI matches with pre-written phrases that resemble something a life coach might say.
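
How that matching step might work can be sketched in a few lines of code. The example below is purely illustrative and is not Sapia's implementation: the trait names come from the article, but the 0-1 scale, the score bands and the phrase bank are invented for the sketch.

```python
# Illustrative sketch only -- not Sapia's actual system. It assumes the model
# outputs a 0-1 score per personality marker and that feedback is assembled
# from a bank of pre-written phrases keyed by trait and score band.

TRAITS = ("conscientiousness", "agreeableness", "extraversion", "humility")

# Hypothetical phrase bank; a real product would use far richer templates.
PHRASES = {
    ("conscientiousness", "high"): "You follow through on commitments and sweat the details.",
    ("conscientiousness", "low"): "You may benefit from building more structure into your work.",
    ("agreeableness", "high"): "You bring warmth and cooperation to a team.",
    ("agreeableness", "low"): "You are comfortable challenging others when you disagree.",
    ("extraversion", "high"): "You draw energy from working with people.",
    ("extraversion", "low"): "You do your best thinking with quiet focus.",
    ("humility", "high"): "You share credit readily and stay open to feedback.",
    ("humility", "low"): "You back your own judgment with confidence.",
}


def band(score: float) -> str:
    """Bucket a 0-1 trait score into a coarse band."""
    return "high" if score >= 0.5 else "low"


def feedback(scores: dict[str, float]) -> str:
    """Assemble life-coach style feedback from pre-written phrases."""
    lines = [PHRASES[(trait, band(scores[trait]))] for trait in TRAITS if trait in scores]
    return " ".join(lines)


if __name__ == "__main__":
    print(feedback({"conscientiousness": 0.8, "agreeableness": 0.4, "extraversion": 0.7}))
```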

Sapia says its chat-interview software also analyses language proficiency and includes a profanity detector, features the company says are important for customer-facing roles.
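
At its simplest, a profanity check can be a screen against a word list, and a proficiency score can be a crude statistic over the applicant's answers. The sketch below is a generic illustration of both ideas; the blocklist, the average-word-length proxy and the regular expression are assumptions for the sketch, not details of Sapia's software.

```python
import re

# Hypothetical blocklist for illustration; a production system would use a
# much larger, curated list and handle obfuscated spellings.
PROFANITY = {"damn", "hell"}


def contains_profanity(answer: str) -> bool:
    """Flag an answer if any word matches the blocklist."""
    words = re.findall(r"[a-z']+", answer.lower())
    return any(word in PROFANITY for word in words)


def proficiency_proxy(answer: str) -> float:
    """Crude stand-in for language proficiency: average word length,
    in place of the richer linguistic features a real system would compute."""
    words = re.findall(r"[a-z']+", answer.lower())
    return sum(len(w) for w in words) / len(words) if words else 0.0


if __name__ == "__main__":
    sample = "I stayed calm and worked through the backlog with my colleague."
    print(contains_profanity(sample), round(proficiency_proxy(sample), 2))
```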

So, could Guardian Australia work for Michael Scott at the fictional paper company Dunder Mifflin?
