
Warning Job Applicants With Disabilities May Miss Out if Artificial Intelligence Used to Hire


Melanie Carroll
Aug 04 2021


New Zealand employers are crying out for skilled workers but may be ruling out potential talent without realising it.

Many large organisations use algorithms to assess performance in job interviews and about 40 per cent use artificial intelligence (AI) when screening potential candidates, according to a global report by business consultancy Accenture.

But algorithms were only as good as the people programming them, and there were concerns about inbuilt discrimination against job applicants with disabilities, said Jonathan Mosen, chief executive of Workbridge.

“There is no stopping the progress of technology, so I don’t think it is going to do us any good to lament the advancements in tech but I think there are some cautions,” said Mosen, who is totally blind and hearing impaired. He heads an organisation which helps people with disabilities, mental illness and other conditions into employment.

Only 23 per cent of disabled people who wanted to work were currently in work, compared with about 70 per cent of non-disabled people, he said.

“The participation rate is very low among disabled people already, so it is not as if we have got a lot to lose. But I think potentially we do have some things to lose.”


Taking the human element out could make job applications even more difficult for people with disabilities.

“We encourage disabled people to go into that job interview and if they can read the signs they can say: I know you have questions about how I could do this job with my particular impairment, I would love the opportunity to talk that through with you if you would not mind.

“But if you have got artificial intelligence that is programmed by people who are making assumptions about certain criteria, like speed or the tone of someone’s voice or anything like that, then you may inadvertently be screening out a really capable candidate.”

Barriers already existed for job applicants with disabilities, Mosen said. When he previously applied for a chief executive’s job, the psychometric testing for candidates was completely inaccessible to him as a blind person, he said.

People with disabilities needed to be part of the artificial intelligence design process, which would ensure the dataset was inclusive, he said.


“No business can afford to lose the competitive edge that employing a disabled person might bring. They may be out-of-box thinkers – a lot of disabled people are because we have to be.”

Technology in itself was not the issue and had actually made employment more accessible, he said.

“Now, offices are electronic, blind people have computers that will display in braille what is on the screen or speak what is on the screen, so it is really no impediment at all.”

The biggest problem he faced was not so much being disabled as other people’s perceptions about what he could or could not achieve, and he worried those perceptions would find their way into algorithms.

With more than 160,000 disabled New Zealanders seeking work, a more inclusive design of artificial intelligence was essential, said Accenture New Zealand managing director Ben Morgan.


Executives were often unaware that AI-powered recruitment and performance systems might not take people’s physical, cognitive or sensory disabilities into account.

“Assessments that evaluate employees based on algorithmic perception of their facial expressions, tone of voice or movements can penalise people with disabilities,” Morgan said.

Artificial intelligence was used in New Zealand in the hiring process, for example to screen CVs. It was also used in employee evaluation, with metrics to measure employee performance, he said.

“The vast majority of New Zealand organisations using this type of screening technology would buy it from third-party software providers.

“We would really encourage organisations to get behind the covers of some of those products and look for how some of those products are taking an inclusive design approach, or not, and then importantly how it is applied within their organisation.”

While 67 per cent of high-level executives believed they had built a supportive workplace for disabled employees, only 41 per cent of employees with disabilities agreed.

A combination of technology and human input was the answer for a more inclusive workplace, Morgan said, along with independent auditing of organisations’ algorithmic assessments and datasets.

“Many with disabilities conceal conditions such as dyslexia but organisations cannot incorporate these perspectives if people are not comfortable being open and giving their views. Fostering a positive culture of safe self-disclosure is vital to achieving responsible and ethical AI.”

Otago University AI expert Professor Colin Gavaghan said that, to his knowledge, nobody knew exactly how much the technology was being used in New Zealand, which was a concern.


But it was being used, for example in HireVue software, which used facial recognition in job interviews.

Bias could creep in at any stage of the hiring process but Gavaghan was particularly concerned about who the job advert was targeted at in the first place.

“So if you never find out about a vacancy, you are never going to apply for it. Then the initial filtering of the applications, before you get to the short-list and interview stage, I think a lot of damage could be done potentially at that stage.”

If a system was 90 per cent accurate, it had to be clear what the other 10 per cent was and who it affected, “because if the 10 per cent of times it goes wrong always affects disabled people or Māori people, that is a disaster”.
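Gavaghan’s 90 per cent example can be made concrete with a simple subgroup audit: rather than quoting one overall accuracy figure, an auditor breaks a screening tool’s mistakes down by group and checks whether the errors cluster on particular applicants. A minimal sketch of that idea in Python follows, using a hypothetical list of screening decisions with illustrative field names (group, qualified, screened_out); it is not any vendor’s actual data format.

# Hypothetical audit: wrongful-rejection rate of a screening tool, broken down by group.
# The records and field names below are illustrative only.
from collections import defaultdict

decisions = [
    {"group": "disabled", "qualified": True, "screened_out": True},
    {"group": "disabled", "qualified": True, "screened_out": False},
    {"group": "non-disabled", "qualified": True, "screened_out": False},
    {"group": "non-disabled", "qualified": True, "screened_out": True},
    {"group": "non-disabled", "qualified": True, "screened_out": False},
]

# Count, per group, how often qualified applicants were wrongly screened out.
errors = defaultdict(lambda: {"wrongly_rejected": 0, "qualified": 0})
for d in decisions:
    if d["qualified"]:
        errors[d["group"]]["qualified"] += 1
        if d["screened_out"]:
            errors[d["group"]]["wrongly_rejected"] += 1

for group, counts in errors.items():
    rate = counts["wrongly_rejected"] / counts["qualified"]
    print(f"{group}: {rate:.0%} of qualified applicants screened out")

If the wrongful-rejection rate is materially higher for one group, the 10 per cent of cases the tool gets wrong is not randomly distributed, which is exactly the outcome Gavaghan warns against.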

Adam Shapley, managing director of recruitment firm Hays in New Zealand, said most large recruiters were using AI to some extent, for speed and efficiency.

Recruiters were very aware of the barriers facing people with disabilities, Shapley said.

“In fact, it’s often recruiters who are working with hiring managers on how to improve diversity and inclusion in both the recruitment process and the workplace.”

Whenever AI was involved, it was important to be able to clearly explain why recommendations were made and on what basis.

Original at https://www.stuff.co.nz/business/industries/125949927/warning-job-applicants-with-disabilities-may-miss-out-if-artificial-intelligence-used-to-hire

Source: https://www.accessibilitynewsinternational.com/warning-job-applicants-with-disabilities-may-miss-out-if-artificial-intelligence-used-to-hire/
