Do AI-led Interviews Disadvantage Disabled Jobseekers?
Getting out a laptop mid-interview would usually be a controversial choice. But for many interviews, now conducted remotely, a laptop is the only requirement. These interviews, which often use artificial intelligence in the sifting process, have been heralded as more efficient, effective and convenient than in-person interviews. However, there is a risk that this technology disadvantages disabled jobseekers, whether through implicit biases or specific difficulties within the process.
Dorian, for example, found their remote interview difficult due to their autism: “‘Being professional’ just means masking my autism, appearing as ‘high-functioning’ as possible while using a lot of my attention to understand my interviewers’ reactions”. With no person on the other side of the screen, this was impossible. The interview made Dorian feel “pretty worried” and like “a stage actor suddenly forced in front of a camera with no audience”.
Another jobseeker, Blythe, had a similar experience. They found the lack of non-verbal cues a barrier, alongside “not having the option to ask for a rephrasing of questions”. The software used by the company they applied through also allowed only two takes per question, which “wasn’t in any way helpful to stress levels”, especially when their autism already makes interviews difficult. The experience was tough; “I don't think I'd have applied if I had known I'd have to go through with it”, they concluded.
Facial analysis software, which judges applicants’ facial expressions during interviews, has caused particular controversy. Those who struggle with eye contact, for example due to neurodivergence, could be marked lower. In January, HireVue – used by over 100 employers to screen applicants – announced that it would discontinue facial analysis during interviews. However, the company would continue to use Natural Language Processing (NLP).
NLP uses algorithms to make sense of language, examining each candidate’s competency based on their verbal responses, including intonation. The company said that, due to NLP’s evolution, “visual analysis no longer significantly added value to assessments” but also noted that public outcry over the software had led to “concern”.
For disabled people, NLP can be especially troubling. If their speech is affected – either directly (for example, through a stutter) or indirectly, via brain fog – then NLP may pick up on this. Merve Hickok, a lecturer and speaker on AI bias and founder of Lighthouse Career Consulting, told SIC that “the NLP model might pick up these signals and score you low wherever the system is looking for fluent speech”. She continued: “The designers of these algorithms, by definition, make a decision on what ‘good’ speech should sound like”. This is intersectional; those for whom English is not a first language, who have a regional accent, or who use slang may be marked lower.
Diane Lightfoot, CEO of the Business Disability Forum, echoed these concerns, telling SIC that AI systems should be designed inclusively and taught to “focus on the context of what a candidate is saying rather than the delivery”.
Some companies that use software like this don’t even use the AI features; instead, staff manually assess the footage. However, no matter what remote interviews are actually assessing, many jobseekers assume the worst. “I'm a cynic so I genuinely think the software analysed everything”, Blythe told SIC. This can cause stress and anxiety which then impacts interview performance, especially for those who may already feel they will be judged because of their disability.
Ellen Jones, a Change Enabler at inclusive workplace consultancy Utopia, explained that this results in a lack of “psychological safety”. Clearer processes surrounding what is being assessed and transparency regarding who will be assessing applications would help here, alongside a more holistic process for disabled candidates which considers individual adjustments.
Diane also stressed the significance of adjustments – “more widely, at every stage of the recruitment process, companies should ask every candidate if they need any adjustments – and then be prepared to make them”. Suggested adjustments included extra time, providing explanations or more information for questions, providing questions in advance or offering a work trial.
While some of these changes are possible through the video interview software (for example, most provide additional time), there are only so many changes that can be made. Reflecting on their experience, Dorian said, “I’m not sure how much better the recorded interview format could be made. Without the reactions to play off of, there isn’t much extra accommodations could do”.
Adjustments shouldn’t be made in isolation. Ellen emphasised that any process making remote interviews accessible “needs to be something which is developed alongside an inclusion strategy more broadly across the company”. Otherwise, disabled jobseekers may find that they are successful in entering the workplace but struggle in succeeding once they arrive.
Artificial intelligence can help here. Diane expanded on potential uses for AI in the workplace, such as automating tasks like scheduling or basic admin to “free up more time for interaction”, analysing gender and other pay gaps to create a more inclusive workforce, and even understanding the reasons why employees leave the company, encouraging a better consideration of diversity and inclusion.
Due to their convenience for employers, it’s likely that remote interviews are here to stay, with AI still involved in the sifting process in some form. More clarity regarding these assessment processes, along with a holistic understanding of adjustments and alternative routes, can support disabled jobseekers, enabling them to find empowering and fulfilling work.
Written by Abigail Howe