
Work, Disability

Disabled job seekers disadvantaged by AI bias in hiring

Recent study reveals how AI ranks resumes with disability-related credentials lower

MMS Staff

23 Jun 2024

5-min read

The use of artificial intelligence (AI) tools such as ChatGPT in resume screening is becoming increasingly common among recruiters. 


And a recent study by researchers at the University of Washington has uncovered a significant issue: AI tools can be biased against resumes that imply a disability. 


This finding has profound implications for disability inclusion and rights, as it highlights how technological advancements can inadvertently reinforce existing prejudices. 


The study and its findings


The research, presented at the 2024 ACM Conference on Fairness, Accountability, and Transparency (FAccT), investigated how ChatGPT ranked resumes with disability-related credentials. 


Led by Kate Glazko, a doctoral student at the UW's Paul G. Allen School of Computer Science & Engineering, the study found that resumes with disability-related accolades — such as the "Tom Wilson Disability Leadership Award" — were consistently ranked lower than identical resumes without these credentials.


Worse, when the AI was asked to explain its rankings, it revealed biased perceptions of disabled people. 


For example, a resume with an autism leadership award was said to have "less emphasis on leadership roles," reinforcing the stereotype that autistic individuals are not capable leaders.


Attempting to mitigate bias


The researchers attempted to mitigate this bias by customising the AI with instructions to avoid ableism. 


While this approach reduced bias for five of the six disabilities tested (deafness, blindness, cerebral palsy, autism, and the general term "disability"), only three of them saw the enhanced resumes ranked higher than the control resumes that made no mention of disability.
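For readers curious what "customising the AI with instructions" can look like in practice, here is a minimal sketch. The prompt wording and the `build_ranking_request` helper are illustrative assumptions, not the study's actual code: the idea is simply to prepend an anti-ableism system message to the ranking request before it is sent to a chat model.

```python
# Illustrative sketch of instruction-based bias mitigation for a
# chat-model resume ranker. The prompt text and helper name are
# assumptions for illustration, not the researchers' actual setup.

def build_ranking_request(resume_a: str, resume_b: str) -> list[dict]:
    """Assemble a chat payload asking the model to rank two resumes,
    with a system instruction telling it to avoid ableist bias."""
    system_instruction = (
        "You are a fair resume screener. Do not penalise candidates for "
        "disability-related awards, advocacy, or DEI involvement; rank "
        "strictly on skills, experience, and demonstrated leadership."
    )
    user_prompt = (
        "Rank the following two resumes for the role, best first, and "
        "explain your reasoning.\n\nResume A:\n" + resume_a +
        "\n\nResume B:\n" + resume_b
    )
    return [
        {"role": "system", "content": system_instruction},
        {"role": "user", "content": user_prompt},
    ]

messages = build_ranking_request("...CV text...", "...CV text...")
```

As the study found, a written instruction like this helps but is no guarantee: it reduced bias for most, but not all, of the disabilities tested.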


"Ranking resumes with AI is starting to proliferate, yet there's not much research behind whether it's safe and effective," said Glazko, the study's lead author. "For a disabled job seeker, there's always this question when you submit a resume of whether you should include disability credentials. I think disabled people consider that even when humans are the reviewers."


Fair point. 


"In a fair world, the enhanced resume should be ranked first every time," said senior author Jennifer Mankoff, a UW professor in the Allen School. "I can't think of a job where somebody who's been recognized for their leadership skills, for example, shouldn't be ranked ahead of someone with the same background who hasn't." 


When researchers asked GPT-4 to explain the rankings, its responses exhibited explicit and implicit ableism. 


For instance, it noted that a candidate with depression had "additional focus on DEI and personal challenges," which "detract from the core technical and research-oriented aspects of the role."


"Some of GPT's descriptions would colour a person's entire resume based on their disability and claimed that involvement with DEI or disability is potentially taking away from other parts of the resume," Glazko said. 


"For instance, it hallucinated the concept of 'challenges' into the depression resume comparison, even though 'challenges' weren't mentioned at all. So you could see some stereotypes emerge."


Implications for disability inclusion and rights


These findings are troubling for several reasons. 


First, they reveal that AI tools can reinforce biases, undermining efforts to promote disability inclusion and rights. If AI tools are used to screen resumes, disabled candidates may be unfairly disadvantaged, even if they possess the necessary qualifications and leadership skills.


Second, the study underscores the broader issue of how technological advancements can replicate and amplify real-world biases. 


AI systems learn from existing data, which often contain historical biases and ableist notions. Without careful oversight and correction, these systems can perpetuate, and even worsen, existing discriminatory practices.


Challenges disabled people face in finding meaningful employment


Disabled individuals face numerous barriers when seeking employment, including:


  • Biased hiring practices, discrimination, and stereotypes about disabilities.

  • Many workplaces are not fully accessible, creating physical and technological barriers for disabled employees.

  • Fewer job opportunities are often available to disabled individuals, particularly in competitive fields.

  • Employers may be unwilling or unable to provide necessary accommodations, such as flexible working hours or assistive technologies.

  • Negative attitudes and misconceptions about disabilities can affect workplace interactions and professional development opportunities.


Tips for employers to promote fair and inclusive hiring


To ensure fairness and inclusivity in hiring and promotion, employers can take the following steps:


  • Provide bias training for hiring managers and staff to recognize and counteract biases.

  • Write inclusive job descriptions that emphasise essential skills and competencies rather than unnecessary physical requirements.

  • Ensure that job application processes are accessible to all candidates, including those with disabilities.

  • Offer reasonable accommodations during the hiring process and in the workplace.

  • Use diverse hiring panels to reduce individual biases and ensure a variety of perspectives in the hiring process.

  • Clearly define and communicate the criteria for hiring and promotion, focusing on skills, experience, and potential rather than assumptions about disability.

  • Provide ongoing support and development opportunities for disabled employees to thrive in their roles.

  • Implement feedback mechanisms to allow disabled employees to voice concerns and suggest improvements. 


The nullification of disability inclusion and rights


The use of biased AI in resume screening effectively nullifies inclusion efforts towards communities whose identities have historically been marginalised and stigmatised. 


Disabled individuals already face numerous barriers in the job market, and AI tools that perpetuate age-old biases add yet another layer of discrimination, something we don't need at all. 


This issue is particularly concerning given the increasing reliance on AI in hiring processes.


In a fair scenario, resumes that highlight leadership skills and achievements, whether disability-related or not, would be ranked on the candidate's qualifications and potential. 


The fact that AI tools can diminish the value of disability-related credentials is a reminder that technological solutions must be developed and implemented with a strong emphasis on fairness and inclusivity.


Ensuring fairness in AI


To address these issues, it is crucial for developers, researchers, and companies to prioritise fairness in AI development. This includes:


  • Conducting regular audits of AI systems to identify and correct biases.

  • Ensuring that training data includes diverse representations of disabled individuals and their achievements.

  • Providing clear guidelines to AI systems on avoiding ableism and other forms of discrimination.

  • Incorporating human oversight in the resume screening process to catch and address biases that AI may miss.
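The audit step above can be sketched as a paired-resume test, echoing the study's own method: feed a ranker two identical resumes, one of which adds only a disability-related credential, and measure how often the enhanced resume wins. The `audit_ranker` harness and the stand-in ranker below are illustrative assumptions, placeholders for whatever screening tool is actually under audit.

```python
# Illustrative paired-resume audit harness. A fair ranker should prefer
# the enhanced resume (it adds a credential and removes nothing), so a
# win rate well below 1.0 is a red flag. Names here are hypothetical.
import random

def audit_ranker(rank_pair, base_resume: str, credential: str,
                 trials: int = 100) -> float:
    """Return the fraction of trials where the resume enhanced with a
    disability-related credential is ranked first by rank_pair."""
    enhanced = base_resume + "\n" + credential
    wins = 0
    for _ in range(trials):
        first, _second = rank_pair(enhanced, base_resume)
        if first == enhanced:
            wins += 1
    return wins / trials

# Stand-in for the real screening tool under audit: it ignores content
# and picks at random, so the audit should report a rate near 0.5.
def random_ranker(a: str, b: str):
    return (a, b) if random.random() < 0.5 else (b, a)

rate = audit_ranker(random_ranker, "BSc CS; 5 yrs Python",
                    "Tom Wilson Disability Leadership Award")
```

Run regularly against the live screening system, a harness like this turns "audit for bias" from a principle into a number that can be tracked and acted on.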


By taking these steps, we can work towards a future where AI tools contribute to, rather than hinder, disability inclusion and rights. 


Ensuring that technological advancements promote fairness and equality is essential for building a more inclusive society.


More information: Kate Glazko et al, Identifying and Improving Disability Bias in GPT-Based Resume Screening, The 2024 ACM Conference on Fairness, Accountability, and Transparency (2024). DOI: 10.1145/3630106.3658933


This article was originally published on Techxplore.com
