Making hiring technology accessible means ensuring both that a candidate can use the technology and that the skills it measures don't unfairly exclude candidates with disabilities, says Alexandra Givens, CEO of the Center for Democracy and Technology, an organization focused on civil rights in the digital age.
AI-based hiring tools often fail to include people with disabilities when generating their training data, she says. Such people have long been excluded from the workforce, so algorithms modeled on a company's previous hires won't reflect their potential.
Even if models could account for outliers, the way a disability presents itself varies widely from person to person. Two people with autism, for example, could have very different strengths and challenges.
“As we automate these systems, and employers push toward what's fastest and most efficient, they're losing the chance for people to actually show their qualifications and their ability to do the job,” Givens says. “And that's a huge loss.”
Federal regulators have struggled to keep pace with AI hiring tools. In December 2020, 11 senators wrote a letter to the U.S. Equal Employment Opportunity Commission expressing concern about the use of hiring technologies in the wake of the covid-19 pandemic. The letter asked about the agency's authority to investigate whether these tools discriminate, particularly against people with disabilities.
The EEOC responded with a letter in January that was leaked to MIT Technology Review. In the letter, the commission indicated that it cannot investigate AI hiring tools without a specific claim of discrimination. The letter also raised concerns about the industry's reluctance to share data and said that variation between different companies' software would prevent the EEOC from establishing any broader policies.
“I was surprised and disappointed when I saw the response,” says Roland Behm, a lawyer and advocate for people with behavioral health conditions. “The whole tenor of that letter seemed to make the EEOC look like more of a passive observer than an enforcement agency.”
The agency typically begins an investigation once an individual files a discrimination claim. With AI hiring technology, though, most candidates don't know why their application was rejected. “I believe the reason we haven't seen more enforcement action or private litigation in this area is because candidates don't know that they're being graded or assessed by these tools,” says Keith Sonderling, an EEOC commissioner.
Sonderling says he believes artificial intelligence will improve the hiring process, and he hopes the agency will issue guidance for employers on how best to implement it. He says he would welcome oversight from Congress.