The Biden administration and Department of Justice have warned employers using AI software for recruitment purposes to take extra steps to support disabled job applicants or they risk violating the Americans with Disabilities Act (ADA).
Under the ADA, employers must provide reasonable accommodations to all qualified disabled job seekers so they can take part fairly in the application process. But the growing rollout of machine-learning algorithms in companies' hiring processes creates new ways for candidates with disabilities to be disadvantaged.
The Equal Employment Opportunity Commission (EEOC) and the DoJ published a new document this week providing technical guidance to ensure companies don't violate the ADA when using AI technology for recruitment purposes.
"New technologies should not become new ways to discriminate. If employers are aware of the ways AI and other technologies can discriminate against persons with disabilities, they can take steps to prevent it," said EEOC chair Charlotte Burrows.
"As a nation, we can come together to create workplaces where all employees are treated fairly. This new technical assistance document will help ensure that persons with disabilities are included in the employment opportunities of the future."
Companies using automated natural-language-processing tools to screen resumes, for example, may reject candidates who have gaps in their employment history. Disabled people may have had to take time off work for health reasons, so they risk being automatically turned down early in the hiring process despite being well qualified.
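To see how such a screening rule can quietly exclude qualified applicants, here is a minimal sketch of a naive gap filter. The function name, data layout, and six-month threshold are all illustrative assumptions, not anything from the EEOC/DoJ guidance or a real vendor's tool:

```python
from datetime import date

# Hypothetical example: a naive automated screening rule that rejects any
# candidate whose employment history contains a gap longer than six months.
# The threshold and structure are illustrative, not from any real product.

def has_long_gap(jobs, max_gap_days=180):
    """jobs: list of (start_date, end_date) tuples, sorted by start date."""
    for (_, prev_end), (next_start, _) in zip(jobs, jobs[1:]):
        if (next_start - prev_end).days > max_gap_days:
            return True
    return False

# A well-qualified candidate who took a year off for medical treatment:
history = [
    (date(2015, 1, 1), date(2019, 6, 30)),
    (date(2020, 7, 1), date(2022, 5, 1)),  # gap taken for health reasons
]
print(has_long_gap(history))  # → True: the rule screens them out automatically
```

The rule never sees *why* the gap exists, which is precisely the kind of disability-correlated trait the guidance asks employers to probe their vendors about.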
There are other ways that AI can discriminate against those with disabilities. Software analyzing a candidate's gaze, facial expressions, or tone of voice is not appropriate for those who have speech impediments, are blind, or are paralyzed. Employers need to take extra precautions when using AI in their hiring decisions, the document advised.
Companies should ask software vendors providing the tools if they built them with disabled people in mind. "Did the vendor attempt to determine whether use of the algorithm disadvantages individuals with disabilities? For example, did the vendor determine whether any of the traits or characteristics that are measured by the tool are correlated with certain disabilities?" it said.
Employers should consider how best to support disabled individuals, such as by explaining how their algorithms assess candidates, or giving them more time to complete tests.
If algorithms are used to rank candidates, employers could consider adjusting scores for those with disabilities. "If the average results for one demographic group are less favorable than those of another (for example, if the average results for individuals of a particular race are less favorable than the average results for individuals of a different race), the tool may be modified to reduce or eliminate the difference," according to the document.
"Algorithmic tools should not stand as a barrier for people with disabilities seeking access to jobs," Kristen Clarke, Assistant Attorney General for the Justice Department's Civil Rights Division, concluded. "This guidance will help the public understand how an employer's use of such tools may violate the Americans with Disabilities Act, so that people with disabilities know their rights and employers can take action to avoid discrimination." ®