‘AI personally makes me pretty nervous,’ employment lawyer says of hiring algorithms

Recent notices from the EEOC about using AI and algorithms in hiring point to the complexity of avoiding discriminatory conduct.

A sobering message from Washington highlights the risk of violating the Americans with Disabilities Act when employers use artificial intelligence programs to help with employment decisions.

Recent notices from the Equal Employment Opportunity Commission and the Department of Justice about the potential for disability discrimination when employers use AI and algorithms to evaluate job applicants point to the complexity of avoiding discriminatory conduct, some employment lawyers have said.

“I’m generally a relatively conservative employment adviser, and AI personally makes me pretty nervous,” said Lee Moylan, an employment lawyer at Klehr Harrison Harvey Branzburg in Philadelphia.

The EEOC and DOJ notices make clear that employers using programs based on AI or algorithms when screening applicants should inform applicants about the hiring process and emphasize the right to reasonable accommodations for an applicant whose disability impacts their ability to take part in the process.

But because the ADA covers such a broad swath of issues, and a prospective employer might not know that a job applicant has a disability, avoiding discriminatory conduct when using an AI program is difficult, Moylan said.

“I get nervous when employers are using AI in the hiring process. You may not realize all the things you need to be thinking about. Yes, it is very dangerous,” Moylan said.

‘Can’t demonstrate that through this test’

It’s no secret that AI hiring tools made to help employers sort through applicants have had a discrimination problem—such software has long been criticized for culling women and nonwhite applicants from the applicant pool. Some products carry a “bias tested” label, but that typically means only that the software has been tested for race and gender bias, said Lauren Daming, a labor and employment lawyer whose practice focuses on privacy at Greensfelder, Hemker & Dale in St. Louis.

Guarding against disability discrimination when using AI hiring tools is much more difficult because disability encompasses such a wide variety of conditions, Daming said. For example, a chatbot might ask an applicant whether he or she can stand for 30 minutes, and a negative response might eliminate that applicant from consideration. Or employers might discriminate against people with post-traumatic stress disorder or other mental illness if applicants are asked about their ability to focus on a task, whether they are optimistic, or whether they wake up every day excited for the day ahead, Daming added.

“There are a lot of job attributes that you can measure through these job tests, but they’re not necessarily true to a real work environment where lots of people with disabilities would be able to do the essential functions of the job, but they can’t demonstrate that through this test,” said Daming.

The EEOC said that without proper safeguards, workers with disabilities might be screened out of consideration for a job or promotion even if they can do the job without a reasonable accommodation. The EEOC also said that if an AI tool requires applicants to provide information about disabilities or medical conditions, it may give rise to prohibited disability-related inquiries.

The DOJ said in a separate announcement that employers should screen hiring technologies before use, and regularly while in use, to determine whether they screen out persons with disabilities who can perform the job with or without reasonable accommodations. For example, if an employer uses facial and voice analysis technologies to evaluate applicants’ skills and abilities, people with disabilities such as autism or speech impairments may be screened out, even if they are qualified for the job.

Some employers use video games to measure an applicant’s abilities or personality traits, but an applicant who is blind might not be able to play the game and would be rejected, the EEOC said. Yet that applicant still might be able to perform the essential functions of the job, the agency added.

‘Be critical about the vendors you’re using’

The EEOC guidance makes it clear that an employer using AI software made by a third party can’t blame bias on the vendor, Daming said.

“The EEOC has really said it’s the employer’s responsibility to vet that stuff before you use it,” Daming said. “This is just a good reminder that you need to be critical about the vendors you’re using, the types of software and the algorithms. You need to know what they’re actually doing—don’t just sign up for some techy vendor because it promises lots of benefits.”

“I think another issue is companies aren’t really thinking through why they want to use AI or algorithms. They really need to pay attention to the purpose of what the software is doing, what they’re trained to find in job candidates and how they think they can identify those attributes, and then making sure the systems they are using to evaluate people are really identifying those attributes,” Daming said.

Moylan said employers intending to use AI or algorithms in the recruitment process have to do their own due diligence to ensure the program they are using is not inadvertently excluding people with disabilities.

“Employers can make sure they clearly articulate to applicants and employees, make it very clear and prominent that you can request a reasonable accommodation if you need one. Make sure you have a process in place for handling such requests. Also, be transparent on what criteria you are looking for [in a job applicant] and when you are articulating that, make sure what you are looking for are truly essential functions of the position,” Moylan said.

“The message isn’t ‘don’t do it.’ You have to be very careful when you’re rolling it out. If my client asked me about this, I would want to see the method being used, I would want to know about the company [providing the software]. I would want to think of every angle. That’s what lawyers do—we think of the worst-case scenario. I’d definitely want to be involved in the process,” Moylan said.
