
Game On – EEOC Settles First AI Hiring Bias Lawsuit

AI tools undoubtedly offer benefits in the recruitment and hiring process; however, using AI screening tools to make employment decisions carries associated risks. One significant risk is that an employer may unintentionally violate federal anti-discrimination laws if an AI tool disproportionately screens out individuals in protected classes and the employer is unable to justify the exclusion as sufficiently job-related and consistent with business necessity. The increasing popularity of AI tools in recruitment and hiring has caught the attention of the Equal Employment Opportunity Commission (EEOC), which has begun to aggressively target AI hiring bias, including through new Technical Assistance issued on May 18, 2023.[1]

Forecasting the importance of this issue, the EEOC Chair announced in May 2022 that the agency had filed its first such lawsuit against a company, iTutor Group, for hiring discrimination in violation of the Age Discrimination in Employment Act (ADEA).[2] The EEOC alleged that the company’s software program automatically rejected more than two hundred job applicants between the ages of 55 and 60 during a one-month period. Although the company denied intentional discrimination, it nevertheless settled the case on August 9, 2023, agreeing to pay $365,000 to the screened-out applicants, to invite each disqualified applicant to reapply, to adopt changes to its anti-discrimination policies, to provide training to all executives and managers to prevent running afoul of federal anti-discrimination laws in the future, and to be subject to EEOC monitoring for five (5) years.[3]

Why are this lawsuit and settlement so important? Within the last year, the EEOC has unequivocally demonstrated that it will seek to hold vendors and employers liable for violating federal nondiscrimination laws when programming decisions built into AI tools produce an adverse impact, whether or not the scrutinized feature was implemented with the intent to discriminate. The successful pursuit and settlement of its claims against iTutor likely means that the EEOC will be emboldened to pursue others in the near future. If a commonly cited SHRM survey (which reportedly states that 79% of employers use some form of AI in the recruitment and hiring process) is accurate, then it is only a matter of time before new suits are filed. And when those suits involve an AI tool that screens out a group of prospective workers (based on age, sex, national origin, disability, etc.), employers will not be facing a single claim by one disappointed applicant, but a very costly class action claim.

In this evolving area, it is incumbent on all employers to identify and understand what AI tools are being used, directly or indirectly through a third party, as part of the company’s recruitment and hiring process, and to determine what steps may need to be taken to minimize the risk of future litigation. The fact that an employer does not know an AI tool is being used, or that the program has a feature that unlawfully screens out a protected category of applicants or employees, is not a defense if the tool causes a statistically significant adverse impact. As a result, at a minimum, employers and human resource managers should make sure that all AI products used, and the supporting vendors, are carefully vetted to gain a sufficient level of comfort that the AI screening tools have been designed, reviewed, and tested to avoid unintended adverse impacts on protected categories of applicants and employees. If you have questions regarding these issues or need assistance in evaluating, correcting, or remediating issues involving the use of AI in the workplace, please contact a member of our Labor & Employment Practice Group.

[1] In January 2023, the EEOC identified combating AI hiring bias as a key aspect of its new Strategic Enforcement Plan, and in May 2023, the agency issued new Technical Assistance to give employers a clearer understanding of how it will approach this issue: Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964, https://www.eeoc.gov/select-issues-assessing-adverse-impact-software-algorithms-and-artificial-intelligence-used
[2] EEOC Sues iTutor Group for Age Discrimination (May 5, 2022), https://www.eeoc.gov/newsroom/eeoc-sues-itutorgroup-age-discrimination
[3] Joint Notice of Settlement
