Eightfold AI is at the center of a new lawsuit filed by job seekers who want greater transparency into how artificial intelligence evaluates job applicants.
Two women with STEM backgrounds have filed the lawsuit alleging that Eightfold’s AI unfairly screened them out of roles for which they were qualified at companies such as Microsoft and PayPal, according to a news release from law firm Outten & Golden. The plaintiffs argue that AI-powered hiring systems should be regulated under the Fair Credit Reporting Act of 1970, the same law that governs credit bureaus, because these AI tools collect personal data and significantly influence employment decisions.
Erin Kistler, one of the plaintiffs in the lawsuit, has decades of experience in the technology industry, The New York Times reports. Yet after applying to thousands of jobs in the past year, with several applications processed through Eightfold’s software system, only 0.3% resulted in a follow-up or interview, she shared.
“I think I deserve to know what’s being collected about me and shared with employers,” Kistler told the outlet. “And they’re not giving me any feedback, so I can’t address the issues.”
Per the lawsuit, Eightfold’s software assigns applicants a score based on their skills and what the employer is looking for, but applicants are given no explanation of the score, no account of what information was used, and no insight into how the tools generate rankings. They also have no opportunity to correct errors if the system gets something wrong, The Times notes.
New York-based Outten & Golden and Denver-based firm Towards Justice filed the lawsuit on the plaintiffs’ behalf with assistance from former attorneys at the Consumer Financial Protection Bureau and the Equal Employment Opportunity Commission (EEOC), according to the outlet.
Concerns Over AI In The Hiring Process
Former EEOC Chair Jenny Yang, one of the attorneys representing the plaintiffs, said the commission started taking a closer look at algorithmic hiring systems more than 10 years ago, per The Times.
“We realized they were fundamentally changing how people were hired,” Yang told the outlet. “People were getting rejected in the middle of the night, and nobody knew why.”
The proposed class-action lawsuit is one of the first to rely on credit reporting laws to protect applicants from “black box” employment decisions, in which candidates are left in the dark and receive no explanation for being screened out, The Times reports. The lawsuit seeks unspecified damages and a court order requiring Eightfold to comply with state and federal consumer reporting laws, per the outlet.
“Just because this company is using some fancy-sounding AI technology and is backed by venture capital doesn’t put it above the law. This isn’t the wild west,” David Seligman, executive director of Towards Justice, said in Outten & Golden’s news release. “AI systems like Eightfold’s are making life-altering decisions about who gets a job, who gets housing, who gets healthcare, and we’ve got a choice to make: Are we going to let them and their investors pull the wool over our eyes and hijack our marketplace? Or are we going to make sure they follow the laws on the books and provide the most basic things, like fairness, transparency, and accuracy? That’s what this case is about.”
Outten & Golden’s news release notes that, if successful, the lawsuit could reshape AI hiring by demanding accountability on behalf of millions of workers each year.
An Eightfold representative did not respond to The Times’ requests for comment, the outlet reports.