Job Seekers Sue Eightfold AI Over Secret Scoring System Used by Microsoft and PayPal

A groundbreaking class-action lawsuit targets Eightfold AI, an AI-powered hiring platform used by Microsoft, PayPal, and Salesforce. Two job seekers claim the system violates the Fair Credit Reporting Act by generating hidden algorithmic assessments without applicant consent, potentially reshaping how AI hiring tools operate across the industry.

Job Seekers Challenge AI Hiring Practices in Landmark Case

Two job seekers have filed a class-action lawsuit against Eightfold AI, marking the first legal challenge in the United States to allege that an AI hiring vendor has violated the Fair Credit Reporting Act (FCRA).

The complaint, filed in California's Contra Costa County Superior Court, targets the Santa Clara-based company's AI-powered hiring platform, which assists major employers including Microsoft, PayPal, Salesforce, and Bayer in evaluating candidates.

Source: Decrypt

Plaintiffs Erin Kistler and Sruti Bhaumik claim that Eightfold's system generates secretive candidate reports and match scores without proper disclosure or consent, violating consumer protection laws enacted in the 1970s.

Kistler, a computer science graduate with 19 years of product management experience, told The New York Times she applied for thousands of jobs over the past year, with only 0.3 percent resulting in follow-ups or interviews.

"I think I deserve to know what's being collected about me and shared with employers," Kistler said, expressing frustration over the lack of feedback.

Source: Futurism

How Eightfold's AI Hiring System Works

Eightfold AI uses large-scale data analysis to match candidates with jobs more efficiently than human recruiters. According to the company, its AI models draw on more than 1 billion professional profiles, 1 million job titles, and an equally vast catalog of skills and industries.

The lawsuit alleges the platform's AI is trained on more than 1.5 billion global data points, collecting information from sources like LinkedIn, GitHub, and job boards.

Source: TechSpot

When employers evaluate applicants, the system assembles talent profiles that describe traits such as teamwork or introversion, assess educational background, and even forecast a candidate's likely future roles.

These algorithmic assessments are distilled into numerical match scores ranging from zero to five that predict the likelihood a candidate is a good fit for a given role.

Lower-ranked candidates are often discarded before a human being ever looks at their application, creating what critics call a "black box" where job seekers cannot see the process behind the decision.

Legal Arguments and FCRA Compliance

The lawsuit contends that automated ranking systems should be subject to the same disclosure and correction requirements as credit scoring agencies.

Backed by law firm Outten & Golden and nonprofit advocacy group Towards Justice, the complaint argues that because Eightfold's scores aggregate personal information and translate it into rankings used for employment purposes, the company should follow rules that apply to consumer reporting agencies.

Those rules include notifying applicants when such a report is being created, obtaining their consent, and giving them the chance to dispute any inaccurate information.

The lawsuit alleges that during the application process, neither plaintiff received a standalone disclosure that consumer reports would be generated about them, nor a summary of their consumer protection rights.

The plaintiffs seek actual and statutory damages between $100 and $1,000 per violation under federal law, plus up to $10,000 per violation under California law.

Company Response and Industry Context

Eightfold disputes the allegations. Company spokesperson Kurt Foeller stated that the platform "operates on data intentionally shared by candidates or provided by our customers. We do not scrape social media and the like."

He emphasized Eightfold's commitment to responsible AI, transparency, and compliance with applicable data protection and employment laws.

Yet the case highlights growing tension between technological innovation and decades-old privacy statutes. Roughly 88% of companies now use some form of AI for initial candidate screening, according to the World Economic Forum.

Legal observers note the challenge ahead. "These tools are designed to be biased - in the sense that they're looking for specific kinds of candidates," said David J. Walton, a Philadelphia attorney who advises companies on AI compliance. "The fine line is whether that bias veers into something unlawful."

Broader Implications for Job Applicant Screening

The Eightfold suit follows another high-profile challenge against Workday, an AI-powered hiring platform, in which a federal judge allowed claims alleging discrimination against older, disabled, and Black applicants to proceed.

Regulators have also weighed in. In 2024, the Consumer Financial Protection Bureau issued guidance indicating that algorithmic scores used for employment decisions are likely consumer reports under the FCRA, though that guidance was later rescinded under the Trump administration.

Eightfold's investors - including SoftBank's Vision Fund and General Catalyst - have wagered heavily on the platform's ability to transform corporate hiring pipelines. The company claims that one-third of the Fortune 500 uses its technology and has partnered with state labor departments in New York and Colorado to power public job-matching portals.

Such scale means the implications could reach far beyond a single company. A ruling that classifies AI hiring algorithms as consumer reporting systems could require tech vendors to open their processes to scrutiny, disclose ranking methods to applicants, and implement dispute mechanisms similar to those used by credit bureaus today.

"There is no AI exemption to our laws," said David Seligman, executive director of Towards Justice. "For decades, these statutes have protected people from opaque systems that decide their futures. That protection shouldn't disappear just because the system now has an algorithm."

Jenny R. Yang, a lawyer for the case and former chair of the U.S. Equal Employment Opportunity Commission, added that qualified workers are being denied job opportunities based on automated assessments they have never seen and cannot correct.

Attorneys on the case note that companies are adopting AI hiring tools faster than they are building the compliance, auditing, and governance structures needed to use them responsibly, creating real risk of inaccurate decisions and hidden discrimination.
