6 Sources
[1]
Lawsuit targets AI hiring systems used by Microsoft and Salesforce
In context: In California, a lawsuit against one of the most widely used AI hiring systems could reshape how companies deploy algorithms to evaluate candidates. Eightfold AI, a Santa Clara-based platform that assists employers such as Microsoft, PayPal, Salesforce, and Bayer in ranking applicants, is accused of generating undisclosed reports on job seekers without their consent - a practice the plaintiffs claim violates federal consumer protection law. The proposed class action, filed in Contra Costa County Superior Court, is the first in the US to allege that an AI hiring vendor has violated the Fair Credit Reporting Act (FCRA). Enacted in 1970 to protect consumers from inaccurate or opaque credit data, the law also covers any organization that creates reports about a person's characteristics for employment purposes. The key question is whether an algorithm that scores resumes and predicts future job titles falls under that definition. The complaint, backed by law firm Outten & Golden and the nonprofit advocacy group Towards Justice, contends that automated ranking systems should be subject to the same disclosure and correction requirements as credit scoring agencies. Eightfold uses large-scale data analysis to match candidates with jobs more efficiently than human recruiters. According to the company, its AI models draw on more than a billion professional profiles, a million job titles, and an equally vast catalog of skills and industries. When employers evaluate applicants, the system assembles "talent profiles" that describe traits such as teamwork or introversion, assess educational background, and even forecast likely future roles. The results are distilled into numerical or categorical scores that influence whether a candidate proceeds to human review.
Critics argue that such automated assessments act as algorithmic gatekeepers, leaving applicants without access to their detailed rankings or the underlying data. Eightfold disputes the allegations. Company spokesperson Kurt Foeller told Reuters that its systems rely solely on information provided directly by candidates or shared by employer customers. "We do not scrape social media and the like," Foeller said. "We are deeply committed to responsible AI, transparency, and compliance with applicable data protection and employment laws." Still, legal observers say the case highlights a growing tension between technological innovation and decades-old privacy statutes. "These tools are designed to be biased - in the sense that they're looking for specific kinds of candidates," said David J. Walton, a Philadelphia attorney who advises companies on AI compliance but is not involved in the case. "The fine line is whether that bias veers into something unlawful," he told The New York Times. The Eightfold suit follows another high-profile challenge against Workday, an AI hiring platform, where a federal judge allowed claims to proceed last year alleging discrimination against older, disabled, and black applicants. That San Francisco case has intensified legal debates over whether algorithmic models inadvertently encode prohibited bias or merely streamline recruitment. Regulators have also weighed in. In 2024, the Consumer Financial Protection Bureau issued guidance indicating that algorithmic scores used for employment decisions are likely consumer reports under the FCRA. Although that guidance was later rescinded under the Trump administration, it signaled how federal agencies might interpret the law in future enforcement actions. Eightfold's investors - including SoftBank's Vision Fund and General Catalyst - have wagered heavily on the platform's ability to transform corporate hiring pipelines. 
The company claims that one-third of the Fortune 500 uses its technology and has partnered with state labor departments in New York and Colorado to power public job-matching portals. Such scale means the implications of the California suit could reach far beyond a single company. A ruling that classifies AI hiring algorithms as consumer reporting systems could require tech vendors to open their processes to scrutiny, disclose ranking methods to applicants, and implement dispute mechanisms similar to those used by credit bureaus today. For now, the plaintiffs' legal team - which includes former officials from the Consumer Financial Protection Bureau and the Equal Employment Opportunity Commission - views the case as the start of a broader accountability push. "There is no AI exemption to our laws," said David Seligman, executive director of Towards Justice. "For decades, these statutes have protected people from opaque systems that decide their futures. That protection shouldn't disappear just because the system now has an algorithm."
[2]
Job Seekers Want to Know What the Hell Is Going on With AI-Based Hiring Decisions: Lawsuit
Eightfold, an AI company that makes human resources software, is being sued over one of its hiring tools. Applying for a job already sucks on its own. But increasingly, job seekers are left wondering whether an AI system is screening them out before a human ever sees their application. A new lawsuit hopes to change that by forcing more transparency into how AI hiring tools work. The case argues that automated applicant "scores" should be legally treated like credit checks and be subject to the same consumer protection laws. The proposed class action was filed on Wednesday in California state court by two women working in STEM who say AI hiring screeners have filtered them out of jobs they were qualified for. "I've applied to hundreds of jobs, but it feels like an unseen force is stopping me from being fairly considered," said Erin Kistler, one of the plaintiffs, in a press release. "It's disheartening, and I know I'm not alone in feeling this way." And she's right about not being the only person feeling this way at a time when more companies are relying on AI for hiring. Roughly 88% of companies now use some form of AI for initial candidate screening, according to the World Economic Forum. The lawsuit specifically targets Eightfold, an AI human resources company that sells tools designed to help employers manage recruiting and hiring. Among its offerings is a tool that generates a numerical score predicting the likelihood that a candidate is a good match for a given role. That scoring system sits at the center of the case. Eightfold's "match score" is generated using information pulled from a variety of sources, including job postings, an employer's desired skills, applications, and, in some cases, LinkedIn. The model then provides a score ranging from zero to five that "helps predict the degree of match between a candidate and a job position."
The lawsuit argues that this process effectively produces a "consumer report" under the Fair Credit Reporting Act (FCRA), a federal law passed in 1970 to regulate credit bureaus and background check companies. Because the score aggregates personal information and translates it into a ranking used to determine eligibility for "employment purposes," the lawsuit claims Eightfold should be required to follow the same rules that apply to credit reporting agencies. Those rules include notifying applicants when such a report is being created, obtaining their consent, and giving them the chance to dispute any inaccurate information. "Eightfold believes the allegations are without merit. Eightfold's platform operates on data intentionally shared by candidates or provided by our customers," an Eightfold spokesperson told Gizmodo in an emailed statement. "We do not scrape social media and the like. We are deeply committed to responsible AI, transparency, and compliance with applicable data protection and employment laws." Still, the lawsuit is seeking a court order requiring Eightfold to comply with state and federal consumer reporting laws as well as financial damages. "Qualified workers across the country are being denied job opportunities based on automated assessments they have never seen and cannot correct," said Jenny R. Yang, a lawyer for the case and former chair of the U.S. Equal Employment Opportunity Commission. "These are the very real harms Congress sought to prevent when it enacted the FCRA. As hiring tools evolve, AI companies like Eightfold must comply with these common-sense legal safeguards meant to protect everyday Americans."
[3]
Job Seekers Sue Company Scanning Their Résumés Using AI
"I think I deserve to know what's being collected about me and shared with employers." Thanks to scores of competing AI systems clogging up online application portals, applying for a new job in 2026 can feel more like applying for a bank loan than seeking a job. At least, that's what a group of disgruntled job seekers is claiming in a lawsuit against an AI screening company called Eightfold AI. According to the New York Times, the plaintiffs allege that Eightfold's employment screening software should be subject to the Fair Credit Reporting Act -- the regulations protecting information collected by consumer credit bureaus. The reason, they say, can be found deep within Eightfold's AI algorithm, which actively trawls LinkedIn to create a data set of "1 million job titles, 1 million skills, and the profiles of more than 1 billion people working in every job, profession, industry, and geography." That data set, in turn, is used in marketing material to help sell its services to potential clients. Using an AI model trained on that data, plaintiffs say, Eightfold scores job applications on a scale of one to five, based on their skills, experience, and the hiring manager's goals. In sum, their argument is that it's not at all unlike the opaque rules used to govern consumer credit scores. In the case of Eightfold, however, applicants have no way of knowing what their final score even is, let alone the steps the system took to come up with it. That creates a "black box": a situation where the people subjected to an algorithmic decision can only see the system's outcome, not the process that led to it. And if Eightfold's AI starts making things up on the fly -- an issue AI models are infamous for -- the job seeker has no way of knowing. There's also the issue of data retention. With no way to take a peek under the hood, there's no telling how much data from job applicants' résumés Eightfold collects, or what the AI company and its clients are doing with it.
"I think I deserve to know what's being collected about me and shared with employers," Erin Kistler, one of the plaintiffs, told the NYT. "And they're not giving me any feedback, so I can't address the issues." Kistler, who has decades of experience working in computer science, told the publication she's kept close track of every application she's sent over the last year. Out of "thousands of jobs" she's applied for, only 0.3 percent moved on to a follow-up or interview, she said. It all underscores the sad state of the job market, which has become the stuff of dystopian nightmares thanks to AI hiring tools. Whether the lawsuit can gain enough momentum to challenge the massive legal grey area of AI hiring remains to be seen. If it does, it could bring relief to throngs of despondent job seekers whose careers quite literally hang in the balance. Eightfold AI didn't respond to the NYT's request for comment.
[4]
AI Hiring Firm Eightfold Sued Over Alleged Secret Scoring Of Job Applicants - Decrypt
The filing says the platform's AI is trained on more than 1.5 billion data points, without letting applicants review or correct errors. Two job seekers filed a class-action lawsuit Tuesday against AI hiring platform Eightfold, alleging that the company uses hidden artificial intelligence to secretly score applicants without their knowledge or consent, thereby violating consumer protection laws enacted in the 1970s. The complaint, filed in California's Contra Costa County Superior Court, alleges that Eightfold violated the Fair Credit Reporting Act and California's Investigative Consumer Reporting Agencies Act by assembling consumer reports on job applicants without providing required disclosures or dispute rights. Plaintiffs Erin Kistler and Sruti Bhaumik claim Eightfold's platform collects sensitive personal data, including social media profiles, location data, internet activity, and tracking cookies, from public sources like LinkedIn, GitHub, and job boards to evaluate candidates applying to companies including Microsoft, PayPal, Starbucks, and Morgan Stanley. The plaintiffs seek actual and statutory damages between $100 and $1,000 per violation under federal law, plus up to $10,000 per violation under California law, along with punitive damages and injunctive relief requiring Eightfold to change its practices. The lawsuit alleges that Eightfold's AI uses "more than 1.5 billion global data points" to generate "Match Scores" that rank applicants from 0 to 5 based on their "likelihood of success," with lower-ranked candidates often "discarded before a human being ever looks at their application." Kistler, a computer science graduate with 19 years of product management experience, applied for senior PayPal roles via Eightfold in December without landing an interview, while Bhaumik, a project manager with degrees from Bryn Mawr and the University of Pittsburgh, was automatically rejected from a Microsoft role two days after applying.
The lawsuit claims that nearly two-thirds of large companies now use AI technology like Eightfold's to screen candidates, while 38% deploy AI software to match and rank applicants. "This case is about a dystopian AI-driven marketplace, where robots operating behind the scenes are making decisions about the most important things in our lives: whether we get a job or housing or healthcare," David Seligman, Executive Director at Towards Justice and one of the attorneys representing the plaintiffs, tweeted. "There is no AI exemption to the law -- no matter how fancy-sounding the tech or how much venture capital is behind it," he noted. The complaint alleges that Eightfold's proprietary Large Language Model incorporates data on "more than 1 million job titles, 1 million skills, and the profiles of more than 1 billion people working in every job, profession, [and] industry," plus "inferences drawn" to create profiles reflecting applicants' "preferences, characteristics, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes." During the application process, neither plaintiff received a standalone disclosure that consumer reports would be generated, nor did they receive summaries of their consumer protection rights or information about Eightfold's role as a consumer reporting agency, the lawsuit alleges. Decrypt has reached out to Eightfold for comment and will update this article should they respond.
[5]
AI Recruitment Platform Eightfold Sued for Screening Job Applicants Without Consent | AIM
The lawsuit marks the first case in the US to accuse an AI recruitment firm of breaching the Fair Credit Reporting Act. Eightfold AI, an AI recruitment platform based in the US and used by companies such as Microsoft and PayPal, as well as various Fortune 500 firms, is being sued in California for reportedly compiling applicant screening reports without their consent. The lawsuit, filed on January 20, marks the first case in the US to accuse an AI recruitment firm of breaching the Fair Credit Reporting Act, according to the legal firms that initiated the suit. It also highlights how consumer advocates are seeking to enforce existing laws on AI systems that can infer information about individuals through extensive data analysis. "In order to protect against the harms of such reports, the FCRA requires consumer reporting agencies like Eightfold to make certain disclosures, obtain certain certifications, and ensure that consumers (here, job applicants) have a mechanism to review and correct reports that are provided to prospective employers for purposes of determining eligibility for employment," the suit said. The startup offers tools to speed up hiring by assessing job applicants and predicting their fit for positions using data from online resumes and job listings. "There is no AI-exemption to these laws, which, for decades, have been an essential tool in protecting job applicants from abuses by third parties, like background check companies, that profit by collecting information about and evaluating job applicants," they said in the lawsuit. However, individuals seeking employment at firms that use these technologies are not informed or given an opportunity to contest inaccuracies, as alleged by Erin Kistler and Sruti Bhaumik in their proposed class-action lawsuit. As a result, they assert that Eightfold breached the FCRA and a California statute that grants consumers the right to access and dispute credit reports utilised in hiring and lending. 
According to Eightfold representative Kurt Foeller, the platform operates on data provided by candidates or clients, as reported by Reuters. "We do not scrape social media and the like. We are deeply committed to responsible AI, transparency, and compliance with applicable data protection and employment laws," Foeller said. According to the lawsuit, Eightfold generates consumer reports for potential employers using its Evaluation Tools. The tools evaluate job candidates not only as individuals, claiming to pinpoint their likely skills, experiences, and traits, but also in relation to one another, ranking applicants on a scale from 0 to 5 based on the findings, conclusions, and assumptions derived from Eightfold's proprietary AI regarding their "likelihood of success." Eightfold creates talent profiles of job seekers that include personality descriptions such as 'team player' and 'introvert', ranks their 'quality of education', and predicts their future titles and companies, according to the lawsuit. "Employers use these reports to sift through applications, typically only reviewing highly ranked candidates. Lower-ranked candidates are often discarded before a human being ever looks at their application," the lawsuit said. Kistler and Bhaumik filed a lawsuit in California state court on behalf of all job applicants in the US who were assessed using the company's tools. The proposed class is represented by the labour law firm Outten & Golden and the nonprofit advocacy organisation Towards Justice. Kistler sought positions at various companies that use Eightfold, including PayPal, while Bhaumik pursued opportunities at firms like Microsoft, as stated in the complaint. Both individuals have degrees in science or technology and over a decade of experience. They were not selected for employment, and each believes that Eightfold's tools contributed to this outcome. Microsoft and PayPal are not named as defendants in the lawsuit.
[6]
AI company Eightfold sued for helping companies secretly score job seekers
Jan 21 (Reuters) - Eightfold AI, a venture capital-backed artificial intelligence hiring platform used by Microsoft, PayPal and many other Fortune 500 companies, is being sued in California for allegedly compiling reports used to screen job applicants without their knowledge. The lawsuit filed on Tuesday accusing Eightfold of violating the Fair Credit Reporting Act shows how consumer advocates are seeking to apply existing law to AI systems capable of drawing inferences about individuals based on vast amounts of data. Santa Clara, California-based Eightfold provides tools that promise to speed up the hiring process by assessing job applicants and predicting whether they would be a good fit for a job using massive amounts of data from online resumes and job listings. But candidates who apply for jobs at companies that use those tools are not given notice and a chance to dispute errors, job applicants Erin Kistler and Sruti Bhaumik allege in their proposed class action. Because of that, they claim Eightfold violated the FCRA and a California law that gives consumers the right to view and challenge credit reports used in lending and hiring. "There is no AI-exemption to these laws, which have for decades been an essential tool in protecting job applicants from abuses by third parties--like background check companies--that profit by collecting information about and evaluating job applicants," they said in the lawsuit. A spokesperson for Eightfold did not immediately reply to a request for comment. Eightfold is backed by venture capital firms including SoftBank Vision Fund and General Catalyst. Kistler and Bhaumik sued in California state court on behalf of all U.S. job seekers who applied for jobs and were evaluated using the company's tools. Labor law firm Outten & Golden and nonprofit advocacy group Towards Justice represent the proposed class. 
Eightfold creates talent profiles of job seekers that include personality descriptions such as "team player" and "introvert," rank their "quality of education," and predict their future titles and companies, according to the lawsuit. Kistler applied to roles at several companies that use Eightfold, including PayPal, and Bhaumik applied to companies including Microsoft, according to the complaint. Both hold science or tech degrees and have more than 10 years of experience. Neither was hired, and both believe that Eightfold's tools played a role. Microsoft and PayPal are not defendants in the lawsuit. A Microsoft spokesperson declined to comment. A spokesperson for PayPal did not immediately reply to a request for comment. One-third of Eightfold customers are Fortune 500 companies, including Salesforce and Bayer, according to the company's website. The New York State Department of Labor and Colorado Department of Labor and Employment also offer Eightfold-powered platforms for job seekers. (Reporting by Jody Godoy in New York; Editing by Matthew Lewis)
A class-action lawsuit filed in California accuses Eightfold AI of using hidden algorithms to score job applicants without their knowledge or consent. The case marks the first time an AI hiring platform has been challenged under the Fair Credit Reporting Act, raising questions about whether automated recruitment systems should face the same transparency requirements as credit bureaus.
Eightfold AI, a Santa Clara-based AI-powered hiring platform used by major corporations including Microsoft, PayPal, Salesforce, and Bayer, is now defending itself against a groundbreaking class-action lawsuit filed in California's Contra Costa County Superior Court [1]. The case represents the first time in the United States that an AI recruitment platform has been accused of violating the Fair Credit Reporting Act, a federal consumer protection law enacted in 1970 to shield individuals from opaque credit reporting practices [2]. Plaintiffs Erin Kistler and Sruti Bhaumik, both experienced professionals in STEM fields, allege that Eightfold's AI hiring system secretly scores job applicants without providing required disclosures or obtaining consent, effectively creating consumer reports that determine employment eligibility [4].
Source: Decrypt
At the center of the controversy lies Eightfold's proprietary scoring mechanism, which screens job applicants using what the company describes as more than 1.5 billion global data points [4]. The AI recruitment platform draws on over 1 million job titles, 1 million skills, and profiles of more than 1 billion professionals across industries and geographies [3]. When employers evaluate candidates, the system generates talent profiles that include personality assessments such as "team player" or "introvert," ranks educational quality, and even predicts future job titles [5]. These algorithmic assessments are distilled into match scores ranging from zero to five, which predict the likelihood of success for each candidate [2]. Lower-ranked applicants are often filtered out before human recruiters ever review their applications, creating what critics describe as an algorithmic gatekeeper that operates in complete opacity [1].
Source: TechSpot
The lawsuit's legal strategy hinges on whether Eightfold's automated rankings constitute consumer reports under the FCRA [1]. The Fair Credit Reporting Act requires organizations that create reports about personal characteristics for employment purposes to provide transparency, obtain consent, and implement dispute mechanisms similar to those of credit bureaus [5]. The complaint, backed by labor law firm Outten & Golden and nonprofit advocacy group Towards Justice, argues that job seekers deserve the same protections against inaccurate or hidden data that consumers receive when credit agencies evaluate them [1]. "There is no AI exemption to our laws," said David Seligman, executive director of Towards Justice. "For decades, these statutes have protected people from opaque systems that decide their futures" [1]. The plaintiffs seek damages between $100 and $1,000 per violation under federal law, plus up to $10,000 per violation under California's Investigative Consumer Reporting Agencies Act [4].
The plaintiffs' experiences illustrate the frustration many job seekers face when navigating AI hiring systems. Kistler, a computer science graduate with 19 years of product management experience, applied for senior roles at PayPal through Eightfold in December without securing an interview [4]. She tracked thousands of applications over the past year, finding that only 0.3 percent resulted in follow-ups or interviews [3]. "I think I deserve to know what's being collected about me and shared with employers," Kistler told The New York Times. "And they're not giving me any feedback, so I can't address the issues" [3]. Bhaumik, a project manager with degrees from Bryn Mawr and the University of Pittsburgh, was automatically rejected from a Microsoft position just two days after applying [4]. Their stories reflect a broader trend, as roughly 88% of companies now use some form of AI for initial candidate screening [2].
Source: Futurism
Eightfold disputes the allegations through spokesperson Kurt Foeller, who stated that the platform operates solely on data intentionally shared by candidates or provided by employer customers [1]. "We do not scrape social media and the like. We are deeply committed to responsible AI, transparency, and compliance with applicable data protection and employment laws," Foeller said [5]. However, the lawsuit claims the system collects sensitive personal information including social media profiles, location data, internet activity, and tracking cookies from public sources like LinkedIn and GitHub [4]. The case follows another high-profile challenge against Workday, where a federal judge allowed discrimination claims to proceed last year [1]. With Eightfold claiming that one-third of Fortune 500 companies use its technology, and with backing from investors including SoftBank's Vision Fund, a ruling classifying AI hiring algorithms as consumer reporting systems could require tech vendors across the industry to open their processes to scrutiny and provide background check-level transparency [1]. Jenny R. Yang, former chair of the U.S. Equal Employment Opportunity Commission and an attorney for the plaintiffs, emphasized the stakes: "Qualified workers across the country are being denied job opportunities based on automated assessments they have never seen and cannot correct" [2].
Summarized by Navi