Outsourcing Hiring Won’t Outsource Risk: Implications for Employers Using AI in Hiring

Sep 20, 2025 | Legal News

Artificial intelligence (“AI”) is reshaping the way companies recruit and hire talent. However, a July 2024 decision from the Northern District of California reminds employers that outsourcing hiring to an AI-driven vendor does not absolve them of legal responsibility. The case, Mobley v. Workday, raises significant questions about how federal antidiscrimination laws apply when employers rely on third-party platforms to screen job applicants.

In this putative class action, the plaintiffs allege that Workday’s AI-powered hiring platform systematically disadvantaged job seekers over the age of 40 in violation of the Age Discrimination in Employment Act (“ADEA”). According to the complaint, Workday’s recommendation system allegedly scored, sorted, and screened out older applicants, embedding biases from training data and employer preferences. The plaintiffs claim they submitted hundreds of applications but were consistently rejected, sometimes within minutes, despite meeting minimum qualifications. 

The Court dismissed the theory that Workday operated as an “employment agency,” concluding that it does not directly procure employees for employers. However, the Court accepted the plaintiffs’ alternative “agent” theory of liability, finding that Workday may plausibly act as an agent of the employers who rely on its tools. By performing traditional hiring functions, such as rejecting candidates at the screening stage or recommending who should advance, Workday may be deemed to act on behalf of the employers who use its platform.

Although Workday is the named defendant, this ruling raises an even more important question:

What about the liability of employers who use Workday’s tools?

More than 11,000 employers currently use Workday, and over 1.1 billion applications were rejected through its systems during the relevant period. As the case moves into discovery, the employers who relied on Workday’s platform may themselves be identified and face potential claims. This ruling signals that courts are willing to scrutinize how AI systems affect protected groups and to hold employers accountable for discriminatory impacts, even when the decisions are made by a third-party vendor.

Employers cannot assume that contracting with a vendor shifts legal responsibility. Federal law makes clear that employers are ultimately accountable for compliance with the ADEA, Title VII, and other antidiscrimination statutes. Even if vendors face liability, employers remain on the hook for hiring decisions that result in unlawful bias. This means organizations need to take proactive steps to evaluate the risks that come with outsourcing critical HR functions. 

Practical Takeaways for Employers: 

  1. Vendor Accountability
    Outsourcing does not eliminate responsibility. Employers must carefully review agreements with AI and recruiting vendors, paying close attention to indemnity, compliance, and audit provisions. Contracts should reflect clear expectations about nondiscrimination compliance and vendor transparency.
  2. Audit Hiring Tools
    Employers should demand information about how AI systems score, rank, and filter applicants. Understanding the data inputs, training processes, and decision-making criteria is essential. Employers may want to work with legal counsel or outside experts to evaluate whether tools could have a disparate impact on protected groups.
  3. Maintain Human Oversight
    AI can assist in screening, but it should not replace meaningful human review. Employers should ensure that a qualified recruiter or hiring manager has the final say, particularly when AI tools exclude candidates who otherwise meet minimum qualifications.
  4. Monitor Outcomes
    Employers should track the results of hiring processes that involve AI to identify patterns of exclusion. Regular reviews of demographic data can help organizations spot potential risks before they become claims; a simple illustration of such a review appears after this list.
  5. Train HR Teams
    Human resources staff and managers need training on the legal implications of using AI. They should know how to question vendor practices, evaluate flagged risks, and document compliance efforts.
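To make the monitoring described in item 4 concrete, the sketch below shows one way to compute group selection rates and the “four-fifths” adverse-impact ratio from the EEOC’s Uniform Guidelines. It is a minimal Python illustration: the age groups, outcome data, and flagging logic are assumptions for demonstration, not a prescribed methodology or anything drawn from the Workday litigation.

    # Minimal sketch: selection rates and the EEOC "four-fifths"
    # adverse-impact ratio, using hypothetical hiring-outcome data.
    # Group labels and numbers are illustrative assumptions.
    from collections import Counter

    # Hypothetical per-applicant outcomes: (age_group, hired).
    outcomes = [
        ("under_40", True), ("under_40", True), ("under_40", False),
        ("under_40", True), ("40_and_over", False), ("40_and_over", True),
        ("40_and_over", False), ("40_and_over", False),
    ]

    applied = Counter(group for group, _ in outcomes)
    hired = Counter(group for group, was_hired in outcomes if was_hired)

    # Selection rate = hires / applicants, computed per group.
    rates = {group: hired[group] / applied[group] for group in applied}
    for group, rate in rates.items():
        print(f"{group}: {rate:.0%} selection rate")

    # Four-fifths rule of thumb: a group's selection rate below 80%
    # of the highest group's rate can indicate adverse impact.
    highest = max(rates.values())
    for group, rate in rates.items():
        ratio = rate / highest
        flag = "review" if ratio < 0.8 else "ok"
        print(f"{group}: impact ratio {ratio:.2f} ({flag})")

The four-fifths rule is a screening heuristic, not a legal test; courts and regulators also consider statistical significance and the surrounding facts, so a flagged ratio is a prompt to investigate with counsel, not a conclusion.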

The Mobley v. Workday case is still in the early stages, and its ultimate outcome remains uncertain. But the case highlights a growing trend: courts, regulators, and plaintiffs are paying close attention to how AI-driven hiring affects protected classes. Employers who rely on third-party platforms cannot afford to take a hands-off approach. 

By auditing AI tools, insisting on transparency from vendors, and maintaining meaningful human oversight, employers can reduce their legal exposure while still leveraging technology to improve hiring efficiency. At the end of the day, employers—not vendors—will bear the risk if AI systems discriminate. 

At Hoyer Law Group, we help employers stay ahead of evolving compliance risks in hiring and employment practices. Our attorneys advise on vendor agreements, conduct audits of AI-driven systems, and train HR teams on lawful hiring practices. If your organization uses AI or third-party platforms in recruitment, now is the time to evaluate your exposure and update your policies. 

Contact us at Hoyer Law Group to learn how we can help you adopt innovative hiring tools without sacrificing compliance or increasing your risk of discrimination claims. 

 
