Ensuring EEOC Compliance for Our Algorithmic Recruiting Software

Updated by Chris Calmeyn

At Fetcher, our algorithmic recruiting software leverages AI and machine learning technology to match candidate profiles with open positions. However, we recognize that the use of algorithms also risks introducing unlawful discrimination.

We are committed to ensuring our software complies with EEOC anti-discrimination laws through the following measures:

  • Our algorithms are trained on high-quality, unbiased datasets that avoid using protected class information (such as race, gender, or age) or proxies for protected classes.
  • We closely monitor algorithmic outcomes to detect any unintended demographic skew in candidate matching (see the illustrative sketch after this list). If a skew is detected, the algorithms are re-trained to prevent discrimination.
  • Algorithmic matches are validated by trained employees before profiles are shared with customers to minimize bias risk.
  • Hiring decisions ultimately rest with the employer. Our software provides matches but employers make the final determination on candidates.
  • All client employers must comply with EEOC hiring regulations. Our software terms prohibit unlawful use of our tools for discriminatory purposes.
  • Our algorithms are continuously reviewed, validated, and enhanced to ensure they remain fair, transparent and focused strictly on job-relevant qualifications per EEOC standards.
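
To illustrate the outcome monitoring described above, the sketch below applies the EEOC's four-fifths (80%) rule to per-group match rates: a group whose selection rate falls below 80% of the highest-rate group is flagged for review. This is a minimal, hypothetical example; the function names and data layout are assumptions for illustration, not Fetcher's production code, and demographic attributes are used here only for auditing, never for matching.

```python
from collections import Counter

def selection_rates(candidates, matched, group_key):
    """Per-group selection rate: matched count divided by total count for each group."""
    totals = Counter(c[group_key] for c in candidates)
    selected = Counter(c[group_key] for c in matched)
    return {g: selected.get(g, 0) / n for g, n in totals.items() if n > 0}

def four_fifths_check(rates):
    """Return True per group if its rate is at least 80% of the highest group's rate
    (the EEOC rule-of-thumb threshold for adverse impact)."""
    if not rates:
        return {}
    top = max(rates.values())
    if top == 0:
        return {g: True for g in rates}  # no selections at all; nothing to compare
    return {g: (r / top) >= 0.8 for g, r in rates.items()}

# Example audit run with synthetic data:
candidates = [{"group": "A"}] * 100 + [{"group": "B"}] * 100
matched = [{"group": "A"}] * 30 + [{"group": "B"}] * 18
rates = selection_rates(candidates, matched, "group")
print(four_fifths_check(rates))  # {'A': True, 'B': False} -> group B flagged for review
```

A flagged group does not by itself establish discrimination, but it triggers the re-training and human review steps listed above.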

By making ethical design choices and proactively monitoring for algorithmic bias, we aim to improve recruitment diversity and effectiveness while remaining compliant. Our legal team stays current on EEOC guidelines and regularly audits our software to maintain compliance.
