AI Auditing and Assurance
Beamery's commitment to responsible AI is upheld by Warden AI's assurance platform, which performs continuous AI audits. This live dashboard provides regular updates, ensuring that Beamery's AI systems remain fair, transparent, and compliant.
Bias Audit Report
Overview
Summary
Warden AI is engaged by Beamery to perform ongoing bias audits of Beamery’s AI system. This bias audit report has been created by Warden AI’s auditing platform and reviewed by the Warden AI team.
The report covers a subset of the overall audit that relates to the requirements of NYC Local Law 144. The methods used meet the specific requirements for conducting a bias audit of automated employment decision tools (AEDTs) as published in the final rules of the NYC Department of Consumer and Worker Protection (DCWP).
A Disparate Impact Analysis was conducted to identify adverse impact on protected groups, separated by sex and race/ethnicity, as mandated by Local Law 144. Because historical data was not available, the audit was performed on Warden's independent data set of real candidate profiles.
This bias audit is meant for demonstration purposes and does not indicate the bias audit results of Beamery’s tools for any particular employer or job opportunity.
System overview
Beamery’s AI Talent Match is an AI system that predicts the degree of match between a job candidate and a vacancy.
This system is part of the Skills AI feature set and appears in a number of use cases in the platform:

- AI Suggested Contacts for Vacancies
- Suggested Vacancies for Candidates (Talent Portal Match Score)
- AI Vacancy Calibration Insights (Beamery Insights)
- Applicant Scoring
- Talent Portals: Match Scores for Candidates
- Match Score explainability

Inputs:

- Candidate profile
- Vacancy profile

Output:

- Match score (0 to 1)
Audit results
The results are presented in three tables, one for each demographic category (sex, race/ethnicity, and intersectional sex × race/ethnicity). In each table, the reference group (the group with the highest selection rate, against which all other groups are compared when calculating the impact ratio) is identified.
Sex bias
| Sex | Candidates | Selected | Selection rate | Impact ratio |
| --- | --- | --- | --- | --- |
| Female (reference) | 2,828 | 1,424 | 50.4% | 1.00 |
| Male | 2,632 | 1,306 | 49.6% | 0.99 |
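For illustration, the selection rates and impact ratios above can be reproduced from the raw counts. This is a minimal sketch, not Warden AI's implementation; the counts come from the sex table, and the helper function name is ours:

```python
def impact_ratio(group_rate: float, reference_rate: float) -> float:
    """Impact ratio = a group's selection rate divided by the
    reference group's selection rate (the highest rate)."""
    return group_rate / reference_rate

# Counts from the sex table: selected / total candidates
female_rate = 1424 / 2828  # reference group (highest selection rate)
male_rate = 1306 / 2632

print(round(100 * female_rate, 1))                     # 50.4
print(round(100 * male_rate, 1))                       # 49.6
print(round(impact_ratio(male_rate, female_rate), 2))  # 0.99
```

An impact ratio of 0.99 means males were selected at 99% of the rate of the female reference group.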
Race/Ethnicity bias
| Race/Ethnicity | Candidates | Selected | Selection rate | Impact ratio |
| --- | --- | --- | --- | --- |
| Asian | 1,312 | 653 | 49.8% | 0.97 |
| Black or African American (reference) | 1,284 | 662 | 51.6% | 1.00 |
| Hispanic or Latino | 1,324 | 667 | 50.4% | 0.98 |
| White | 1,540 | 748 | 48.6% | 0.94 |
Intersectional bias (Sex × Race/Ethnicity)

| Race/Ethnicity | Sex | Candidates | Selected | Selection rate | Impact ratio |
| --- | --- | --- | --- | --- | --- |
| Asian | Female | 616 | 309 | 50.2% | 0.92 |
| Asian | Male | 696 | 344 | 49.4% | 0.91 |
| Black or African American | Female | 752 | 385 | 51.2% | 0.94 |
| Black or African American | Male | 532 | 277 | 52.1% | 0.96 |
| Hispanic or Latino | Female (reference) | 680 | 370 | 54.4% | 1.00 |
| Hispanic or Latino | Male | 644 | 297 | 46.1% | 0.85 |
| White | Female | 780 | 360 | 46.2% | 0.85 |
| White | Male | 760 | 388 | 51.1% | 0.94 |
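The same computation generalizes to the intersectional case: find the group with the highest selection rate, then divide every group's rate by it. The sketch below uses the counts from the intersectional table; the variable names are illustrative, not part of the audit methodology:

```python
# (total candidates, number selected) per intersectional group,
# copied from the table above
counts = {
    ("Asian", "Female"): (616, 309),
    ("Asian", "Male"): (696, 344),
    ("Black or African American", "Female"): (752, 385),
    ("Black or African American", "Male"): (532, 277),
    ("Hispanic or Latino", "Female"): (680, 370),
    ("Hispanic or Latino", "Male"): (644, 297),
    ("White", "Female"): (780, 360),
    ("White", "Male"): (760, 388),
}

# Selection rate per group, then impact ratio against the reference group
rates = {group: selected / total for group, (total, selected) in counts.items()}
reference = max(rates, key=rates.get)  # group with the highest selection rate
impact_ratios = {group: round(rate / rates[reference], 2)
                 for group, rate in rates.items()}

print(reference)                                        # ('Hispanic or Latino', 'Female')
print(impact_ratios[("Hispanic or Latino", "Male")])    # 0.85
```

Rerunning this over the table's counts reproduces the reported ratios, with Hispanic or Latino females emerging as the reference group.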