To address concerns about the use of automated employment decision tools (AEDTs) in employment decisions, the New York City Council has taken decisive action and passed legislation that mandates bias audits of these tools. Local Law 144 was originally due to come into effect on 1st January 2023, but its enforcement date has been pushed back to 15th April 2023 due to the high volume of comments received during the public hearing on the Department of Consumer and Worker Protection’s (DCWP) proposed rules clarifying the requirements of the legislation. Following this first hearing, the DCWP has published an update to its proposed rules and will hold a second hearing.
In this blog post, we outline the 10 key things you need to know about this legislation and the proposed and updated rules.
A bias audit is an impartial evaluation of an automated employment decision tool carried out by an independent auditor, which must include (but is not limited to) testing for disparate impact against protected characteristics (race/ethnicity and sex/gender at a minimum). Employers using automated employment decision tools to assess candidates residing in New York City must provide a summary of this audit on their website and must inform candidates of the key features of the automated tool before using it.
The first version of the proposed rules specifies that bias should be determined using impact ratios based on the subgroup selection rate (the percentage of individuals in the subgroup who are hired), the subgroup average score, or both. The impact ratio is calculated by dividing a subgroup’s selection rate or average score by that of the subgroup with the highest rate or score:

Impact ratio = selection rate (or average score) of a subgroup ÷ selection rate (or average score) of the subgroup with the highest rate (or score)
However, the updated rules revise this calculation for AEDTs that produce a continuous score. Scores are first binarized into pass/fail depending on whether they fall above or below the median score of the sample, and the proportion of each subgroup scoring above the median, termed the scoring rate, is used in place of the selection rate:

Impact ratio = scoring rate of a subgroup ÷ scoring rate of the subgroup with the highest scoring rate
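To make the two calculations concrete, the sketch below shows how selection-rate and scoring-rate impact ratios could be computed on a small, hypothetical dataset using Python and pandas. The column names and figures are purely illustrative and do not reflect any prescribed audit methodology.

```python
# Illustrative sketch only: computing impact ratios on hypothetical data,
# following the selection-rate and scoring-rate calculations described above.
import pandas as pd

# Hypothetical audit data: category, whether the candidate was selected,
# and the continuous score produced by the AEDT
df = pd.DataFrame({
    "sex":      ["F", "F", "F", "F", "M", "M", "M", "M"],
    "selected": [0,   1,   0,   1,   1,   1,   0,   1],
    "score":    [55,  80,  62,  90,  85,  88,  70,  95],
})

# Selection rate: share of each category that is selected,
# divided by the highest rate to give the impact ratio
selection_rate = df.groupby("sex")["selected"].mean()
selection_impact_ratio = selection_rate / selection_rate.max()

# Scoring rate (updated rules): share of each category scoring above the
# sample median, again divided by the highest rate
median_score = df["score"].median()
scoring_rate = (df["score"] > median_score).groupby(df["sex"]).mean()
scoring_impact_ratio = scoring_rate / scoring_rate.max()

print(selection_impact_ratio)
print(scoring_impact_ratio)
```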
The updated rules also clarify that when historical data (real-life hiring data) is unavailable, the audit can be conducted using test data, provided that the reason for doing so is included in the summary of results. Where historical data is available and the AEDT is used by multiple entities, the data can be sourced from any of the employers or employment agencies that use the tool. However, a bias audit based on historical data from other entities may only be relied on if the employer or employment agency has provided its own historical data to the independent auditor or has never used the AEDT.
Further, the proposed rules specify that bias audits must be conducted in relation to the sex and race/ethnicity categories reported under EEO Component 1, including intersectional categories combining the two.
While the initial legislation did not specify who is considered an independent auditor, the first version of the proposed rules clarified that an independent auditor is a person or group that was not involved in using or developing the AEDT. However, amid concerns that this could lead to employers or employment agencies conducting internal audits of their tools using teams not involved in the use or development of the tool, the updated rules make it clear that audits should be conducted by a third party. Giving a more comprehensive definition, the second version of the rules states that an independent auditor should exercise objective and impartial judgement, and that an auditor is not considered independent if they were involved in using, developing, or distributing the AEDT; if they have an employment relationship with the employer or employment agency that uses the AEDT or with the vendor that developed or distributed it; or if they have a direct or material indirect financial interest in that employer, employment agency, or vendor.
An automated employment decision tool is a computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that produces a simplified output (a score, classification, or recommendation) used to aid or automate decision making for employment decisions (e.g., screening for employment or promotion).
The proposed rules clarify that machine learning, statistical modeling, data analytics, and artificial intelligence are a group of mathematical, computer-based techniques that generate a prediction of a candidate’s fit or likelihood of success, or a classification based on skills or aptitude. The inputs, predictor importance, and parameters of the model are identified by a computer to improve model accuracy or performance and are refined through cross-validation or by using a train/test split. The rules also clarify that a simplified output includes ranking systems.
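As a rough illustration of the kind of system this definition captures, the sketch below trains a simple model whose parameters are identified by a computer and evaluated using a train/test split and cross-validation, producing a score per candidate. The features, data, and model choice are hypothetical and serve only to illustrate the definition above.

```python
# Illustrative sketch only: a minimal example of a tool that would likely fall
# under the definition above. The data and feature names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split

# Hypothetical candidate features and past hiring outcomes
data = pd.DataFrame({
    "skills_test": [72, 85, 90, 60, 78, 95, 55, 88, 66, 81],
    "experience":  [3,  5,  7,  1,  4,  8,  2,  6,  2,  5],
    "hired":       [0,  1,  1,  0,  1,  1,  0,  1,  0,  1],
})

X, y = data[["skills_test", "experience"]], data["hired"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Model parameters are identified by the computer from the training data...
model = LogisticRegression().fit(X_train, y_train)

# ...and refined/validated via cross-validation and a held-out test set
cv_accuracy = cross_val_score(LogisticRegression(), X, y, cv=3)
test_accuracy = model.score(X_test, y_test)

# Simplified output: a score per candidate that could be used to rank or screen them
candidate_scores = model.predict_proba(X_test)[:, 1]
print(candidate_scores, test_accuracy, cv_accuracy.mean())
```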
Examples include video interviews, game-based or image-based assessments, and resume screening tools that are scored or evaluated by an algorithm. Systems that rank candidates on their suitability for a position, or on how well they meet certain criteria, are also considered automated employment decision tools.
Employers using an automated employment decision tool must provide a summary of the results of a current bias audit (less than one year old) on their website, or on the website of the employment agency, before using the tool.
The first version of the proposed rules clarifies that this summary should appear in the careers or jobs section of the website in a clear and conspicuous manner, and should include the date of the most recent bias audit of the AEDT, the distribution date of the AEDT to which the bias audit applies, and a summary of the results (including selection rates and impact ratios for all categories). The updated rules further clarify that the summary of results should include the source and an explanation of the data used for the bias audit.
At least ten working days before the tool is used, candidates must be informed that an automated employment decision tool is being used to assess them, allowing them to request an accommodation or alternative selection process. Candidates must also be informed of the characteristics the tool uses to make its judgments and of the source and type of data being used; if this information is not available on the website of the employer or employment agency, it must be provided within 30 days of a written request.
The first version of the proposed bias audit rules clarifies that the notice can be given by including it in a job posting or by sending it through U.S. mail or e-mail. For employees specifically, notice can also be given in a written policy or procedure provided to them, and for candidates, the notice can be included in the careers or jobs section of the employer’s or employment agency’s website.
The updated rules specify that the notice must also include instructions for how candidates or employees can request accommodations or alternative selection processes under other laws, if available. The updated rules also specify that employers and employment agencies must provide information about their AEDT data retention policy on the employment section of their website, along with information about the type and source of data collected by the AEDT. Instructions on how to request such information should also be posted, and responses should be issued within 30 days, including, where applicable, an explanation of why providing the information would violate local, state, or federal laws or interfere with a law enforcement investigation.
The legislation applies to employers using automated employment decision tools to evaluate candidates or employees residing in New York City for a position or promotion.
Penalties are up to $500 for a first violation and each additional violation occurring on the same day, while subsequent violations incur penalties of $500 to $1,500.
The subchapter should not be construed to limit the rights of any candidate or employee to bring a civil action in relation to an employment decision. Candidates’ civil rights are therefore not affected, and other relevant equal employment laws must still be followed by the employer.
The proposed rules clarify that nothing in the legislation requires employers to comply with requests for alternative procedures or accommodations, but these practices may be covered by other legislation (e.g., Americans with Disabilities Act; ADA).
Local Law 144 was originally due to come into effect on 1st January 2023, but its enforcement date has been pushed back to 15th April 2023. From then, the NYC Bias Audit law will make it unlawful for employers to use an automated employment decision tool without a bias audit to screen candidates or employees residing in New York City.
To find out more about how Holistic AI can help you prepare for this and other upcoming legislation, get in touch at we@holisticai.com
Last updated 3rd January 2023