Updated Rules on NYC Local Law 144

New York City’s DCWP Updates their Proposed Rules for Local Law 144

December 28, 2022

New York City Local Law 144 is a landmark piece of legislation that mandates independent, impartial bias audits of automated employment decision tools (AEDTs) used to evaluate candidates for employment or employees for promotion in New York City. Initially due to go into effect on 1st January 2023, the enforcement date has been pushed back to 15th April 2023 in light of concerns raised during the public hearing on the Department of Consumer and Worker Protection’s (DCWP) proposed rules, particularly around who qualifies as an independent auditor and the suitability of the impact ratio metrics. In response, the DCWP has published an updated version of its rules and announced a second public hearing to be held on 21st January. In this blog post, we summarise the key changes in these updated rules.

Updated rules on NYC Local Law 144: Who qualifies as an independent auditor?

The first version of the proposed rules clarified that an independent auditor could not be involved in the use or development of the AEDT undergoing the audit. However, this definition created confusion about whether a tool could be audited internally by someone who was not involved in its development or design.

The updated rules clarify that an independent auditor is not someone who: i) was involved in using, developing, or designing the AEDT, ii) during the audit, had an employment relationship with the employer or employment agency that seeks to (continue to) use the AEDT or with a vendor that developed or distributes the AEDT, or iii) has a direct or material indirect financial interest in the employer or employment agency that seeks to (continue to) use the AEDT or in a vendor that developed or distributed the AEDT.

In short, the rules stipulate that bias audits cannot be conducted internally or by anyone connected to the employment agency, employer, or vendors and must be conducted by a third party.

Calculating Impact Ratios

While the original law did not specify the metrics that should be used to determine whether a system is biased, the first version of the proposed rules specified that bias should be determined using impact ratios based either on the selection rate or the average score, depending on whether the system produces a binary outcome or a continuous score:

[Figure: impact ratio formulas under NYC Local Law 144 — the selection rate for a category divided by the selection rate of the most selected category, or the average score of a category divided by the average score of the highest scoring category]

However, given that this impact ratio calculation is similar to the Equal Employment Opportunity Commission’s four-fifths rule, which was designed to calculate whether disparate or adverse impact is occurring based on selection rates, there were concerns about how suitable this metric is for continuous scores.
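For binary outcomes, the selection-rate impact ratio can be sketched in a few lines. The category names and counts below are hypothetical, purely for illustration:

```python
def impact_ratios(counts):
    """Impact ratio per category: each category's selection rate divided
    by the highest selection rate across categories (the same comparison
    that underpins the EEOC four-fifths rule)."""
    rates = {cat: selected / total for cat, (selected, total) in counts.items()}
    best = max(rates.values())
    return {cat: rate / best for cat, rate in rates.items()}

# Hypothetical data: (number selected, total applicants) per category
counts = {"group_a": (40, 100), "group_b": (24, 100)}
ratios = impact_ratios(counts)
# group_a: 0.40 / 0.40 = 1.0; group_b: 0.24 / 0.40 = 0.6
```

Under the four-fifths rule of thumb, a ratio below 0.8 (as for group_b here) would flag potential adverse impact.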

Accordingly, the updated rules propose a new formula for determining whether bias is occurring in systems that produce a score. To binarise scores and make them compatible with the disparate impact ratio calculation, candidates or employees are assigned a pass or fail depending on whether they score above or below the sample’s median score; the rate at which individuals in a category score above the median is termed the scoring rate. This is then used to calculate impact ratios using the following calculation:

[Figure: scoring rate impact ratio — the scoring rate for a category divided by the scoring rate of the highest scoring category, where the scoring rate is the rate at which individuals in a category score above the sample’s median score]

Converting the scores to a classification in this way is better suited to the impact ratio calculation and is something that is done in practice when hiring data is not available, for example during the development and validation of a tool.
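The median-binarisation step can be sketched as follows. Again, the category names and scores are hypothetical, and the median is taken over the full sample as the updated rules describe:

```python
from statistics import median

def scoring_rates(scores_by_category):
    """Scoring rate per category: the fraction of individuals in the
    category scoring above the median of the whole sample."""
    all_scores = [s for scores in scores_by_category.values() for s in scores]
    cutoff = median(all_scores)
    return {
        cat: sum(s > cutoff for s in scores) / len(scores)
        for cat, scores in scores_by_category.items()
    }

def impact_ratios_from_scores(scores_by_category):
    """Each category's scoring rate divided by the highest scoring rate."""
    rates = scoring_rates(scores_by_category)
    best = max(rates.values())
    return {cat: rate / best for cat, rate in rates.items()}

# Hypothetical continuous scores per category
scores = {
    "group_a": [55, 70, 82, 91],
    "group_b": [48, 60, 66, 88],
}
ratios = impact_ratios_from_scores(scores)
# Sample median is 68; group_a scores above it 3/4 of the time,
# group_b 1/4 of the time, so group_b's impact ratio is 1/3.
```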

In addition, the updated rules clarify that impact ratios should be calculated for sex (or gender), ethnicity, and for sex and ethnicity intersections at minimum. Where the AEDT classifies employees or candidates into groups, such as different leadership styles, these calculations should be performed for each group.
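The intersectional requirement means the scoring rate must be computed not only per sex and per ethnicity but also for each sex-and-ethnicity combination. A minimal sketch, using hypothetical records and field names:

```python
from collections import defaultdict
from statistics import median

def scoring_rate_report(records):
    """Scoring rates for sex, ethnicity, and sex x ethnicity intersections,
    using the sample-wide median score as the pass/fail cutoff."""
    cutoff = median(r["score"] for r in records)
    buckets = defaultdict(list)
    for r in records:
        above = r["score"] > cutoff
        buckets[("sex", r["sex"])].append(above)
        buckets[("ethnicity", r["ethnicity"])].append(above)
        buckets[("sex x ethnicity", (r["sex"], r["ethnicity"]))].append(above)
    return {key: sum(flags) / len(flags) for key, flags in buckets.items()}

# Hypothetical audit sample
records = [
    {"sex": "F", "ethnicity": "Hispanic", "score": 80},
    {"sex": "F", "ethnicity": "White", "score": 60},
    {"sex": "M", "ethnicity": "Hispanic", "score": 70},
    {"sex": "M", "ethnicity": "White", "score": 90},
]
report = scoring_rate_report(records)
```

A tiny sample like this illustrates why intersections matter: the two sexes and two ethnicities each have identical scoring rates here, yet the intersectional rates diverge sharply.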

The use of test data

Another addition to the new version of the proposed rules is that the audit can now be based on test data when historical data (data collected during the use of the AEDT) is not available. This is not only useful for employers or employment agencies that do not collect demographic data for privacy reasons, but it can also support testing for bias during the development and validation of a tool, meaning that subgroup differences can be compared even when the product has not yet been used in any hiring decisions.

Under the new rules, therefore, bias audits can be conducted by running panels that collect demographic information and ask participants to submit responses using the AEDT. In this case, the summary of results should explain why historical data could not be used and how the test data was collected.

Data retention policy

The first version of the proposed rules provided additional clarity on how to notify candidates and employees about the use of the tool and the characteristics it considers: written notice via mail or e-mail, notice in a job posting, notice on the employment section of the website (for candidates), or notice in the employee handbook (for employees).

The revised rules provide additional clarity about publishing the data retention policy. Whereas the original legislation stipulated that the employer or employment agency’s data retention policy should be available upon written request, the revised rules state that employers and employment agencies should post information on the AEDT data retention policy in a clear and conspicuous manner on the employment section of their website. They must also provide instructions on how to make a written request for this information and must respond within 30 days, or explain why providing the information would violate local, state, or federal law or interfere with a law enforcement investigation.

Getting started with a bias audit

The delayed enforcement date provides additional time to commission a bias audit of your AEDT, particularly if you were planning on doing an internal audit, so it’s not too late to get started.

At Holistic AI, we have audited over 100 projects and combine our expertise in policy, computer science, business psychology, and AI ethics to holistically understand a system and the context it is used in, examine the inputs and outputs of the system, and impartially audit a system using our auditing framework. To find out how we can help you, get in touch at we@holisticai.com.
