Department of Consumer and Worker Protection’s Public Hearing on the NYC Bias Audit Law (Local Law 144): Key Takeaways 

November 8, 2022

New York City’s Department of Consumer and Worker Protection (DCWP) held a public hearing on its proposed rules for the NYC Bias Audit Legislation (Local Law 144) on November 4, after the hearing was postponed from October 24 due to capacity issues.

Leading up to the hearing, comments, concerns, and queries about the rules could be submitted online or via email – see Holistic AI’s comments here. Although the department did not provide any answers or clarifications during the hearing, attendees were given the opportunity to testify. Many participants took the opportunity to read their written comments into the record.

Here are some key takeaways from the public hearing on the proposed rules for the NYC Bias Audit Legislation (Local Law 144):

Bias Audits Must Be Conducted by Third Parties

While the proposed rules state that a bias audit can be conducted by anyone who is not involved in the development or use of an automated employment decision tool (AEDT), the majority of attendees argued that this is not sufficient. Independent third parties, rather than employees of the company who are not directly involved in the design or deployment of the system, should conduct the audits.

Conflicts of interest can arise when a company audits its own system internally, even from a separate department or division, and internal audits can increase legal exposure. To ensure that the bias audit is robust, impartial, and compliant with the legislation, employers should partner with a third-party auditor such as Holistic AI.

Impact Ratios Are a Poor Metric for Small Sample Sizes

The Department’s proposed rules specify that bias should be determined using impact ratios, which compare the average scores or selection rates of different subgroups; the initial legislation was silent on the underlying bias metrics. The impact ratio is similar to the Equal Employment Opportunity Commission’s four-fifths rule, which states that the selection rate of one group should not be less than four-fifths (80%) of the selection rate of the group with the highest rate. However, it is widely acknowledged that this metric is not suitable for small sample sizes. Instead, other metrics, such as Cohen’s d or the two standard deviations rule, are more appropriate when sample sizes are small.

However, the proposed rules do not state that the four-fifths threshold must be used, nor do they indicate what the best approach is when sample sizes are small or when it is acceptable to use metrics other than impact ratios. During the hearing, many attendees shared this concern and called for clarification on the most appropriate metrics to use when dealing with small sample sizes.
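To make the contrast between these metrics concrete, here is a minimal sketch of the impact ratio (with the four-fifths flag) and the two standard deviations rule for a simple two-group selection scenario. The function names and the example counts are hypothetical, not taken from Local Law 144, the proposed rules, or any particular library.

```python
# Illustrative sketch only: names and numbers below are made up for the example.
import math

def impact_ratios(groups):
    """Selection rate of each group divided by the highest group's rate.

    `groups` maps a group label to (selected, total) counts.
    """
    rates = {g: sel / tot for g, (sel, tot) in groups.items()}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

def two_sd_z(sel_a, n_a, sel_b, n_b):
    """Z-statistic for the difference between two selection rates.

    The "two standard deviations" rule flags |z| > 2.
    """
    p_a, p_b = sel_a / n_a, sel_b / n_b
    pooled = (sel_a + sel_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: 40 of 100 men selected, 28 of 100 women selected.
groups = {"men": (40, 100), "women": (28, 100)}
ratios = impact_ratios(groups)
print(ratios["women"])                            # 0.28 / 0.40 ≈ 0.7
print([g for g, r in ratios.items() if r < 0.8])  # below the 4/5 line: ['women']

# The same data is NOT flagged by the two standard deviations rule
# (|z| ≈ 1.79 < 2), showing how the two metrics can disagree.
print(round(two_sd_z(40, 100, 28, 100), 2))
```

Note how the same counts fail the four-fifths check but pass the two standard deviations test; with smaller samples the impact ratio becomes even noisier, which is exactly the concern raised at the hearing.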

Our open-source library offers multiple alternative metrics for assessing bias, along with mitigation strategies for when bias is found.

Clarifications on Notice Requirements are Needed

The original legislation states that candidates or employees must be notified, at least 10 business days before an automated employment decision tool (AEDT) is used to evaluate them, of both the use of the tool and the characteristics the AEDT will consider.

DCWP’s proposed rules provide some clarity about how this notice can be provided:

  • Posting the notice in the job posting on the careers section of their website (for candidates only)
  • Including notice in a written policy or procedure (for employees only)
  • Including the notice in the job description
  • Providing written notice in person, via US mail, or by email

While this provides some clarification around the requirement to inform each candidate or employee individually, it was clear from the hearing that there are still questions about the best way to notify candidates or employees and the implications of the notice requirements for hiring timelines.

Further Guidance is Needed for Vendors of AEDTs

Under Local Law 144, employers are liable for complying with the legislation. However, many employers procure their automated employment decision tools from vendors rather than building them in-house. As a result, despite not being liable under the legislation themselves, vendors are often the ones commissioning bias audits on behalf of their clients.

During the hearing, there were calls for clarification on the requirements for vendors of AEDTs and on how vendors can best support their clients in complying with the legislation.

The NYC Bias Audits Legislation is Just the Start

During the hearing, participants expressed overwhelming support for the New York City bias audit legislation, particularly its enforcement of transparency; while employers have been conducting adverse impact analyses to test for bias for decades, the results of these analyses are rarely made public.

The legislation’s major contribution is giving candidates and employees the information they need to make an informed decision about the systems they interact with. Nevertheless, this legislation is just the start of things to come. Additional legislation is needed to ensure that developers and employers of automated systems are held accountable for their actions and that these systems are made as safe as possible.

Holistic AI can support you on your journey to compliance with this and other relevant legislation, helping you to identify risks to your enterprise, establish a risk management framework, and protect against future potential harms. Book a demo to find out more about how we can help you embrace your AI with confidence.
