
NYC Bias Audits Protected Characteristics

May 3, 2023
Authored by Airlie Hilliard, Senior Researcher at Holistic AI, and Lindsay Levine, Head of Customer Success at Holistic AI

The New York City Council took decisive action to mandate bias audits of automated employment decision tools (AEDTs) used to evaluate employees for promotion or candidates for employment in New York City, signaling that the risks of Artificial Intelligence (AI) are becoming an increasing regulatory concern. Local Law 144, also known as the NYC Bias Audit Law, is the first of its kind to codify independent, impartial bias audits in law.

As part of this regulation, employers (and employment agencies) are required to make a summary of the results of the bias audit publicly available on their website, increasing the transparency of these systems and allowing applicants to make more informed decisions about their interactions with them. Employers are also required to notify candidates and employees about the use of an AEDT, the characteristics that it will consider, and instructions on how to request accommodations or an alternative selection procedure. In this blog post, we outline the protected characteristics that must be analyzed for the bias audit and what to do if you do not have this data.

Key takeaways:

  • Local Law 144 requires that, at minimum, bias audits should examine outcomes based on the EEO Component 1 categories of sex and race/ethnicity.
  • The required sex categories are male/female, with an optional ‘Other’ category.
  • The required race/ethnicity categories are Hispanic or Latino, White, Black or African American, Native Hawaiian or Other Pacific Islander, Asian, American Indian or Alaska Native, and two or more races.
  • The delayed enforcement date, now 5 July 2023, provides an opportunity to collect the required data.
  • Those that do not have access to real-life data relating to the required categories are permitted to collect test data.

What Protected Characteristics are Covered by the Legislation?

The legislation states that bias audits should include, at a minimum, testing for disparate impact against Component 1 categories required to be reported by employers under subsection (c) of section 2000e-8 of title 42 of the United States Code, as specified in part 1602.7 of title 29 of the Code of Federal Regulations. Rules proposed by the Department of Consumer and Worker Protection (DCWP) further clarify this, specifying that the audits should, at minimum, cover each race/ethnicity and sex category that is required to be reported to the Equal Employment Opportunity Commission (EEOC). An updated version of the rules also specifies that intersectional analysis must be carried out in addition to standalone analysis.
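
Concretely, both the standalone and the intersectional analyses reduce to comparing selection rates across groups. The sketch below is an illustration of that idea rather than the DCWP's prescribed calculation, and all candidate records are made up; it computes an impact ratio for each group by dividing its selection rate by the rate of the most-selected group:

```python
from collections import defaultdict

# Hypothetical candidate records: (race/ethnicity, sex, selected?)
candidates = [
    ("White", "Female", True),
    ("White", "Male", True),
    ("Black or African American", "Female", False),
    ("Black or African American", "Male", True),
    ("Hispanic or Latino", "Female", False),
    ("Hispanic or Latino", "Male", False),
    ("Asian", "Female", True),
    ("Asian", "Male", True),
]

def selection_rates(records, key):
    """Selection rate (number selected / number assessed) for each group."""
    totals, selected = defaultdict(int), defaultdict(int)
    for record in records:
        group = key(record)
        totals[group] += 1
        selected[group] += record[2]
    return {group: selected[group] / totals[group] for group in totals}

def impact_ratios(rates):
    """Each group's selection rate relative to the most-selected group."""
    highest = max(rates.values())
    return {group: rate / highest for group, rate in rates.items()}

# Standalone analysis by sex...
by_sex = impact_ratios(selection_rates(candidates, key=lambda r: r[1]))
# ...and intersectional analysis by race/ethnicity x sex
by_intersection = impact_ratios(
    selection_rates(candidates, key=lambda r: (r[0], r[1]))
)
```

The same two helpers serve both analyses; only the grouping key changes, which is what makes the intersectional requirement a small extension of the standalone one rather than a separate methodology.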


Race/Ethnicity Categories

Under the DCWP’s updated rules, race/ethnicity data must be divided into seven categories: Hispanic or Latino, White, Black or African American, Native Hawaiian or Other Pacific Islander, Asian, American Indian or Alaska Native, and two or more races. The EEOC’s recently published Instruction Booklet on Component 1 Data Collection defines each of these categories:

  • Hispanic or Latino – This category includes individuals of Cuban, Mexican, Puerto Rican, South or Central American, or other Spanish culture or origin, regardless of their race.
  • White (Not Hispanic or Latino) – Includes people with origins in Europe, the Middle East, or North Africa.
  • Black or African American (Not Hispanic or Latino) – Individuals who have origins in any of the black racial groups of Africa.
  • Native Hawaiian or Other Pacific Islander (Not Hispanic or Latino) – Anyone with origins in Hawaii, Guam, Samoa, or other Pacific Islands.
  • Asian (Not Hispanic or Latino) – Individuals with origins in the Far East, Southeast Asia, or the Indian Subcontinent. This includes Cambodia, China, India, Japan, Korea, Malaysia, Pakistan, the Philippine Islands, Thailand, and Vietnam.
  • American Indian or Alaska Native – People with origins in any of the original peoples of North and South America (including Central America) who maintain tribal affiliation or community attachment.
  • Two or more races – Anyone who identifies with more than one race. This does not include those identifying as Hispanic or Latino.

Sex/Gender Categories

Under the DCWP’s proposed rules, sex data should be based on male/female classifications. However, the EEOC has recently expanded its classification of gender, adding two new categories: unspecified and another gender identity. This makes gender reporting more inclusive of those who are non-binary, for example, as individuals no longer have to restrict their self-reported gender to one of two categories. Thus, the sex/gender categories used for bias audits are male, female, and other.

What if You Don’t Have the Required Data?

Due to stringent data protection laws, such as the EU’s General Data Protection Regulation (GDPR) or France’s laws that prohibit employers from asking applicants for information related to protected attributes, some employers, employment agencies, and vendors may not have the data on protected characteristics required for the bias audit. The delayed enforcement date provides an opportunity to find a means of collecting this data.

For those that are unable to do so, the revised rules now permit bias audits to be conducted using test or synthetic data. Such data is typically collected by recruiting participants to be assessed by the AEDT and to provide their demographic information, with online panel sites offering a rapid way to recruit them. If test data is used for the bias audit, however, the summary of results must explain why historical, or real-life, data was not used and how the test data was collected.
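
For teams taking this route, one practical preparatory step is to enumerate a cohort of test profiles covering every required category before recruiting participants to fill them, so that no group is missing from the audit. The sketch below is purely illustrative; the function name and record structure are our own, not anything prescribed by the rules:

```python
import itertools

# Race/ethnicity and sex categories required under the DCWP's rules
RACE_ETHNICITY = [
    "Hispanic or Latino", "White", "Black or African American",
    "Native Hawaiian or Other Pacific Islander", "Asian",
    "American Indian or Alaska Native", "Two or more races",
]
SEX = ["Male", "Female", "Other"]

def balanced_test_cohort(n_per_group):
    """Enumerate a balanced cohort covering every race/ethnicity x sex
    intersection; each profile would later be matched to a recruited
    participant and assessed by the AEDT under audit."""
    return [
        {"race_ethnicity": race, "sex": sex}
        for race, sex in itertools.product(RACE_ETHNICITY, SEX)
        for _ in range(n_per_group)
    ]

cohort = balanced_test_cohort(5)  # 7 x 3 x 5 = 105 test profiles
```

A balanced design like this also avoids the small-group problem that intersectional analysis otherwise runs into, since every race/ethnicity x sex cell starts with the same number of participants.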

Enforcement Date Delayed

Originally due to come into effect on 1 January 2023, the enforcement date for this legislation was pushed back to 15 April 2023 due to the large number of public comments that were received during the first public hearing on the Department of Consumer and Worker Protection’s (DCWP) proposed rules, particularly concerning the metrics that should be used to determine bias. Consequently, the DCWP revised the proposed rules and held a second hearing on 23 January 2023. Following this hearing, the final rules were adopted in April and the enforcement date has been postponed to 5 July 2023.

This gives employers that have already had their system audited time to collect additional data and carry out a further audit if they have made changes in light of the results of their initial audit, although the first audit will still be valid (for one year after the audit date) when the law comes into effect. For those who have not yet taken steps to procure an audit, the delayed enforcement date provides the opportunity to collect the data necessary to conduct one.

Get Started on Your Bias Audit Journey

The delayed enforcement date and updates to the rules signal that the DCWP is taking public concerns about the implementation of this law seriously and that it will be vigilant when the law does come into effect. Taking steps to prepare early is important for ensuring that you have the necessary data to conduct a bias audit.

Unsure of where to start? Schedule a free consult with one of our experts.

Written by Airlie Hilliard, Senior Researcher at Holistic AI and Lindsay Levine, Head of Customer Success at Holistic AI


DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.
