New York City employers who use artificial intelligence (AI), data analytics, or statistical modeling in the hiring or promotion process will be required to notify candidates in advance and conduct an annual “bias audit.”
Passed on November 10, 2021, this new law is one of the most significant measures yet to address concerns from civil rights groups that machine learning can discriminate against women and minorities. The law takes effect on January 1, 2023, with fines of $500 for first offenses and up to $1,500 for subsequent offenses.
Broad scope
While the new law might be expected to target algorithmic decision-making specifically, its language appears to cover a much wider range of employment testing. The law applies to “automated employment decision tools,” defined as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence” that generates a “simplified output, including a score, classification, or recommendation” and substantially assists or supersedes discretionary employment decisions.1
Even common online employment assessments, which predate AI technology, could be swept up in the broad definition of “automated employment decision tools.” For example, under Title VII of the Civil Rights Act, an employment test that has a disparate impact on members of a protected group must be job-related and consistent with business necessity. Job-relatedness is usually established through a validation study, and most validation studies rely on some form of “statistical modeling” to demonstrate a correlation between the assessment and the knowledge, skills, abilities, and behavioral characteristics needed to perform the job successfully. The same is true for justifying the scoring, weighting, and other uses of an assessment in the selection process. As such, the vast majority of properly validated employment tests use a “computational process” that has been “derived from” either “statistical modeling” or “data analytics” and produces a “simplified output,” such as a final score or a pass/fail flag. Similarly, any objectively scored test can be described as superseding “discretionary decision-making.” Finally, although the law includes some exceptions, they are limited to tools that have no material impact on employment decisions, such as “a junk email filter, firewall, antivirus software, calculator, spreadsheet, database, data set, or other compilation of data.”2
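To illustrate how easily ordinary test validation falls within the statutory language: a criterion-related validation study often amounts to computing a correlation between assessment scores and later job performance. The following sketch is purely illustrative, with hypothetical data and variable names that come from neither the law nor any particular vendor:

    # Illustrative only: hypothetical data for a criterion-related validation study.
    # A correlation like this is a "computational process derived from statistical
    # modeling" that yields a "simplified output" (a validity coefficient and,
    # downstream, candidate scores).
    from statistics import correlation  # available in Python 3.10+

    assessment_scores = [72, 85, 90, 64, 78, 88, 70, 95]            # candidates' test scores
    performance_ratings = [3.1, 4.0, 4.2, 2.8, 3.5, 4.1, 3.0, 4.6]  # later job ratings

    r = correlation(assessment_scores, performance_ratings)
    print(f"Validity coefficient (Pearson r): {r:.2f}")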
Notification requirement
New York City employers and employment agencies that use “automated employment decision tools” will have to meet strict notice requirements. Specifically, all candidates residing in the City who will be screened by these tools must be notified, at least 10 business days in advance, of (i) the fact that an automated employment decision tool will be involved in the assessment of their application; (ii) the job qualifications and characteristics that the tool will assess; and (iii) the candidate’s right to request an alternative selection procedure or accommodation, which the law does not further specify.3
The notice requirements will create challenges for employers who use many of the AI sourcing and selection tools on the market today. In most cases, the vendors selling these tools claim to rate candidates on job-related factors but refuse to provide details because their algorithms are proprietary. In fact, the vendors themselves may not know which characteristics and qualifications are being selected for, because some algorithms continually change or become “smarter,” incorporating successful recruiting or hiring outcomes into the algorithm so that it prefers candidates who share certain traits with those previously selected.
Annual bias audits
The new law also requires a “bias audit” at least once a year, defined as an “impartial evaluation” conducted by an “independent auditor,” which includes, at a minimum, an analysis of whether the automated employment decision tool has had a disparate impact based on gender, race, or national origin.4 The law does not specify who qualifies as an “independent auditor,” but the term likely would not include an internal expert or the vendor that created the assessment. Potentially most problematic for employers, the “bias audit” must be published on the employer’s website, along with “the distribution date of the tool to which such audit applies,” before the employer can use the tool. This means employers will need to run the assessment, either with actual candidates or incumbents, for development purposes only, to gather the data needed to test for disparate impact and, hopefully, satisfy the bias audit requirement.
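The law does not prescribe a methodology for the disparate impact analysis, but such analyses conventionally compare selection rates across demographic groups, with the EEOC’s four-fifths (80%) rule as a common benchmark. A minimal sketch, using hypothetical group names and counts (none of these figures come from the law or this article):

    # Illustrative four-fifths (80%) rule check; all figures are hypothetical.
    selected = {"group_a": 48, "group_b": 24}    # candidates advanced by the tool
    screened = {"group_a": 100, "group_b": 80}   # candidates screened by the tool

    rates = {g: selected[g] / screened[g] for g in screened}
    highest = max(rates.values())

    for group, rate in rates.items():
        ratio = rate / highest
        flag = "potential adverse impact" if ratio < 0.8 else "ok"
        print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} ({flag})")
    # group_a: 48% (ratio 1.00, ok); group_b: 30% (ratio 0.63, flagged)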
Takeaways
The New York City law is the latest and most significant effort by regulators to curb bias when AI is used to make employment decisions. Earlier in 2021, the Equal Employment Opportunity Commission (EEOC) launched an initiative to study AI tools used in hiring decisions, highlighting concerns about bias and discrimination. Illinois passed its own AI employment law, which gives job applicants the right to know whether AI is being used in a video interview and the ability to have their video data deleted, while Maryland passed a law requiring candidate consent for the use of facial recognition technology. Washington, D.C. has also announced a bill that would regulate algorithmic decision-making, with annual audits similar to those required by the New York City law.
The broad scope of this law leaves many questions open, such as whether long-standing computer-based assessments derived from traditional test validation strategies are covered by the law, or whether passive assessment tools, such as the recommendation engines used by employment agencies, could fall within its scope.
In the absence of regulatory guidance, employers who wish to screen New York City residents for employment or promotion using computerized assessments will need to take the necessary steps by January 2023 to ensure compliance. The fines, $500 for first offenses and $1,500 for repeat offenses, accrue as separate violations for each day the automated employment decision tool is used.5 And, while the law does not include a private right of action, it also does not prevent a candidate from bringing a private action under other federal, state, or local laws, such as traditional anti-discrimination statutes.6
1 Id. at 1.
2 Id. at 1.
3 Id. at 2.
4 The protected categories are those that employers are required to report under 42 USC §2000e-8(c), as specified in 29 CFR §1602.7.
5 Id. at 3.
6 Id. at 3-4.