California’s second phase of CPRA rulemaking puts automated tools in the crosshairs

Businesses have until March 27, 2023, to respond to a series of questions from the California Privacy Protection Agency to inform its upcoming rulemaking on automated decision-making systems under the California Privacy Rights Act. The agency’s broad mandate means that rules coming out of this proceeding are likely to meaningfully affect how businesses use automated tools across their enterprise.

The California Privacy Rights Act empowered the California Privacy Protection Agency (CPPA) to issue regulations “governing access and opt-out rights with respect to businesses’ use of automated decision-making technology, including profiling,” even though the law did not expressly give consumers such rights. The CPPA previously indicated that it intended to create automated decision-making (ADM) regulations after completing its initial set of CPRA rulemaking. On February 14, 2023, the CPPA transmitted the initial set of CPRA regulations to the Office of Administrative Law for approval, setting the stage for the CPPA to invite the public to submit preliminary comments on a range of questions to inform its upcoming rulemaking on ADM systems, risk assessments, and cybersecurity audits.

The CPPA has invited interested parties to submit pre-rulemaking comments by March 27, 2023, at 5 PM PST. The public will also be able to provide additional comments on any proposed regulations when the CPPA proceeds with a notice of proposed rulemaking action.

Key Questions Under Consideration

Through this forthcoming ADM rulemaking, the CPPA will define the scope of the right to opt out of the use of ADM tools and to receive information about the decision-making process such as the logic involved and a description of likely outcomes with respect to the consumer. Below, we highlight some of the CPPA’s questions on ADM and provide some initial reactions.

How is “ADM technology” defined [under other laws, frameworks, and best practices]? Should the Agency adopt any of these definitions?

While the CPPA is considering aligning its rules with requirements in other jurisdictions, it could define covered systems more broadly than other laws to address what it perceives as gaps in existing frameworks. As a result, new rules here will not necessarily align with requirements under existing frameworks, such as the profiling opt-outs under other state privacy laws or laws regulating automated employment decision tools, like New York City’s Local Law 144.

For businesses that have already begun identifying the ADM tools they offer or use and evaluating their compliance obligations, a broad definition may prove costly, requiring them to reassess which tools are now in scope.

Should access and opt-out rights with respect to businesses’ use of ADM technology, including profiling, vary depending upon certain factors?

One potential answer here is yes, at a minimum for employees. Unlike other comprehensive state privacy laws, which exclude employee data, the CPPA’s regulations would apply to ADM tools in the employment context unless the CPPA expressly carves them out.

At the same time, California’s Civil Rights Department is also considering ADM rules, for which the California Civil Rights Council recently voted to start the rulemaking process. Businesses will be able to participate in a 45-day public comment period, which opens the rulemaking process once the Council’s Algorithm and Bias committee transmits an Initial Statement of Reasons and other required documentation to the Office of Administrative Law.

It will be imperative for employers that use ADM tools to monitor both proceedings and to encourage alignment between the potential regulations so that they can comply with both sets of obligations without undue burden.

What pieces and/or types of information should be included in responses to access requests that provide meaningful information about the logic involved in ADM processes and the description of the likely outcome of the process [...]?

Appropriately tailoring access rights for ADM tools will be a balancing act for the CPPA. The uncertainty of abstract or ambiguous rules may hinder innovation, particularly as new technologies such as generative AI are being commercialized, while overly prescriptive approaches may be similarly impracticable. Providing the CPPA with insights on how other regulators have approached this question and what is feasible in practice would help secure appropriately tailored regulations.

Next Steps

Given the breadth of the CPPA’s rulemaking authority and the possibility of highly prescriptive rules that do not align with existing standards, public participation at this stage will be crucial to shaping the initial set of regulations from the CPPA. In addition to ADM tools, the CPPA is also considering rules on risk assessments and cybersecurity audits that may be similarly impactful and may warrant public input. More information about how to submit comments can be found here.

Authored by Harsimar Dhanoa and Filippo Raso.

Brittney Griffin, a Senior Paralegal in our New York office, contributed to this entry.

