On July 24, 2025, the California Privacy Protection Agency (CPPA) approved final regulations (the Rule) under the California Consumer Privacy Act (CCPA) governing Automated Decision-Making Technology (ADMT), including artificial intelligence (AI) and other automated tools. The regulations also adopt a cybersecurity audit rule imposing ‘reasonable’ cybersecurity practices on businesses, which will be addressed in Part Two. The Rule introduces new consumer rights and compliance obligations for businesses using technology that makes or substantially influences decisions about consumers. While the Rule is not yet fully finalized (it must still undergo procedural review by the California Office of Administrative Law under the California Administrative Procedure Act), it’s not too early to consider how these impending legal obligations will affect a business’s product roadmaps and technical design choices for products and services incorporating ADMT features.
Key Highlights of the Rule
Focuses on the use of ADMT in health care, employment opportunities, credit and lending services, housing and education;
Offers a broad definition of ADMT that, in certain cases, captures technologies not typically considered AI (e.g., a spreadsheet or database);
Provides for new consumer rights, including a pre-use notice before a business uses ADMT, a right to opt out of ADMT and a right to appeal significant decisions, i.e., decisions through which a person receives or is denied financial or lending services, housing, educational opportunities, employment or contracting opportunities or compensation, or health care services; and
Extends compliance obligations including risk assessments, vendor contracting requirements and record keeping.
Broad Scope: What Is ADMT?
Rather than defining AI technology, the CPPA focused on whether a technology replaces or substantially influences human decision-making as the trigger for bringing it within the Rule’s scope, a scope far broader than AI alone. Specifically, ADMT includes any technology that processes personal information and uses computation to replace, or substantially replace, human decision-making. This includes AI and machine learning (ML) technologies such as predictive models, generative AI and facial recognition, as well as basic rule-based technologies such as credit scoring systems when they influence important outcomes in finance and lending without real human review. Technologies that are not typically considered ADMT, such as data storage, calculators, databases and spreadsheets, can also qualify if they replace human decision-making. The CPPA’s earlier draft included an example of using a spreadsheet to run an analysis to decide who gets promoted; that example did not appear in the final text, but it illustrates how broad the definition is intended to be. In short, if a technology replaces or influences human decision-making in granting or denying benefits or opportunities in health care, financial or lending services, housing, education or employment, it probably counts as ADMT.
When the Rule Applies
The Rule covers decisions that materially impact consumers’ lives, such as determining access to health care, employment opportunities, credit and lending services, housing or education. It also applies to profiling activities that produce legal or similarly significant effects, including behavioral profiling. Additionally, businesses using consumer data to train or improve ADMT fall within scope, even if the technology is still in development and not yet deployed.
New Consumer Rights
Consumer rights take center stage in the Rule. Businesses must provide consumers with a pre-use notice, in plain language, that among other things explains the specific purpose of the ADMT use and its potential impact on the consumer. Consumers must also be notified of their right to opt out of profiling and most other ADMT uses, and of their right to appeal adverse significant decisions through a process that includes meaningful human review. These requirements are often operationally complex, requiring businesses to integrate transparency, rights management and governance controls into their workflows and into the design, development and deployment of ADMT.
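For product and engineering teams, one way to picture the design impact is as a gate in front of any automated significant decision: check the consumer’s recorded opt-out before invoking the ADMT path, and route appeals to a human reviewer. The minimal Python sketch below is purely illustrative; every name in it (Consumer, opted_out_of_admt, make_significant_decision, human_review) is hypothetical and not drawn from the Rule’s text.

from dataclasses import dataclass

# Illustrative sketch only: the class, function and field names below are
# hypothetical and are not taken from the Rule.

@dataclass
class Consumer:
    consumer_id: str
    opted_out_of_admt: bool   # recorded via the opt-out mechanism the Rule contemplates

@dataclass
class Decision:
    outcome: str
    decided_by: str           # "admt" or "human"

def admt_model_score(features: dict) -> str:
    # Stand-in for the business's automated model.
    return "approved" if features.get("score", 0.0) >= 0.5 else "denied"

def human_review(features: dict) -> str:
    # Stand-in for routing to a human reviewer with authority to change the outcome.
    return "pending_human_review"

def make_significant_decision(consumer: Consumer, features: dict) -> Decision:
    # Honor an opt-out before any ADMT path runs.
    if consumer.opted_out_of_admt:
        return Decision(outcome=human_review(features), decided_by="human")
    return Decision(outcome=admt_model_score(features), decided_by="admt")

def appeal(consumer: Consumer, features: dict) -> Decision:
    # Appeals of adverse significant decisions go to human review.
    return Decision(outcome=human_review(features), decided_by="human")

A gate of this kind also generates the artifacts (notices shown, opt-outs honored, appeals resolved) that the recordkeeping obligations discussed below will require businesses to retain.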
Extended Compliance Obligations
Companies must complete prescribed risk assessments not only for deployed ADMT that makes significant decisions, but also when processing personal information to train ADMT for significant decisions or for training facial-recognition, emotion-recognition or other identity verification or profiling technologies. This obligation applies even if the business only intends to use the technology in the future, allows others to use it or markets its use. Businesses must also ensure that service providers and contractors using ADMT comply with the Rule. This includes updating vendor contracts to require cooperation in providing information, supporting consumer rights such as opt-out and appeal processes and preventing unauthorized uses of ADMT. Finally, the Rule imposes recordkeeping requirements, including retaining copies of risk assessments, notices and documentation of consumer requests and responses.
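As a rough illustration of the recordkeeping piece, a business might keep a single log that ties each retained artifact (risk assessment, pre-use notice, consumer request and response) to the ADMT use case it concerns. The Python sketch below is an assumption about one workable shape for such a log; the record types mirror the retention items described above, but the schema and names are our own.

from dataclasses import dataclass, field
from datetime import date

# Hypothetical recordkeeping structures; the schema is illustrative only.

@dataclass
class RetainedRecord:
    record_type: str      # e.g., "risk_assessment", "pre_use_notice", "consumer_request"
    admt_use_case: str    # the ADMT use the record relates to
    created_on: date
    document_ref: str     # pointer to the stored copy in the document system

@dataclass
class ComplianceLog:
    records: list[RetainedRecord] = field(default_factory=list)

    def add(self, record: RetainedRecord) -> None:
        self.records.append(record)

    def for_use_case(self, use_case: str) -> list[RetainedRecord]:
        # Pull everything retained for a given ADMT use case, e.g., for an audit.
        return [r for r in self.records if r.admt_use_case == use_case]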
What to Do Now
Even though the Rule will likely not take effect until late 2025 or early 2026, the necessary cross-functional planning, technology reviews and contract updates mean it’s not too early to get started on the following:
Inventory ADMT use cases deployed by or for the business, whether built in-house or provided by a vendor (a simple inventory sketch follows this list);
Plan to operationalize consumer rights;
Update vendor agreements;
Develop risk assessment frameworks; and
Establish recordkeeping practices.
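To make the first item concrete, the sketch below shows one possible shape for an ADMT inventory entry, together with a rough screening helper for the risk assessment obligation. The fields and the likely_needs_risk_assessment helper are hypothetical illustrations, not a checklist drawn from the Rule.

from dataclasses import dataclass

# Hypothetical inventory entry for an ADMT use case. The fields reflect the
# questions the Rule raises (significant decisions, training on personal
# information, vendor involvement, human review), but the schema is ours.

@dataclass
class AdmtUseCase:
    name: str                           # e.g., "resume screening model"
    owner_team: str
    vendor_provided: bool               # built in-house or supplied by a service provider
    makes_significant_decision: bool    # health care, employment, lending, housing or education
    trains_on_personal_information: bool
    human_review_in_loop: bool
    pre_use_notice_published: bool
    opt_out_available: bool

def likely_needs_risk_assessment(use_case: AdmtUseCase) -> bool:
    # Rough triage: deployed ADMT making significant decisions, and training
    # ADMT on personal information, are both described above as triggering
    # risk assessments.
    return use_case.makes_significant_decision or use_case.trains_on_personal_information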