California: Age-Appropriate Design Code Act
On 15 September 2022, Governor Gavin Newsom signed Assembly Bill 2273, establishing the California Age-Appropriate Design Code Act ('the Act')1. This landmark legislation dramatically expands the privacy protections afforded to California residents under the age of 18 and brings California even closer to its UK and European counterparts on privacy legislation. Lily Li, Founder of Metaverse Law, provides a background and overview of the Act, its provisions, and how it compares with similar legislation in the UK.
The backdrop
The Children's Online Privacy Protection Act of 1998 ('COPPA') is an existing US federal law that protects children's privacy online. Congress passed COPPA to protect children's use of the internet by requiring clear and conspicuous privacy notices and affirmative parental consent prior to the collection of personal information from children. COPPA's scope is limited, however, to children under the age of 13 and to websites and services that: (i) are directed to children; or (ii) have actual knowledge that they are collecting personal information from children online.
Since 1998, the use of the internet by teenagers between the ages of 13 and 18 has exploded, with some teens now earning their livelihood through social media channels. Consequently, the Federal Trade Commission ('FTC'), Congress, and individual states have ramped up efforts to address privacy issues for this age group. The FTC has sought public comment on children's advertising2 and online surveillance3, while Senators Richard Blumenthal and Edward J. Markey have proposed the Kids Online Safety Act4 and the Children and Teens Online Privacy Protection Act5 respectively to create a broader national framework on privacy.
While federal efforts stay mired in D.C. politics, California moved forward with its own privacy legislation – the Act.
The details
The Act imposes obligations on businesses that: (i) meet the California Consumer Privacy Act of 2018 ('CCPA') thresholds; and (ii) provide an 'online service, product, or feature' that is 'likely to be accessed by children'. Children are defined as California residents 'who are under 18 years of age'.
The Act provides factors for whether an online service, product, or feature is 'likely to be accessed' by California residents under the age of 18, including:
- it is directed to children as defined by COPPA;
- it is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children, or it is substantially similar to an online service, product, or feature that meets this factor;
- it displays advertisements marketed to children;
- it has design elements known to be of interest to children, including games, cartoons, music, and celebrities who appeal to children; and
- based on internal research, a significant amount of the audience is children.
If a business finds that its online service, product, or feature is covered by the Act, the business must take several steps to ensure compliance. We identified the following as among the more onerous steps that must be taken.
Data Protection Impact Assessments and risk mitigation plans
Businesses must complete a Data Protection Impact Assessment ('DPIA') prior to launching a covered service, product, or feature, and must maintain DPIA documentation for as long as such services are likely to be accessed by children. Businesses must biennially review all DPIAs. Businesses must further document any risk of material detriment to children that arises from data management practices identified in the DPIA and create a timed plan to mitigate or eliminate the risk before the specific product or service is accessed by children.
Estimate age of child users or treat all consumers as children
Businesses must estimate the age of child users with a reasonable level of certainty or apply the privacy and data protections afforded to children to all consumers. These methods may include the user self-declaring their age, artificial intelligence ('AI') algorithms establishing a user's age, third-party verification services, confirmation from a known adult account holder, hard identifiers (e.g., passports or similar documents), or some form of technical measures6.
High privacy as default settings for children
Most importantly, the Age-Appropriate Design Code requires businesses to implement, by default, privacy settings that offer a 'high level of privacy', unless the business can demonstrate a compelling reason that a different setting is in the best interests of children. It remains to be seen what a 'high level of privacy' means, but this catch-all requirement will likely be the toughest to implement.
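By way of illustration only, the short Python sketch below shows one way a covered business might combine these two obligations: if a user's age cannot be estimated with reasonable certainty, treat the user as a child and apply 'high level of privacy' defaults. The Act prescribes no technical approach, and every name, setting, and threshold here is a hypothetical assumption rather than anything drawn from the statute or ICO guidance.

```python
# Hypothetical sketch only - the Act does not specify settings or methods.
from dataclasses import dataclass
from typing import Optional


@dataclass
class PrivacySettings:
    profile_public: bool
    precise_geolocation: bool
    personalized_ads: bool
    third_party_sharing: bool


# Illustrative 'high level of privacy' defaults applied to children and,
# under the fallback approach, to any user whose age cannot be estimated.
HIGH_PRIVACY_DEFAULTS = PrivacySettings(
    profile_public=False,
    precise_geolocation=False,
    personalized_ads=False,
    third_party_sharing=False,
)

# Illustrative defaults for users estimated to be adults.
ADULT_DEFAULTS = PrivacySettings(
    profile_public=True,
    precise_geolocation=False,
    personalized_ads=True,
    third_party_sharing=False,
)


def default_settings_for(estimated_age: Optional[int]) -> PrivacySettings:
    """Return default settings for a new account.

    If age estimation (self-declaration, AI-based estimation, third-party
    verification, etc.) does not yield an age of 18 or over with reasonable
    certainty, fall back to treating the user as a child.
    """
    if estimated_age is None or estimated_age < 18:
        return HIGH_PRIVACY_DEFAULTS
    return ADULT_DEFAULTS


# Example: a user whose age could not be estimated receives high-privacy defaults.
print(default_settings_for(None))
```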
Comparison to the UK Age appropriate design code
While the California Age-Appropriate Design Code is new, it is modelled after the UK's Age appropriate design code7. Consequently, covered businesses may look to guidance from the UK Information Commissioner's Office ('ICO') on how to interpret the scope and implementation of the Act.
The scope of the Act is purposefully broad. The ICO states that the term 'likely to be accessed by' covers 'services that children [are] using in reality', not just those services specifically targeting children8. However, the ICO recognises that the term is not so broad as to 'cover all services that children could possibly access'9. The key question is whether it is 'more probable than not' that an online service, product, or feature will be accessed by children, and businesses should take a 'common sense approach to this question'10.
To illustrate this point:
- If an online service, product, or feature is the kind 'you would not want children to use in any case', then the business should focus on preventing children from accessing the online service, product, or feature, rather than making it child friendly11.
- If a business's common-sense analysis reveals that children make up a 'substantive and identifiable user group' routinely accessing the business's online services, then the 'likely to be accessed' definition will apply12.
- If that analysis does not reveal such a group yet causes the business to 'think that children will want to use it', then the business 'should conform to the [law's] standards'13.
- If a business decides that its services are not likely to be accessed by children, the business should 'document and support' the reasons for such a determination, and incorporate such evidence as 'market research, current evidence on user behaviour, the user base of similar or existing service', and more14.
Please note that while the California law is expressly modelled after the UK law, there are some differences. For example, the California law notably exempts the 'delivery or use of a physical product'. This exemption diverges from the UK version of the law, which covers 'connected toys and devices'15.
In addition, the Act establishes the California Children's Data Protection Working Group, which will recommend best practices for implementing the Act. Based on the Working Group's recommendations, we could easily see differing guidance from California regulators and the ICO on the practical implementation of the law.
Key takeaways
The California Age-Appropriate Design Code goes a long way to incorporate DPIAs and Privacy by Design and Default into California's technology ecosystem. Any businesses that have a substantial consumer base of under 18s or plan to expand into this market should be familiar with California's new law and its UK predecessor.
Lily Li Founder
[email protected]
Metaverse Law, Orange County/Los Angeles
1. See: https://www.gov.ca.gov/2022/09/15/governor-newsom-signs-first-in-nation-bill-protecting-childrens-online-data-and-privacy/
2. See: https://www.ftc.gov/news-events/news/press-releases/2022/08/ftc-seeks-additional-public-comment-advertising-kids-digital-media
3. See: https://www.ftc.gov/news-events/news/press-releases/2022/08/ftc-explores-rules-cracking-down-commercial-surveillance-lax-data-security-practices
4. See: https://www.congress.gov/bill/117th-congress/senate-bill/3663/text
5. See: https://www.congress.gov/bill/117th-congress/senate-bill/1628/text
6. The Act provides no guidance on estimating a child's age - this is based on guidance from the UK on a similar law, see: https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/age-appropriate-design-a-code-of-practice-for-online-services/3-age-appropriate-application
7. See: https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/age-appropriate-design-code/
8. See, at page 17: https://ico.org.uk/media/for-organisations/guide-to-data-protection/key-data-protection-themes/age-appropriate-design-a-code-of-practice-for-online-services-2-1.pdf
9. Id.
10. Id, at 17-18.
11. Id, at 18.
12. Id.
13. Id.
14. Id.
15. According to ICO, connected toys and devices are 'children's toys and other devices which are connected to the internet. They are physical products which are supported by functionality provided through an internet connection', see: https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/age-appropriate-design-a-code-of-practice-for-online-services/14-connected-toys-and-devices