
USA: Children's privacy updates

Children's online privacy has become a top priority in the United States at both the federal and state levels. This focus has consistently been echoed in President Biden's State of the Union speeches in 2022, 2023, and again in 2024, where he unequivocally called on lawmakers to "pass bipartisan privacy legislation to protect our children online." As a result, efforts to protect children online have significantly increased in the past year, and new measures are expected to continue to be introduced in 2024. Key areas of policy, regulatory, and enforcement activity continue to focus on guardrails around behavioral tracking and targeted advertising toward minors, increased consent requirements for access to minors' personal information, and minors' access to and use of social media. Alaap B. Shah and Lisa Pierce Reisz, from Epstein Becker & Green, P.C., discuss the developments across the US to further protect children online.


Federal policy developments

The Biden-Harris Administration, building on the U.S. Surgeon General's 2023 advisory on Social Media and Youth Mental Health, has spearheaded several actions 'to safeguard children's privacy, health, and safety from online harms.'1 In addition, on December 20, 2023, the FTC proposed updates to the rule implementing the Children's Online Privacy Protection Act (COPPA), a statute widely viewed as outdated and ineffective more than 25 years after its enactment. The public comment period on the proposed updates ended on March 11, 2024, but the updates have not yet been finalized. Given the slow federal progress, much of the recent legislative activity in the area of children's privacy is occurring at the state level, as discussed in more detail below.

Federal regulatory and enforcement developments


COPPA predates much of today's digitally driven world, including social media, video-sharing websites, smartphones, online behavioral tracking, and cross-contextual targeted advertising, all of which are increasingly part of children's daily lives. COPPA was originally enacted in 1998 with the intent of giving parents more control over the information collected about their children online. Currently, the law requires operators of games, websites, apps, and other online services directed to users under the age of 13 to obtain permission from a child's parent before collecting information about them. Given the rapid advances in technology and the pervasiveness of the internet in children's lives since the passage of COPPA, many privacy advocates have called for updates to better protect the privacy of children.

To that end, the Federal Trade Commission (FTC) announced a Notice of Proposed Rulemaking (FTC NPRM) on December 20, 2023, and summarized the key aspects as follows: 'Some of the suggested updates address changes in technology, such as limiting push notifications to kids. Others aim to clarify and streamline the COPPA Rule and strengthen data security. And still, others are designed to strengthen how children's personal information is protected online with an aim to ensure that parents - not companies - are in charge. For example, the proposal would require targeted advertising to kids to be off by default.'

Particularly, the FTC proposed, inter alia, the following key modifications to COPPA:

  • prohibitions on using contact information to 'nudge' a child to remain online;
  • prohibitions on conditioning a child's participation in an online activity on the collection of more personal information than would be necessary for such participation;
  • expansion of security obligations to protect children's data; and
  • limitations on the retention of data.

The comment period for these updates ended on March 11, 2024, and the FTC is anticipated to issue final rulemaking later in 2024. It should be noted that the FTC's regulatory authority remains limited under the existing statute, and as such, some lawmakers are seeking to amend COPPA itself.

In April 2024, parallel bipartisan efforts in the U.S. House of Representatives and the U.S. Senate revived efforts to amend COPPA. Specifically, two companion bills, H.R.7890 and S.1418 (informally referred to as COPPA 2.0), aim to strengthen children's privacy in several ways. First, COPPA 2.0 raises the protected age to 16 and prohibits internet companies from collecting minors' personal information without consent. Second, COPPA 2.0 aims to ban targeted advertising to children and teens. Third, the 'actual knowledge' standard would be revised to hold social media platforms more accountable for protecting children's privacy. Fourth, an 'eraser button' would be required to allow children and teens to delete their data. Fifth, data minimization rules would be established. These companion bills still have a significant journey ahead before becoming law. However, the current administration's policy direction and the bills' bipartisan nature suggest they will likely advance.

It is important to note that, regardless of whether COPPA changes, the FTC has made it clear that consumer privacy and data protection remain among its enforcement priorities. In the context of children, the FTC has used its authority under COPPA to hold companies accountable for the improper collection and use of children's data. For example, the FTC successfully obtained an 'algorithmic disgorgement' remedy against an artificial intelligence (AI) company that allegedly obtained and used children's personal information in violation of COPPA. In this case, the court-approved settlement required the company to:

  • destroy any children's personally identifiable information (PII) collected without verifiable parental consent;
  • destroy any AI developed using such PII; and
  • pay a $1.5 million civil monetary penalty.

Kids Online Safety Act

The Kids Online Safety Act (KOSA) is bipartisan legislation originally introduced in the Senate by Richard Blumenthal and Marsha Blackburn in February 2022. Following last year's Senate hearing with Big Tech CEOs, and with some significant revisions since its original introduction, KOSA is now cosponsored by 62 senators. While it appears that KOSA is ready for a vote, new bills have recently been introduced as tech companies, which still oppose many of KOSA's provisions, continue to lobby Congress for different legislative solutions. For instance, NetChoice (a tech industry group representing several social media companies), the Chamber of Progress, the Software & Information Industry Association, and Internet Works sent a joint letter to Congress on February 27, 2024, urging lawmakers to pass the Invest in Child Safety Act (ICSA) instead of KOSA. The ICSA is primarily aimed at directing more funding to law enforcement to investigate child predators, as opposed to focusing on technology companies themselves.

The Federal Government appears ready to pass legislation on this issue, but it is too early to tell which of the two bills might survive.

American Privacy Rights Act of 2024

In the wake of the failed attempt at passing a comprehensive federal consumer privacy law via the American Data Privacy and Protection Act, Congress recently revived bipartisan efforts to legislate by issuing the American Privacy Rights Act of 2024 (APRA) on April 7, 2024. The APRA tackles a wide array of privacy issues, including children's privacy. First, it explicitly states that 'Nothing in this Act may be construed to relieve or change any obligation that a covered entity or person may have under [COPPA].' Second, for information subject to the APRA, a 'covered minor' is defined to mean individuals under the age of 17. Third, covered minors' data is considered 'sensitive covered data,' which carries various restrictions and compliance obligations related to its collection, use, and disclosure.

Despite its bipartisan roots, obtaining sufficient support for the APRA may prove challenging given the difficulty in harmonizing privacy requirements across various state interests and the fact that this legislation has been introduced in an election year when congressional activity tends to slow and stall on various issues. As such, the success of the APRA remains quite uncertain in these early days.

State law developments

As updates to COPPA have languished over the last 10 years and other federal legislative efforts such as KOSA have stalled, numerous states have stepped in to fill some of the void. Most states, however, are not only seeking to regulate the collection of information but are also focused on the emotional and mental harm that such collection and uses potentially pose to minors.

Several states have also recently passed laws limiting the access of minors under the age of 18 to social media and other online platforms predominantly accessed by minors. However, nearly every one of these laws has since been enjoined by a federal court on the basis that it likely violates the First Amendment right to freedom of speech.


Arkansas

Arkansas has adopted a law limiting minors' access to social media. The Arkansas law, which was set to take effect on September 1, 2023, was temporarily enjoined by federal Judge Timothy Brooks the day before its effective date when it was challenged for violating First Amendment rights. Under Arkansas's law, minors cannot obtain a social media account without the express consent of their parent or guardian. The Arkansas law also includes a private right of action if children are harmed as a result of a violation of this law.


California Consumer Privacy Act (CCPA) / California Privacy Rights Act (CPRA)

California was one of the first states to specifically address children's privacy issues in its comprehensive privacy law. The CCPA, enacted in 2018 (and amended by the CPRA in 2020), includes special protections for children in California. Under the CPRA, a covered business cannot sell or share the personal information of a child aged 13 to 15 without the affirmative consent of the child, or of a child under the age of 13 without the affirmative consent of the child's parent or guardian.

California Age-Appropriate Design Code Act (AADC)

In addition, California has recently enacted legislation specifically focused on children's privacy. The AADC, which was modeled after the United Kingdom's Age Appropriate Design Code, becomes effective on July 1, 2024. It emphasizes age-appropriate design principles for websites, online services, and applications directed toward children under 18.

In contrast to COPPA, the AADC has a much broader reach in that it applies to all minors under the age of 18. It is also broader than COPPA in its scope. Currently, COPPA only applies to online services that are directed to children or whose providers have 'actual knowledge' that children are using the online service. In contrast, the AADC applies to a business that is 'likely to be accessed' by minors. This includes online services that are routinely accessed by a significant number of children or are similar to online services that are routinely accessed by a significant number of children. Of note, the AADC requires businesses to conduct Data Protection Impact Assessments (DPIAs) to analyze whether the service will harm children. The AADC also requires businesses to estimate the age of child users with a reasonable level of certainty or otherwise apply the privacy and data protections afforded to children to all users of the online products and services. As a practical matter, this may mean that businesses must implement some form of age verification or age gate for all end users.
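The AADC's 'estimate or default' logic described above can be sketched in a few lines of code. This is purely an illustrative assumption of how a business might model the rule; the function name, settings, and confidence threshold are hypothetical and not drawn from the statute or any real compliance tool:

```python
# Hypothetical sketch of the AADC's "estimate age with reasonable
# certainty, or apply child-level protections to all users" fallback.
# All names and thresholds here are illustrative assumptions.

CHILD_PROTECTIONS = {
    "default_privacy_settings": "high",
    "geolocation_collection": "off",
    "targeted_advertising": "off",
}

ADULT_DEFAULTS = {
    "default_privacy_settings": "standard",
    "geolocation_collection": "opt_in",
    "targeted_advertising": "opt_in",
}

def settings_for_user(estimated_age, confidence, min_confidence=0.9):
    """Return privacy settings for a user given an age estimate.

    If age cannot be estimated with sufficient confidence (modeled
    here as a numeric threshold), the fallback is to give every user
    the child-level protections.
    """
    if estimated_age is None or confidence < min_confidence:
        return CHILD_PROTECTIONS  # cannot verify: protect everyone
    if estimated_age < 18:        # the AADC covers minors under 18
        return CHILD_PROTECTIONS
    return ADULT_DEFAULTS
```

The key practical consequence, as noted above, is the final branch: a business that cannot reliably estimate age ends up applying the child-level defaults to its entire user base, which is why many observers expect the law to drive adoption of age gates.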

Finally, the AADC strictly regulates the collection and use of geolocation information and prohibits the use of dark patterns to encourage children to provide additional personal information beyond what is reasonably expected for the online product or service, forgo privacy protections or take any action that the business knows, or has reason to know, is materially detrimental to the child's physical health, mental health, or well-being.

It should be noted that on September 18, 2023, in NetChoice v. Bonta, the U.S. District Court for the Northern District of California granted a preliminary injunction against California's enforcement of the law, finding that the AADC likely violated the First Amendment. California appealed the District Court's decision to the Ninth Circuit Court of Appeals. A decision from the Ninth Circuit is still pending.


Connecticut

On June 26, 2023, the Connecticut Governor signed Connecticut SB 3, amending the Connecticut Data Privacy Act. The amendment requires data controllers that offer an online service, product, or feature to minors (any individual under 18) to avoid a heightened risk of harm to those minors. The law also requires controllers to obtain the minor's consent (or, for minors under the age of 13, a parent's consent) before selling the minor's personal information or engaging the minor in targeted advertising or profiling. Finally, the amendments require controllers to conduct a DPIA to identify whether there is a heightened risk of harm to minors.


Florida

On July 1, 2023, Florida's Digital Bill of Rights became effective. This new law imposes restrictions on for-profit companies doing business in Florida that collect sensitive data regarding Florida residents. In particular, the law prohibits companies from:

  • selling sensitive data without receiving prior consent from the consumer; and
  • processing sensitive data of children under the age of 18 without authorization under COPPA.

Additionally, on March 25, 2024, Florida Governor Ron DeSantis signed an amended social media bill (HB 3) that bans children under the age of 14 from social media platforms and requires 14- and 15-year-old children to get parental consent to set up a social media account. The bill requires social media platforms to terminate the accounts of individuals under 14 and of 14- and 15-year-olds who lack parental consent, and to use a third-party verification system to screen out underage users. It will become effective on January 1, 2025.
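HB 3's age brackets described above reduce to a simple three-way rule. The following sketch is an illustrative assumption only; the function name and return labels are hypothetical, not terms from the bill:

```python
# Hypothetical sketch of the age brackets in Florida's HB 3 as
# described above. Function and labels are illustrative assumptions.

def hb3_account_rule(age: int) -> str:
    """Map a user's age to HB 3's treatment of social media accounts."""
    if age < 14:
        return "prohibited"        # platform must deny or terminate the account
    if age < 16:
        return "parental_consent"  # 14- and 15-year-olds need parental consent
    return "permitted"             # 16 and older: no HB 3 restriction
```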


Louisiana

In June 2023, Louisiana passed a law that would require minors to obtain parental consent to open social media accounts. The law would also require social media companies to 'make commercially reasonable efforts' to verify the ages of Louisiana account holders and put up additional safeguards for the State's children on their sites. For instance, social media companies must prohibit adults from messaging Louisiana minors with whom they are not connected. Further, social media companies are prohibited from targeting advertisements to minors based on personal information outside of age and location and from collecting the personal information of minors 'beyond what is adequate, relevant, and reasonably necessary.' The law goes into effect on July 1, 2024.


Maryland

On April 6, 2024, the Maryland legislature passed two privacy bills: the Maryland Online Data Privacy Act of 2024 and the Maryland Kids Code. Under the Maryland Online Data Privacy Act of 2024, the definition of 'child' is tied to COPPA, children's data is considered sensitive data with heightened protections, and parents have the authority to exercise a child's rights on their behalf. Under the Maryland Kids Code, certain online platforms, including gaming and social media companies, would be prohibited from tracking children under the age of 18 or from using manipulative techniques to keep children online. Both of these bills require the Governor's signature before becoming law, but if they are signed the Maryland Kids Code would become effective on October 1, 2024, followed by the Maryland Online Data Privacy Act of 2024, which would become effective on October 1, 2025.


Ohio

On July 5, 2023, Ohio Governor Mike DeWine signed the Social Media Parental Notification Act (SMPN) into law. The SMPN imposes numerous requirements on operators whose target market is children under the age of 16 or whose service is reasonably anticipated to be accessed by children. For instance, the SMPN requires social media companies to obtain parental permission for children under 16 to use social media. It also requires social media platforms to develop procedures to verify whether a user is under the age of 16. Although the SMPN was scheduled to go into effect on January 15, 2024, its implementation was enjoined by Judge Algenon Marbley of the U.S. District Court for the Southern District of Ohio on February 12, 2024, after a First Amendment free speech challenge was brought against the State of Ohio. The Ohio law currently remains enjoined.


Texas

On June 12, 2023, Texas enacted a law that requires commercial entities that 'knowingly and intentionally' publish or distribute material on an internet website, including a social media platform, more than one-third of which is sexual material harmful to minors, to use reasonable age verification methods to verify that an individual attempting to access the material is 18 years of age or older. Such verification must be done through an individual's government-issued ID or a commercially reasonable method relying on public or private transactional data. The law further required such websites to post a warning stating that pornography is addictive, that exposure to pornography could result in body image and mental health issues, and that it could increase the demand for child exploitation.

The law, which was scheduled to take effect on September 1, 2023, was challenged in federal court by the Free Speech Coalition, which claimed that the law was overbroad and failed the strict scrutiny test applied to free speech. A federal district court granted the Free Speech Coalition an injunction that prevented the Texas Attorney General from enforcing the law. However, on November 14, 2023, the Fifth Circuit Court of Appeals vacated the injunction. Further, on March 7, 2024, the Fifth Circuit ruled that the Texas Attorney General could enforce the age verification part of the law but determined that the required health warning was unconstitutional.

Additionally, Texas recently passed a children's privacy law directed at social media companies, which becomes effective on July 1, 2024. Unlike the Louisiana, Utah, and Arkansas laws, this new Texas law does not require verifiable parental consent for a child to have a social media account. Instead, it requires parental consent for specific collection and uses of a minor's information. The law also requires providers to develop a strategy to prevent a minor's exposure to harmful material and to create tools that allow parents to control a minor's social media use, and it limits targeted advertising through the social media platform. There is no private right of action.


Utah

In March 2023, Utah became the first US state to regulate children's access to social media through the Social Media Regulation Act. Under this Act, minors cannot obtain a social media account without the express consent of their parent or guardian. The Act also prohibits minors from accessing social media between the hours of 10:30 pm and 6:30 am, unless a parent allows the child to access it during these times, and limits the collection and use of minor information. The Act requires age verification for anyone who wants to use social media in Utah and seeks to prevent technology companies from luring kids to their apps with addictive features. Of further note, the Act includes a private right of action if children are harmed as a result of violations of these laws.
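The default curfew described above (10:30 pm to 6:30 am) is a time window that wraps past midnight, so a naive range check would get it wrong. A minimal sketch of the check, under purely illustrative assumptions (the function name and parental-override flag are hypothetical, not terms from the Act):

```python
# Hypothetical sketch of Utah's default social media curfew of
# 10:30 pm to 6:30 am. Because the window crosses midnight, the test
# is "after start OR before end" rather than a simple range check.
from datetime import time

CURFEW_START = time(22, 30)  # 10:30 pm
CURFEW_END = time(6, 30)     # 6:30 am

def in_default_curfew(now: time, parental_override: bool = False) -> bool:
    """Return True if access would be blocked under the default curfew."""
    if parental_override:    # a parent may allow access during these hours
        return False
    return now >= CURFEW_START or now < CURFEW_END
```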

The Social Media Regulation Act was supposed to go into effect on March 1, 2024. However, its constitutionality was challenged in December 2023 by NetChoice, a tech industry group representing several social media companies, which requested a preliminary injunction to block the law from taking effect. In response, Utah lawmakers moved the effective date to give themselves a chance to amend the law and avoid freedom of speech issues. In March 2024, the Governor signed the Social Media Regulation Amendments, and the resulting Utah Minor Protection in Social Media Act will become effective in October 2024.

Ongoing state legislative activity in 2024

Within the first four months of 2024, several states have proposed additional laws to protect children's privacy and well-being online. These include, inter alia, the following:

  • Colorado introduced SB 24-041 to amend the Colorado Privacy Act;
  • Georgia introduced SB 351;
  • Hawaii introduced SB2309 and SB2012 (similar to California's AADC);
  • Illinois introduced HB3880;
  • Minnesota introduced HF 2257/SF 2810 (an AADC bill);
  • New Mexico introduced SB 68 (an AADC bill);
  • New York introduced A4967 and S3281 (these bills will require DPIAs of products and services for children and certain data collection and targeted advertising activities);
  • Pennsylvania introduced Bill 1879;
  • South Carolina introduced H 4541 and H 4842 (a version of the AADC);
  • Vermont introduced H.712;
  • Virginia introduced HB 707 and HB 821, and SB 361 and SB 684; and
  • West Virginia introduced HB 4718.


Conclusion

Overall, there is a clear trend of increased regulatory focus on protecting children's online privacy and safety, with lawmakers at both the federal and state levels seeking to update existing laws and introduce new measures to address the evolving digital landscape. However, the implementation and enforcement of these laws remain an ongoing challenge that will require continued attention and collaboration between policymakers, industry, and privacy advocacy groups.

Alaap B. Shah, Member of the Firm
[email protected]
Lisa Pierce Reisz, Attorney
[email protected]
Epstein Becker & Green, P.C., Washington, DC and Ohio

1. See The White House FACT SHEET: Biden-Harris Administration Announces Actions to Protect Youth Mental Health, Safety & Privacy Online (May 23, 2023). Key policy actions include:
  • creating, in partnership with the Department of Commerce, an interagency Task Force on Kids Online Health & Safety;
  • the U.S. Department of Education updating the Family Educational Rights and Privacy Act (FERPA) rulemaking to better protect minor education records;
  • the U.S. Department of Education, in consultation with the U.S. Department of Health and Human Services (HHS), the Surgeon General, the Centers for Disease Control and Prevention (CDC), and the FTC, issuing resources, model policies, and best practices for school districts on the use of internet-enabled devices;
  • the Department of Commerce promoting efforts to prevent online harassment and abuse of children; and
  • the Department of Homeland Security, with the Department of Justice (DOJ), working with the National Center for Missing and Exploited Children to create combined image repositories used to identify victims, as well as to detect and investigate offenses involving child sexual abuse material.