“The platforms know that millions of children are using their services. They have chosen not to know. Choosing not to know is not the same as not knowing. It is a business decision.”
— Senator Richard Blumenthal, Senate Judiciary Subcommittee hearing on children's online safety, 2021
What COPPA Was Designed to Do
The Children's Online Privacy Protection Act (COPPA), signed by President Clinton in 1998 and taking effect in April 2000, was designed to address a specific problem: the commercial collection of personal data from children under 13 by websites and online services, without parental knowledge or consent. The law gave the Federal Trade Commission authority to enforce its provisions and required websites to post clear privacy policies, obtain verifiable parental consent before collecting personal information from children under 13, and give parents the ability to review and delete their children's data.
The problem COPPA was designed to address was real. In the mid-1990s, child-directed websites routinely collected names, addresses, and personal information from children through games, contests, and fan clubs, using that data for commercial purposes without any parental involvement. The law was a response to documented commercial exploitation of children's data in a nascent online environment.
The environment that produced COPPA — the mid-1990s internet, before social media, before smartphones, before algorithmic content feeds — bore almost no resemblance to the environment in which COPPA has subsequently been applied. The law was written for a world of static websites collecting names and email addresses. It has been applied, without substantive revision, to a world of algorithmically curated social platforms collecting behavioral data, psychological profiles, location history, contact networks, and biometric information in real time, designed specifically to maximize engagement through mechanisms documented in YR-001 to have their strongest effect on developing brains.
The 13-Year Threshold — An Arbitrary Line
The age-13 threshold in COPPA was not derived from developmental neuroscience. It was a legislative compromise. The threshold emerged from congressional deliberations in which child advocacy groups pushed for higher ages and industry representatives pushed for lower ages or no threshold at all. Thirteen was the number that passed. No developmental psychologist testified that 13 represented a neurologically meaningful threshold for digital consent capacity. No pediatric research established that the cognitive systems required for meaningful evaluation of privacy consequences were sufficiently mature at 13.
As documented in YR-001, the prefrontal cortex does not complete development until the mid-twenties. The dopaminergic sensitivity that makes engagement design most effective is heightened throughout adolescence. The social comparison sensitivity that social platforms most directly exploit peaks in the 11–15 age range, a window that straddles the COPPA threshold. There is no evidence that a 13-year-old is meaningfully better equipped than a 12-year-old to consent to data collection practices or to evaluate the long-term consequences of platform use. The threshold is administratively useful. It is not developmentally meaningful.
The consequence of the arbitrary threshold has been structural. Platforms designed for users 13 and over — by which they mean users who claim to be 13 or over — are legally exempt from COPPA's requirements. The vast majority of the social media ecosystem has been designed with this exemption in mind. The design choices that make platforms most effective at capturing adolescent attention — the variable reward schedule, the social comparison engine, the public engagement metrics — have been deployed without COPPA constraint on 13-year-olds, because 13-year-olds are legally adults for COPPA purposes.
The Age Verification Gap — What Platforms Actually Do
COPPA requires verifiable parental consent for data collection from children under 13. The law does not specify what "verifiable" means, delegating that determination to the FTC. The FTC has approved several verification methods: signed parental consent forms, credit card transactions, toll-free telephone numbers with trained operators, and other methods meeting a "reasonable efforts" standard. All of these methods are cumbersome, create friction, and produce lower user acquisition for the platforms that implement them.
The industry solution to this problem has been almost universal: platforms require users to enter a date of birth during account creation. If the entered date of birth places the user under 13, the account creation is blocked. If the entered date of birth places the user at 13 or older, the account is created without any verification. The platform has technically "collected" an age declaration. It has not verified it.
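The mechanics the preceding paragraph describes reduce to a few lines of logic. The sketch below is illustrative only, not any platform's actual code; it shows that the entire "verification" step is arithmetic on a value the user typed:

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # the statutory under-13 line

def years_between(dob: date, today: date) -> int:
    """Whole years elapsed between dob and today."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def age_gate(self_reported_dob: date, signup_day: date) -> bool:
    """The entire 'verification' step: arithmetic on an unverified claim.
    Returns True (account created) whenever the typed date places the
    user at 13 or older. Nothing checks that the date is real."""
    return years_between(self_reported_dob, signup_day) >= COPPA_AGE_THRESHOLD

signup = date(2026, 1, 15)
print(age_gate(date(2016, 5, 1), signup))  # truthful 9-year-old: blocked
print(age_gate(date(2012, 5, 1), signup))  # same child, birth year shifted: allowed
```

The failure mode is structural, not technical: the only input to the check is the claim being checked.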
Common Sense Media surveys consistently find that approximately 40% of children under 13 use social media platforms whose terms of service require users to be at least 13. A 2021 survey found that 38% of children aged 8–12 used YouTube regularly, 28% used TikTok, and 23% used Instagram, all platforms that require users to certify they are at least 13 years old. The method by which 8-to-12-year-olds access these platforms is not sophisticated: they enter a false date of birth during account creation, they use a parent's account, or they are added by an older sibling or friend. The age gate is a text field. A child who can type a number can bypass it.
The platform that asks a child to enter their date of birth and accepts whatever they enter is not complying with the spirit of COPPA. It is complying with the letter of a regulatory interpretation while systematically violating the intent that interpretation was designed to serve.
The platforms are aware of this. Internal documents from Meta released during the Haugen whistleblower disclosure in 2021 showed that company researchers had studied underage use on Instagram and Facebook, had identified it as a significant user population, and had discussed the commercial value of capturing users early. The decision to maintain the self-reported age verification system was made with knowledge of its inadequacy. The platform that "cannot verify" age has made a business decision about the cost of meaningful verification relative to the value of not implementing it.
The Enforcement Record
The FTC has brought COPPA enforcement actions since the law took effect. The enforcement record includes significant settlements: YouTube/Google ($170 million in 2019), TikTok ($5.7 million in 2019, for its predecessor Musical.ly), and a series of smaller actions against operators of children's app networks (including a Unilever subsidiary settlement and various smaller app operators). The enforcement actions establish that violations occur and that the FTC has authority to act.
What the enforcement record does not establish is that enforcement has deterred the behavior it targets at scale. The YouTube settlement of $170 million represented roughly one-tenth of one percent of Alphabet's 2019 annual revenue (the arithmetic is sketched after the table below). The TikTok settlement for Musical.ly preceded TikTok's deployment of the engagement design features that have since made it the fastest-growing social platform among adolescents. Major platforms have not faced enforcement specifically for maintaining inadequate age verification systems that allow under-13 users to create accounts by entering false birth dates.
| Action | Year | Settlement | Deterrence Outcome |
|---|---|---|---|
| Musical.ly / TikTok | 2019 | $5.7 million | TikTok became the fastest-growing adolescent platform within 3 years |
| YouTube / Google | 2019 | $170 million | About 0.1% of Alphabet's annual revenue; YouTube remains the largest platform for children under 13 |
| Operators of children's apps | Various | $100K–$4M range | Settlements affect primarily smaller operators; major platforms largely unaffected |
| Instagram / Meta (underage accounts) | — | No action taken | Internal documents show Meta aware of underage user population |
| TikTok (UK ICO) | 2023 | £12.7 million | Fine paid for illegal processing of children's data; up to 1.4M UK children under 13 used the platform without required parental consent; age assurance measures required, but underage account creation rates have not been publicly audited post-fine |
| Meta / Instagram (FTC → DOJ referral) | 2024 | Referred to DOJ for civil penalty action; amount pending litigation | FTC alleged Meta violated its 2012 consent decree by failing to restrict third-party data sharing and exposing children's data; first referral of a major platform to DOJ for COPPA-adjacent consent decree violations; case ongoing as of 2026 |
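To make the deterrence arithmetic in the table concrete, a back-of-envelope calculation. The revenue figure is Alphabet's publicly reported 2019 total, approximately $161.9 billion; treat it as approximate, since the ratio, not the decimal, is the point:

```python
# Penalty as a share of annual revenue: the deterrence arithmetic behind
# the table's YouTube entry. Revenue figure is approximate.
youtube_penalty = 170e6          # 2019 COPPA settlement, USD
alphabet_revenue_2019 = 161.9e9  # reported 2019 revenue, USD (approximate)

share = youtube_penalty / alphabet_revenue_2019
print(f"{share:.4%}")            # ~0.1050%, about one-tenth of one percent

# Expressed as time: how long the company takes to earn the fine back.
days = youtube_penalty / (alphabet_revenue_2019 / 365)
print(f"{days:.2f} days")        # ~0.38 days, i.e. roughly nine hours
```

A penalty recouped in under a working day of revenue is a cost of doing business, not a deterrent.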
The structural problem is that COPPA enforcement is complaint-driven and resource-constrained. The FTC has a limited enforcement budget that is dwarfed by the legal resources of major technology platforms. Enforcement actions are expensive, lengthy, and produce settlements calibrated to the FTC's willingness to litigate rather than to the platform's actual culpability or the scale of violations. The enforcement gap is not primarily a function of insufficient legal authority, since COPPA gives the FTC substantial authority; it is a function of institutional capacity, political will, and the structural disadvantage of a regulatory agency litigating against companies whose annual revenue exceeds its entire budget.
The 2024 FTC referral of Meta to the Department of Justice for civil penalty action represents a structural escalation within the enforcement architecture. FTC referrals to DOJ are rare and signal that the Commission has concluded that further administrative action is insufficient. The referral does not change the structural constraint: the DOJ now faces the same institutional capacity disadvantage in litigating against Meta that the FTC faced. But it confirms at the federal enforcement level what the COPPA Failure Record documents at the regulatory level. The compliance theater has been acknowledged as theater by the agencies that nominally enforce the law, and the response to that acknowledgment has been escalation within the same architecture that produced the failure.
The Industry's Self-Regulatory Track Record
COPPA relies on a compliance model in which platforms self-certify adherence to the law's requirements. The FTC provides guidance; platforms design their own compliance systems; enforcement occurs after the fact when violations are identified. This model places the burden of compliance on entities whose financial interests are in some respects directly opposed to compliance: platforms that would lose access to a user population by effectively excluding under-13 users have a financial incentive to design compliance systems that technically satisfy regulatory requirements while minimally restricting that access.
The evidence that this model has not produced the protection COPPA intended is documented in the preceding sections: widespread underage use of COPPA-compliant platforms, inadequate verification systems, internal documentation of industry awareness of underage user populations, and an enforcement record that has not reversed these trends. The industry's self-regulatory track record on children's privacy is consistent with the track record documented in the Consent Record series (CR-001 through CR-005) on digital consent generally: industry self-regulation in contexts where compliance is costly and violation is profitable produces documented compliance with regulatory form and systematic non-compliance with regulatory intent.
The systematic gap between COPPA's stated protections and its operational reality has three components: an industry self-certification system with no effective age-gating mechanism, an enforcement history too minimal to deter, and deliberate platform design choices that route children under 13 around the protections the law nominally provides. The Compliance Theater is not a malfunction of COPPA. It is the predictable outcome of a regulatory model that delegates compliance verification to the regulated industry, rests on a 13-year threshold that has no developmental basis and has gone unrevised in 28 years, and lacks the institutional capacity to close the gap between the law's text and its effect. The compliance infrastructure produces documented compliance without documented protection.
COPPA 2.0 and Its Fate — The Reform Record
Congressional awareness of COPPA's inadequacy has produced periodic reform attempts. The Children's Online Privacy Protection Act 2.0 (COPPA 2.0), first introduced in 2019 and reintroduced multiple times through 2024, would raise the protected age from 13 to 16, prohibit targeted advertising to minors, and require platforms to establish "Data-Free Zones" for users under 16. As of this writing, COPPA 2.0 has not become law. The bill cleared committee repeatedly and faced sustained industry opposition; in July 2024 it passed the Senate as part of a combined package with the Kids Online Safety Act, but the House declined to take the package up before the 118th Congress ended.
State-level reform has produced more action. California's Age-Appropriate Design Code Act (AB 2273), signed in 2022, requires platforms likely to be accessed by minors to conduct data protection impact assessments and to design products that default to the highest privacy settings for minors. The law was preliminarily enjoined on First Amendment grounds in NetChoice v. Bonta; in 2024 the Ninth Circuit upheld the injunction as to the impact-assessment requirements and remanded the remainder. Several other states have enacted or introduced similar legislation. The state-level activity reflects both the inadequacy of the federal response and the political difficulty of federal legislative reform in the face of sustained industry lobbying.
The Kids Online Safety Act (KOSA), another federal reform effort, would impose a duty of care on platforms with respect to minors, requiring them to prevent and mitigate specific categories of harm, including addiction, mental health decline, and exposure to dangerous content. KOSA cleared committee, passed the Senate by a 91–3 vote in July 2024 alongside COPPA 2.0, and then stalled in the House. The pattern of bipartisan support, committee advancement, and Senate passage followed by House failure reflects the sustained effectiveness of industry opposition to federal regulation of children's online safety.
What the Record Demands
The COPPA record demands reforms that address the structural failures the current law embeds.
Age verification with actual verification. The FTC's "reasonable efforts" standard for age verification has been interpreted to permit self-reported age data. A revised standard that requires technical age verification methods — parental consent systems that do not rely solely on user self-reporting, or third-party verification services — would close the gap between COPPA's stated requirement and its current implementation. The technical barriers to more effective age verification are not insurmountable; they are cost impositions that platforms have successfully avoided by regulatory acquiescence to inadequate verification standards.
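The difference between the current standard and an actual-verification standard can be stated as a decision rule. The sketch below is a hypothetical illustration, not a proposed regulation: the enum values correspond to the FTC-approved methods listed earlier (signed consent forms, credit card transactions, operator-staffed phone lines) plus third-party age assurance, and the key property is that self-report alone never satisfies the rule:

```python
from enum import Enum, auto

class Verification(Enum):
    SELF_REPORT = auto()     # typed date of birth: the current de facto standard
    PARENT_CONSENT = auto()  # signed form, credit card, or staffed phone line
    THIRD_PARTY = auto()     # external age-assurance service

# Hypothetical revised standard: evidence that counts beyond self-report.
SUFFICIENT = {Verification.PARENT_CONSENT, Verification.THIRD_PARTY}

def may_create_account(claimed_age: int, evidence: set[Verification]) -> bool:
    """Sketch of a 'reasonable efforts' standard with teeth."""
    if claimed_age < 13:
        # COPPA's existing rule: verifiable parental consent required.
        return Verification.PARENT_CONSENT in evidence
    # Hypothetical revision: a 13-or-over claim must itself be verified,
    # so the gate cannot be cleared by typing a different number.
    return bool(evidence & SUFFICIENT)

print(may_create_account(14, {Verification.SELF_REPORT}))     # False: today's norm fails
print(may_create_account(14, {Verification.THIRD_PARTY}))     # True
print(may_create_account(11, {Verification.PARENT_CONSENT}))  # True: consent obtained
```

Under this rule, the text-field gate of the preceding section never creates an account on its own, which is the point.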
A developmentally calibrated age threshold. The age-13 threshold was a legislative compromise in 1998. The developmental neuroscience reviewed in YR-001 establishes that the neurological systems most directly relevant to consent capacity and vulnerability to engagement design do not mature at 13. A revised threshold that reflects developmental evidence — or a tiered system that provides different protections at different ages based on developmental milestones — would bring the law into alignment with the science it was designed to address.
Design-level requirements, not just data collection rules. COPPA regulates data collection from children. It does not regulate the engagement design mechanisms deployed on adolescents aged 13 and over. The Developmental Asymmetry documented in YR-001 — the heightened dopaminergic sensitivity, the underdeveloped prefrontal inhibitory control, the peak social comparison sensitivity — operates on 13-to-17-year-olds who are fully outside COPPA's scope. A regulatory framework adequate to the evidence would regulate design features, not only data collection, for minors across a broader age range.
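A tiered, design-level framework of the kind the last two paragraphs describe can be expressed as explicit policy data rather than prose. The age cut-offs and feature names below are hypothetical illustrations keyed loosely to the developmental evidence in YR-001, not a specific legislative proposal:

```python
# Hypothetical tiered design-feature policy. Tiers, cut-offs, and feature
# names are illustrative, not drawn from any enacted statute.
DESIGN_TIERS = {
    (0, 12):  {"infinite_scroll": False, "public_like_counts": False,
               "algorithmic_feed": False, "targeted_ads": False},
    (13, 15): {"infinite_scroll": False, "public_like_counts": False,
               "algorithmic_feed": True,  "targeted_ads": False},
    (16, 17): {"infinite_scroll": True,  "public_like_counts": False,
               "algorithmic_feed": True,  "targeted_ads": False},
}

def allowed_features(verified_age: int) -> dict[str, bool]:
    """Look up design features permitted for a *verified* age."""
    for (low, high), features in DESIGN_TIERS.items():
        if low <= verified_age <= high:
            return features
    return {}  # 18+: no design-level restrictions in this sketch

print(allowed_features(14))  # engagement mechanics restricted for ages 13-15
```

The point of the data structure is the regulatory shift it encodes: the unit of regulation becomes the design feature and the developmental tier, not the data field and the single threshold.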
Enforcement capacity commensurate with enforcement mandate. The FTC cannot effectively enforce COPPA against major technology platforms with its current resources. Enforcement actions are expensive, slow, and produce settlements calibrated to what the FTC is willing to litigate rather than to the scale of violations. An enforcement regime that could produce meaningful deterrence would require either substantially greater FTC resources, private right of action for COPPA violations, or both.
Selected Evidence Base
- Children's Online Privacy Protection Act of 1998, 15 U.S.C. §§ 6501–6506.
- Federal Trade Commission (2013). Children's Online Privacy Protection Rule: A Six-Step Compliance Plan for Your Business. — Current COPPA framework and verification standards
- FTC v. YouTube (Google) (2019). Settlement: $170 million civil penalty for COPPA violations. No. FTC-2019-0052.
- FTC v. Musical.ly / TikTok (2019). Settlement: $5.7 million civil penalty. No. FTC-2019-0030.
- Common Sense Media (2021). The Common Sense Census: Media Use by Tweens and Teens. — 38% of 8-12 year olds regularly use YouTube; 28% TikTok; 23% Instagram
- Senate Committee on Commerce, Science, and Transportation (2021). Hearing: Protecting Kids Online: Testimony from a Facebook Whistleblower. — Haugen documents on Meta awareness of underage user population
- Citron, D.K., & Wittes, B. (2017). "The Problem Isn't Just Backpage: Revising Section 230 Immunity." Georgetown Law Technology Review, 2(2), 453–473. — Regulatory capture and platform liability framework
- Moody, J. (2023). "COPPA 2.0: Congressional Attempts to Update Child Online Privacy." Federal Communications Law Journal, 75(2). — Legislative history of reform attempts
- California Age-Appropriate Design Code Act (AB 2273), Chap. 320, Stats. 2022.
- Kids Online Safety Act (KOSA), S. 1409, 118th Congress (2023).
- Auxier, B. et al. (2020). "Parenting Children in the Age of Screens." Pew Research Center. — Parental awareness and concern about children's technology use
- Livingstone, S., & Helsper, E.J. (2007). "Gradations in digital inclusion: children, young people and the digital divide." New Media & Society, 9(4), 671–696.
- Information Commissioner’s Office (UK). (2023, May 4). TikTok fined £12.7 million for failing to protect children’s privacy. ICO Press Release. — Age verification failure and illegal processing of children's data under UK GDPR.
- Federal Trade Commission. (2024, September 2). FTC Refers Meta to DOJ for Civil Penalty Action. FTC Press Release. — Referral for alleged violations of 2012 consent decree including children’s data exposure.
The Institute for Cognitive Sovereignty. (2026). The COPPA Failure Record [ICS-2026-YR-002]. The Institute for Cognitive Sovereignty. https://cognitivesovereignty.institute/youth-record/the-coppa-failure-record