The Legal Architecture · Paper II

The GDPR and What It Actually Changed

Lessons from Eight Years of the World's Most Comprehensive Privacy Regulation

The Institute for Cognitive Sovereignty · 2026 · Research Paper · Open Access · CC BY-SA 4.0

ICS-2026-LA-002 · Published March 6, 2026 · 20 min read

2018 · Year GDPR took effect, after which behavioral advertising grew from ~$250B to over $600B in global annual revenue
95%+ · Cookie consent banner acceptance rates across major websites; not evidence of meaningful consent, but evidence of effective UI design
€1.2B · Meta's 2023 fine, the largest GDPR fine in history; it addressed data transfer practices, not engagement architecture
“GDPR has become the de facto global privacy standard. Unfortunately, what it has standardized, in practice, is the consent banner.”
— Woodrow Hartzog, Privacy's Blueprint, on the gap between GDPR's design and its implementation
Section I

What GDPR Was Designed to Do

The General Data Protection Regulation was adopted by the European Parliament and Council in April 2016 and became applicable on May 25, 2018 — the date that produced more corporate privacy-related email communications than any other single day in internet history. It replaced the 1995 Data Protection Directive, which had itself replaced national frameworks that predated the commercial internet. GDPR represented an ambitious attempt to comprehensively restate the rights of EU residents over their personal data in a regulatory environment that had been transformed by social media, smartphones, and behavioral advertising.

The regulation is built on seven principles: lawfulness, fairness, and transparency; purpose limitation; data minimization; accuracy; storage limitation; integrity and confidentiality; and accountability. For the attention economy's primary data processing activity — behavioral advertising, which requires continuous collection of user behavior data, its aggregation into a behavioral profile, and its use to target advertisements — the operative question under GDPR is lawful basis. There are six lawful bases under GDPR Article 6; for behavioral advertising, the only generally applicable bases are consent (Article 6(1)(a)) and legitimate interests (Article 6(1)(f)). The legitimate interests basis was initially used by many platforms to justify behavioral advertising without consent; subsequent enforcement actions and court interpretations have substantially narrowed this basis for behavioral advertising, making consent the primary required basis for most major platforms.

GDPR's consent requirements are demanding in principle. Article 4(11) defines consent as freely given, specific, informed, and unambiguous; Article 7 sets the conditions for obtaining, demonstrating, and withdrawing it. Consent obtained through a pre-ticked box is not valid. Consent cannot be bundled as a condition of service. The data subject must be able to withdraw consent as easily as they gave it. These requirements, taken seriously, would require a fundamentally different relationship between platforms and users than the behavioral advertising ecosystem had built over the preceding decade. The question is whether they were taken seriously.
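The conjunction of these conditions can be made concrete with a small sketch. The field names below are hypothetical modeling choices, not terms from the regulation; the point is that consent is valid only if every condition holds simultaneously, including the often-overlooked requirement that withdrawal be no harder than granting.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    # Hypothetical fields modeling the GDPR consent conditions
    freely_given: bool     # not bundled as a condition of service
    specific: bool         # tied to a named processing purpose
    informed: bool         # controller identity and purposes disclosed
    unambiguous: bool      # affirmative act, not silence or inactivity
    pre_ticked: bool       # pre-ticked boxes are invalid (CJEU, Planet49)
    grant_steps: int       # UI steps needed to grant consent
    withdrawal_steps: int  # UI steps needed to withdraw it

def is_valid_consent(c: ConsentRecord) -> bool:
    """True only if every Article 4(11) / Article 7 condition holds."""
    return (
        c.freely_given
        and c.specific
        and c.informed
        and c.unambiguous
        and not c.pre_ticked
        # withdrawal must be as easy as granting
        and c.withdrawal_steps <= c.grant_steps
    )
```

A single failed condition (a pre-ticked box, a withdrawal flow buried three menus deep) invalidates the consent; there is no partial credit in the regulation's structure.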



Section III

The Enforcement Record — Eight Years of Action

GDPR enforcement through 2025 has produced total fines exceeding €4 billion across EU member state supervisory authorities. The concentration of that enforcement is notable: a small number of very large fines account for a substantial share of the total, and the cases are concentrated in data transfer and security breach violations rather than the consent and design violations that bear most directly on the Consent Interface problem.

Case | Year | Fine | Violation type | Impact on design
Meta (WhatsApp data transfers) | 2023 | €1.2B | US data transfers without adequate safeguards | None (transfer mechanisms)
Amazon (behavioral advertising) | 2021 | €746M | Cookie consent without valid basis | Consent banner updates
Instagram (children's data) | 2022 | €405M | Minors' personal data visibility defaults | Default settings change
Meta (Facebook data practices) | 2022 | €265M | Data scraping vulnerability | None (security)
Google Ireland (analytics cookies) | 2022 | €150M | Cookie consent design (French DPA) | Reject button added

The pattern in this table is instructive. The largest fine in GDPR history — €1.2 billion — addressed data transfer mechanisms between Meta's EU and US operations. It did not address how Meta's platforms are designed, how their recommendation algorithms operate, or what effects those design choices have on user cognition and mental health. The Instagram fine (€405M) did address children's data and defaults — a meaningful precedent, though it required children's advocates to bring the complaint and years of investigation to produce. The French DPA's €150M fine against Google did produce a design change — the addition of a same-prominence "Reject" button to Google's consent banner. This change reduced acceptance rates. It did not change what Google does with the behavioral data of users who accept.

The enforcement geography is as significant as the enforcement subject matter. Most major US platforms maintain their EU headquarters in Ireland, making the Irish Data Protection Commission (DPC) the lead supervisory authority under GDPR's one-stop-shop mechanism. The Irish DPC has been the subject of sustained criticism from other EU supervisory authorities for slow and inadequate enforcement. The Schrems II litigation, which invalidated the EU-US Privacy Shield data transfer mechanism in 2020, originated in Max Schrems's complaint to the Irish DPC — a complaint the DPC had declined to investigate for years before being compelled to do so. The EDPB (European Data Protection Board) has had to exercise its Article 65 dispute resolution powers repeatedly to compel the Irish DPC to take binding decisions that it had been reluctant to take unilaterally.

Case Record — LA-002
Data Protection Commissioner v. Facebook Ireland Ltd (Schrems II — CJEU 2020)
Outcome: EU-US Privacy Shield invalidated. Binding CJEU judgment. Data transfers to the US under Privacy Shield declared unlawful.

Max Schrems filed his original complaint with the Irish DPC in 2013. The complaint was straightforward: Facebook Ireland transferred EU residents' personal data to Facebook Inc. in the United States, where it was accessible to US intelligence agencies under FISA Section 702 — a form of surveillance access that EU data protection law does not permit. The Irish DPC declined to investigate for years, citing the adequacy decision (Safe Harbor, later Privacy Shield) as covering the transfers. It took seven years of litigation, including a preliminary reference to the CJEU, before the Court of Justice invalidated Privacy Shield entirely in July 2020. The Schrems II case is the jurisdictional arbitrage mechanism documented in the Consent Interface analysis made visible in its starkest form: the enforcement body with jurisdiction to act (the Irish DPC) declined to act for years, while the legal mechanism nominally providing protection (Privacy Shield) was being used by every major US platform to transfer data to a jurisdiction with surveillance laws fundamentally incompatible with GDPR's foundations. It required seven years of private litigation to produce the enforcement the DPC should have initiated in 2013. The legal mechanism now covering cross-border transfers (Standard Contractual Clauses with supplementary measures) remains subject to the same underlying tension that invalidated both Safe Harbor (2015) and Privacy Shield (2020): US national security law and EU privacy law are structurally incompatible, and legal mechanisms that paper over this incompatibility are vulnerable to invalidation.

Case Record — LA-002
IAB Europe TCF (Belgian DPA 2022) & Meta Platforms v. Bundeskartellamt (CJEU 2023)
Outcomes: IAB Europe’s TCF found GDPR non-compliant (Belgian DPA, 2022). Meta’s “legitimate interests” basis for behavioral advertising ruled invalid (CJEU, 2023).

The IAB Europe Transparency and Consent Framework (TCF) is the consent management infrastructure that underlies most GDPR-compliant advertising on the European internet. In February 2022, the Belgian DPA ruled that the TCF itself violates GDPR: the consent strings it generates do not satisfy GDPR’s consent requirements, and IAB Europe bears co-controller responsibility for the resulting non-compliant processing. The ruling struck at the legitimacy of the industry’s primary compliance mechanism at the same time that mechanism was being used across hundreds of thousands of websites as evidence of consent to behavioral advertising. The CJEU’s 2023 Meta ruling went further: it established that Meta cannot claim “legitimate interests” as the legal basis for processing personal data for behavioral advertising, and that consent to data processing cannot be bundled as a condition of accessing the platform. If Meta requires users to consent to behavioral advertising as a condition of using Facebook, that consent is not “freely given” under Article 7 GDPR. Together these rulings close the two primary legal escape routes from GDPR’s consent requirements for behavioral advertising — the industry consent framework and the legitimate interests basis — while the enforcement record demonstrates that even valid legal frameworks require years of litigation to produce compliance from the largest platforms.


Section IV

What Behavioral Advertising Did Next — Adaptation Without Change

The behavioral advertising industry's response to GDPR was not to abandon behavioral advertising. It was to restructure behavioral advertising to minimize disruption to its economic output while complying with GDPR's formal requirements. The restructuring took several forms.

First, the industry accelerated investment in consent management platforms (CMPs) — third-party vendors whose product is the design and technical implementation of consent banner interfaces. CMPs compete on "consent rates" — the percentage of users who click "Accept All" — as a performance metric. The industry publication Digiday reported on CMP providers competing to advertise the highest consent rates. A vendor whose product achieves a higher "consent rate" is, in the Consent Interface framework documented here, a vendor that has more effectively exploited the interface to suppress meaningful refusal. GDPR had created a market for consent theater optimization.
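The metric the CMP market competes on is simple to state. The sketch below is an illustrative computation (the event names and variants are hypothetical, not any vendor's schema): the "consent rate" is just the fraction of banner impressions ending in "accept all," which is why banner design variants that suppress the reject path score "better."

```python
from collections import Counter

def consent_rate(events: list[str]) -> float:
    """Fraction of banner impressions that ended in 'accept_all'.

    `events` holds one outcome string per impression, e.g.
    'accept_all', 'reject_all', 'customize', 'dismiss'.
    """
    counts = Counter(events)
    total = sum(counts.values())
    return counts["accept_all"] / total if total else 0.0

# Hypothetical A/B comparison between two banner designs
variant_a = ["accept_all"] * 95 + ["reject_all"] * 5    # reject hidden behind "settings"
variant_b = ["accept_all"] * 60 + ["reject_all"] * 40   # same-prominence reject button
```

Under this metric, variant A is the "better" product for a CMP vendor even though the difference between the two variants is interface friction, not user preference.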

Second, the advertising ecosystem developed "legitimate interests" and "contextual advertising" frameworks that attempted to reduce reliance on consent-based behavioral advertising for some advertising use cases. Contextual advertising — targeting advertisements based on the content of the page a user is viewing rather than their behavioral profile — is GDPR-compliant without consent and was a pre-existing advertising methodology that GDPR partially revived. However, contextual advertising is less profitable than behavioral targeting for most advertising categories, and the industry's adoption of contextual advertising was partial and selective rather than replacing the behavioral advertising infrastructure.

Third, the industry invested in "privacy-preserving" technical alternatives to third-party cookies — the primary data collection mechanism that GDPR and browser-level cookie restrictions were disrupting. Google's Privacy Sandbox initiative, which aimed to replace third-party cookie tracking with browser-level cohort-based advertising, generated years of regulatory scrutiny and was eventually modified and partially abandoned in 2024, when Google announced it would not deprecate third-party cookies in Chrome after all. The "privacy-preserving" alternative infrastructure turned out to be difficult to implement in ways that satisfied both advertisers' data needs and regulators' privacy requirements — and the attempt demonstrated that the industry's preference was for solutions that preserved behavioral advertising revenue, not for solutions that produced the privacy outcome GDPR was designed to mandate.


Section V

The Rights in Practice — Access, Erasure, Portability

GDPR grants data subjects three substantive rights that, in theory, give individuals meaningful control over their personal data: the right of access (to know what data is held about them), the right of erasure (to have data deleted), and the right of portability (to receive data in a machine-readable format for transfer to another controller). The exercise of these rights in practice is instructive about the gap between GDPR's intent and its implementation.

The right of access, when exercised against major platforms, typically produces a data download that is technically comprehensive and practically unintelligible. Meta's data download for a typical user can contain thousands of files across hundreds of categories, including behavioral inference categories that are described in technical language without explanatory context. A user who downloads their data and asks "what does Facebook actually know about me?" receives a file archive; the question of what those files mean for how the user is targeted by advertising requires technical expertise the regulation does not require platforms to provide.
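The gap between "technically comprehensive" and "practically intelligible" is visible even in the most basic processing a user could attempt on such an export. The sketch below summarizes an archive by counting files per top-level folder; the folder names and layout are invented for illustration, since real export layouts vary by platform and change over time.

```python
import io
import zipfile
from collections import Counter

def summarize_export(source) -> Counter:
    """Count files per top-level folder in a data-download archive.

    `source` is a path or file-like object. Export layouts differ by
    platform, so treat this as a sketch, not a parser for any real export.
    """
    with zipfile.ZipFile(source) as zf:
        names = [n for n in zf.namelist() if not n.endswith("/")]
    return Counter(name.split("/", 1)[0] for name in names)

# Build a tiny stand-in archive in memory (hypothetical layout)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("ads_information/advertisers.json", "{}")
    zf.writestr("ads_information/inferred_topics.json", "{}")
    zf.writestr("posts/archive_2020.json", "{}")

summary = summarize_export(buf)  # counts of files per top-level folder
```

Even this inventory step — before any attempt to interpret a single behavioral inference — requires tooling the regulation does not oblige platforms to supply.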

The right of erasure is subject to exceptions that swallow a significant portion of its practical scope. Data can be retained when processing is necessary for compliance with a legal obligation, for the establishment, exercise, or defense of legal claims, or for legitimate interests that override the individual's erasure request. Platforms' data retention policies under these exceptions are, in many cases, more extensive than the user who requests erasure would expect. The right to be forgotten, as it has been implemented, often means the right to have some data deleted from some systems while other data is retained for other purposes.

What GDPR Actually Achieved
The Genuine Accomplishments of Eight Years of Enforcement

This paper has documented the Consent Interface problem and the limitations of GDPR's enforcement record with specificity. It is necessary to also document what GDPR actually changed, because the failures of GDPR do not mean GDPR achieved nothing.

GDPR created a global privacy standard. Companies operating globally found it more efficient to implement GDPR-compliant practices worldwide than to maintain separate data practices for EU and non-EU markets. This global spillover has produced meaningful changes in data practices in jurisdictions where no equivalent regulation exists.

GDPR produced a genuine cultural shift in organizational data governance. Data protection impact assessments, data protection officers, and privacy-by-design requirements have been institutionalized in organizations that would not have considered them before GDPR. The practices are not perfect; the institutionalization is real.

GDPR created an enforcement infrastructure that, while slow and uneven, produced the largest fines in privacy enforcement history and forced companies to take privacy law seriously as a compliance matter in a way they had not previously. The alternative — no regulation — would have been worse.

The argument of this paper is not that GDPR failed. It is that GDPR's specific mechanism — consent-based data governance — is not calibrated to address the specific mechanisms of attention capture documented in the prior series. A framework that addresses attention capture as a design harm, rather than a data transaction harm, would be built differently. GDPR is necessary but not sufficient.


Section VI

What GDPR Did Change — The Genuine Regulatory Achievements

Alongside the Consent Interface problem and enforcement limitations, GDPR produced a set of genuine, significant changes in the digital data ecosystem that are worth documenting clearly.

The global spillover effect. Research by Goldberg et al. (2023) and Aridor et al. (2020) documented that GDPR enforcement in the EU produced changes in data practices of firms operating globally, including in the United States, because the compliance cost of maintaining separate data architectures for EU and non-EU users exceeded the cost of implementing EU standards everywhere. GDPR's regulatory reach, in other words, extends beyond the EU through market mechanisms rather than legal mechanisms.

The children's data provisions. GDPR Article 8 sets 16 as the default age of digital consent for data processing, with member states permitted to lower this to 13 (which several have done). The Instagram enforcement action in 2022 produced a genuine change: Instagram changed its default account privacy settings for users under 16 from public to private. This is a documented design change produced by GDPR enforcement. It is narrow — it addresses visibility defaults rather than engagement architecture — but it is real.
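The member-state derogation makes Article 8 a per-country threshold check rather than a single rule, which any platform serving the EU must implement. The mapping below is an illustrative subset under that assumption; the ages shown for these countries reflect widely reported implementations, but national choices vary and can change, so a real system would need an authoritative, maintained source.

```python
# Illustrative subset of member-state digital-consent ages (GDPR Article 8).
# National implementations vary and change; verify against current law.
DIGITAL_CONSENT_AGE = {
    "DE": 16,  # Germany kept the GDPR default
    "FR": 15,
    "IE": 16,
    "BE": 13,
}
DEFAULT_AGE = 16  # Article 8(1) default where no derogation applies

def parental_consent_required(age: int, country: str) -> bool:
    """True if Article 8 requires parental authorization for this user."""
    threshold = DIGITAL_CONSENT_AGE.get(country, DEFAULT_AGE)
    return age < threshold
```

The same 14-year-old therefore needs parental authorization in Germany but not in Belgium — a fragmentation that complicates both compliance and enforcement.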

The data breach notification requirement. GDPR Article 33 requires that controllers notify supervisory authorities of personal data breaches within 72 hours of becoming aware. This has produced a significant increase in disclosed data breaches and created accountability for security practices that was previously absent. The security improvement effect of mandatory breach notification is well-documented in prior literature on US state data breach laws.
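The 72-hour window is one of the few GDPR obligations with a fully mechanical definition, which is part of why it has been enforceable. A minimal sketch of the deadline arithmetic, noting the detail that matters in practice: the clock starts at awareness of the breach, not at its occurrence.

```python
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)  # GDPR Article 33(1)

def notification_deadline(awareness_time: datetime) -> datetime:
    """Deadline for notifying the supervisory authority of a breach.

    The 72-hour clock runs from when the controller becomes *aware*
    of the breach, not from when the breach occurred.
    """
    return awareness_time + NOTIFICATION_WINDOW

# Hypothetical incident timeline
aware = datetime(2026, 3, 2, 9, 30, tzinfo=timezone.utc)
deadline = notification_deadline(aware)  # 2026-03-05 09:30 UTC
```

Article 33(1) also requires that a notification made after the deadline be accompanied by reasons for the delay, so the deadline functions as a bright line rather than a soft target.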

The right to explanation for automated decisions. GDPR Article 22 gives data subjects the right not to be subject to solely automated decisions that produce significant effects, with a right to explanation. This provision has the most direct bearing on algorithmic systems — and has been the subject of significant legal interpretation as to its scope. The Article 22 right has not been successfully applied to algorithmic content recommendation systems at scale, but it represents a legal basis for such application that remains contested and potentially significant.


Section VII

What the Record Demands

The GDPR record — eight years of the most comprehensive privacy regulation yet enacted — produces a specific set of lessons for the design of cognitive sovereignty regulation.

Consent is not the lever. Consent-based regulatory frameworks are vulnerable to the Consent Interface because consent is a formal legal category that can be satisfied by interface design without requiring genuine decision-making. A regulatory framework for attention capture should not rely primarily on consent as its operative mechanism. Design standards — specific prohibitions and requirements on how platforms are built — do not require user consent to function and cannot be gamed by consent banner optimization.

Enforcement geography matters. The Irish DPC concentration problem is a GDPR-specific artifact of the one-stop-shop mechanism applied to platforms that deliberately headquartered in the most permissive available jurisdiction. Any future framework must include enforcement mechanisms that either prevent regulatory jurisdiction shopping or ensure that the lead regulator operates under performance standards that prevent extended inaction.

Rights require implementation infrastructure. The right of access, erasure, and portability have not produced the data control GDPR's drafters envisioned because the implementation of those rights was left to the regulated entities. A framework that specifies the format, comprehensibility, and accessibility standards for rights exercises — not just the existence of the right — would close the gap between nominal and functional data control.

The design target matters more than the fine amount. The €1.2 billion Meta fine changed nothing about how Meta's platforms work. The €150 million Google fine added a "Reject" button to a consent banner — a small but genuine change. The ratio of enforcement effort to behavioral change is dramatically better for enforcement actions that require specific design modifications than for actions that impose financial penalties. Future frameworks should calibrate enforcement remedies to require design changes, not merely financial penalties that platforms can absorb as a cost of doing business.


Sources

Selected Evidence Base

  • Utz, C. et al. (2019). "(Un)informed Consent: Studying GDPR Consent Notices in the Field." Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security. — Dark patterns in cookie banners; acceptance rate manipulation
  • Soe, T.H. et al. (2020). "Circumvention by Design — Dark Patterns in Cookie Consent for Online News Media." Proceedings of the 14th International Conference on Theory and Practice of Electronic Governance. — 93% of EU websites deploy at least one dark pattern
  • Aridor, G., Che, Y.K., & Salz, T. (2020). "The Economic Consequences of Data Privacy Regulation: Empirical Evidence from GDPR." NBER Working Paper 26900. — GDPR's global market effect; spillover to non-EU operations
  • Goldberg, S., Johnson, G., & Shriver, S. (2023). "Regulating Privacy Online: An Economic Evaluation of the GDPR." American Economic Journal: Economic Policy, 15(1), 245–282.
  • Irish Data Protection Commission (2023). Decision re. Meta Platforms Ireland Ltd. (WhatsApp). DPC Inquiry Reference IN-18-12-2.
  • Luxembourg CNPD (2021). Decision against Amazon Europe Core S.à r.l. €746M fine. Deliberation 2021-219-AMA.
  • Irish Data Protection Commission (2022). Decision re. Instagram. DPC Inquiry Reference IN-20-9-1. €405M fine; minors' data defaults.
  • CNIL (French DPA) (2022). Sanction Decision against Google LLC and Google Ireland Ltd. €150M fine; cookie rejection mechanism.
  • Hartzog, W. (2018). Privacy's Blueprint: The Battle to Control the Design of New Technologies. Harvard University Press. — Design-based privacy regulation; consent theater critique
  • Case C-362/14, Maximillian Schrems v. Data Protection Commissioner (Schrems I), ECLI:EU:C:2015:650 (CJEU, October 6, 2015). — Invalidated Safe Harbor; Irish DPC enforcement dynamics
  • Data & Society Research Institute (2023). Consent, Context, and the GDPR: A Four-Year Review. — Consent interface implementation analysis
  • European Data Protection Board (2023). Guidelines 05/2020 on Consent under Regulation 2016/679. Version 1.1. — Freely given consent; bundling prohibition; withdrawal standards
  • Case C-311/18, Data Protection Commissioner v. Facebook Ireland Limited and Maximillian Schrems (Schrems II), ECLI:EU:C:2020:559 (CJEU, July 16, 2020). — Invalidation of EU-US Privacy Shield; jurisdictional arbitrage through Irish DPC documented in litigation history.
  • Belgian Data Protection Authority, Decision No. 21/2022 (Feb. 2, 2022). IAB Europe Transparency and Consent Framework. — Industry consent mechanism found GDPR non-compliant; IAB Europe found co-controller.
  • Case C-252/21, Meta Platforms Inc. v. Bundeskartellamt, ECLI:EU:C:2023:537 (CJEU, July 4, 2023). — Legitimate interests cannot be used as legal basis for behavioral advertising; consent cannot be bundled as platform access condition.
How to Cite

The Institute for Cognitive Sovereignty. (2026). The GDPR and What It Actually Changed [ICS-2026-LA-002]. The Institute for Cognitive Sovereignty. https://cognitivesovereignty.institute/legal-architecture/the-gdpr-and-what-it-actually-changed

References

Internal: This paper is part of The Legal Architecture (LA series), Saga V. It draws on and contributes to the argument documented across 20 papers in 5 series.
