ICS-2026-SG-001 · The Instagram Files · Saga IX

What the Internal Research Showed

Facebook’s own researchers documented the harm. The organization was designed to ensure that documentation did not become a product decision.

Named condition: The Research Suppression Event · Saga IX · 16 min read · Open Access · CC BY-SA 4.0
1 in 3 · teen girls for whom Instagram made body image issues worse — per Facebook's own internal research
Oct 2021 · Frances Haugen publicly identifies herself and testifies before the Senate
6 · internal research presentations documenting adolescent harm that were routed to legal rather than product teams

The Frances Haugen Disclosure

On September 13, 2021, the Wall Street Journal began publishing a series of investigative reports under the title "The Facebook Files." The source was Frances Haugen, a product manager who had worked at Facebook from 2019 to 2021 on the company's Civic Integrity team. Before leaving the company, Haugen copied tens of thousands of pages of internal documents — research reports, presentation decks, internal communications, and policy deliberations — and provided them to the Journal, to the Securities and Exchange Commission, and subsequently to Congress.

The disclosure was not a leak of trade secrets or proprietary algorithms. It was a disclosure of institutional knowledge. The documents showed what Facebook's own researchers had found about the effects of the company's products on its users, what internal discussions had taken place about those findings, and what organizational decisions had been made about whether and how to act on them. The most consequential subset of the disclosed materials concerned Instagram's effects on adolescent users — specifically adolescent girls.

Haugen testified before the Senate Commerce Subcommittee on Consumer Protection on October 5, 2021. Her testimony was specific. She did not claim that Facebook was unaware of its products' effects. She claimed the opposite: that Facebook possessed detailed internal research documenting those effects and that the organizational architecture was designed to prevent that research from producing product modifications that would reduce the documented harm but also reduce engagement metrics. The distinction is structural. The problem Haugen identified was not ignorance. It was organizational design.

The disclosure produced a specific evidentiary record. It moved the analysis of platform harm from the domain of external observation — epidemiological correlations, survey data, academic studies conducted without access to platform data — into the domain of institutional documentation. What the company knew, when it knew it, and what organizational processes determined the institutional response to that knowledge became matters of documented record rather than inference.

What the Internal Research Found

The internal research was conducted by Facebook's own research teams using the company's proprietary data and direct access to platform users. The methodologies included internal surveys of teen users, focus groups, data analysis of usage patterns, and cross-referencing of behavioral metrics with self-reported wellbeing measures. The research was not preliminary. Multiple studies were conducted over a period of years, with findings that were internally replicated across research teams and presentation cycles.

The headline finding, reproduced in internal slide decks disclosed by Haugen, was stated with a directness that internal research documents typically possess and public communications typically lack: "We make body image issues worse for one in three teen girls." This was not an external critic's characterization. It was the company's own summary of its own research, written on its own slides, presented in its own internal meetings. The finding was specific to Instagram — not Facebook's broader platform — and specific to adolescent girls as a demographic subgroup.

The research identified social comparison as the core mechanism. Teen users reported that Instagram made them feel worse about themselves because the platform presented a continuous stream of images depicting other people's appearances, bodies, and lifestyles in curated, filtered, and idealized forms. The comparison between one's own unfiltered reality and others' curated presentation produced measurable declines in self-reported body satisfaction, mood, and self-worth. The mechanism was not incidental to Instagram's design. The visual feed — populated by images ranked for engagement — was the product. The comparison it produced was the experience the product delivered.

Beyond body image, the internal research documented that teens attributed increases in anxiety and depression to their Instagram usage. Internal presentation materials included data showing that a meaningful percentage of teen users who reported experiencing anxiety or depression identified Instagram as a contributing factor. The research noted that these effects were not uniformly distributed: they were concentrated among users who were already vulnerable to social comparison, body image concerns, and peer evaluation — which is to say, among the users the platform's engagement architecture most intensively engaged.

The methodological competence of the research is relevant to the institutional analysis. These were not amateur surveys or poorly designed studies. Facebook employed trained researchers — data scientists, social psychologists, and survey methodologists — who produced work that met internal standards for rigor. The company did not fail to study the question. It studied the question, reached documented conclusions, and then made organizational decisions about what those conclusions would produce.

The Organizational Routing

The research findings were presented internally. This is documented. The presentation decks exist. They were shared in internal meetings. They reached senior leadership. The question is not whether the findings were communicated within the organization. The question is where they were routed — and where they were not.

The disclosed documents indicate that research findings documenting harm to adolescent users were routed to legal and executive review functions. They were reviewed by teams whose institutional role was to assess the company's liability exposure, manage its regulatory risk, and prepare its public communications strategy. This routing is itself a decision with structural consequences. Legal review evaluates research findings through the lens of what they mean for the company's legal position. Executive review evaluates them through the lens of what they mean for the company's strategic positioning. Neither function has the authority or the institutional mandate to modify the product.

Product teams — the teams with design authority over Instagram's feed algorithm, content recommendation systems, engagement metrics, and user interface — were not the primary recipients of the research routing. The findings did not flow into the product development pipeline as requirements, constraints, or design specifications. They did not produce engineering tickets, design sprints, or A/B tests aimed at reducing the documented harm. The organizational architecture separated the knowledge function (research) from the action function (product design) by routing the output of the first through intermediary functions (legal, executive) whose institutional incentives were not aligned with product modification.

This is the Research Suppression Event. It is not a conspiracy. It is not a single decision made by a single executive in a single meeting. It is an organizational architecture — a routing table — that determines what institutional knowledge produces. Research routed to legal produces liability assessments. Research routed to communications produces messaging strategies. Research routed to product teams produces design modifications. The routing decision is the suppression. The research was conducted. It was competent. It reached its conclusions. The organizational design ensured that those conclusions entered the liability management pipeline rather than the product modification pipeline.

The distinction between suppression by concealment and suppression by routing is the structural point. The research was not hidden in a vault. It was not burned. It was not classified. It was presented, discussed, and reviewed. It was simply routed to organizational functions that would process it as a risk management input rather than a product design input. The institutional knowledge existed. The organizational architecture determined that the institutional response would be protective rather than remedial.

The Public Denial

While the internal research documented the harm in the company's own terms, the company's public communications operated on a different register. In congressional testimony, public statements, and media responses prior to the Haugen disclosure, Facebook's leadership characterized the relationship between Instagram and adolescent mental health as unestablished, complex, and subject to ongoing research. The company's public position was that the evidence did not support a causal connection between Instagram usage and negative mental health outcomes in teens.

The gap between the internal research record and the external communications record is documented in the disclosed materials. Internal slide decks stating "We make body image issues worse for one in three teen girls" coexisted with public statements asserting that the research on social media and teen mental health was inconclusive. The internal research identified specific mechanisms (social comparison, appearance-based evaluation, algorithmic amplification of idealized content) through which the platform produced harm. The external communications treated the question as open, unresolved, and methodologically contested.

This gap is not, by itself, unusual in the history of institutional knowledge of product harm. It is the documented pattern. The tobacco industry's internal research established the carcinogenicity of cigarettes while its public communications maintained that the science was uncertain. The opioid industry's internal data documented addiction rates while its public communications maintained that the risk of addiction from properly prescribed opioids was low. The gap between institutional knowledge and institutional communication is a structural feature of organizations whose revenue depends on products that produce documented harm. The communications strategy is not separate from the product strategy. It is the organizational mechanism that maintains the market conditions under which the product continues to operate.

The Haugen disclosure collapsed this gap. The internal documents became public. The divergence between what the company knew and what the company said became a matter of documentary comparison rather than inference. The public communications record could be placed alongside the internal research record, and the distance between them could be measured in the company's own words.

Standard Objection

"Facebook's internal research was preliminary and the company disputes its interpretation. Internal research documents are working papers, not published findings." — The objection is structurally identical to the tobacco industry's response to its own internal research documents. The question is not whether internal working papers meet the standard of peer-reviewed publication. The question is whether the company possessed institutional knowledge of product harm and whether the organizational architecture was designed to prevent that knowledge from producing product modification. The disclosed documents answer both questions. The research was conducted by trained researchers using proprietary data. The findings were presented to senior leadership. The organizational response was to route the findings to legal rather than to product. The "working paper" characterization does not alter the routing decision — it is part of the communications strategy that the routing decision was designed to support.

The Structural Parallel

The organizational architecture documented in the Haugen disclosure is not novel. It is the same architecture documented in the tobacco industry's internal records (Saga VII), in the opioid industry's marketing and research operations, and in the lead industry's institutional response to toxicity data. The pattern is consistent across industries and across decades: internal research documents product harm; the research is routed to legal and executive functions; public communications maintain that the evidence is inconclusive; the product continues to operate without the modifications the research would indicate.

The tobacco parallel is the most precisely documented. Beginning in the 1950s, tobacco industry scientists conducted internal research that established the carcinogenicity of cigarette smoke. This research was routed to legal departments. It was reviewed by outside counsel. It was assessed for its implications for the companies' litigation exposure. It was not routed to product development teams with the mandate to reduce the carcinogenic properties of cigarettes. The organizational architecture ensured that institutional knowledge of harm produced institutional strategies for managing the consequences of harm rather than institutional strategies for reducing harm.

The Facebook case replicates this architecture in the platform era. The product is different — a visual social media feed rather than a combustible nicotine delivery system. The harm mechanism is different — social comparison and psychological distress rather than carcinogenesis. The organizational architecture is the same. Internal research documents the harm. The routing decision sends the findings to legal and communications rather than to product design. The public position maintains that the evidence is contested. The product continues to operate in its current form.

This is the EPD pattern (Engineered Product Denial) described in the institutional capture analysis (Saga VI), expressed in its platform-era form. The EPD pattern does not require that individual actors within the organization intend to cause harm. It requires only that the organizational architecture route institutional knowledge through functions whose incentives are aligned with product preservation rather than product modification. The routing table is the mechanism. The individual decisions that populate the routing table are made by people acting within the incentive structure the organization provides. The architecture produces the outcome. The outcome is structural.

What the Disclosure Changed and What It Didn’t

The Haugen disclosure produced immediate and measurable institutional responses. Congressional hearings were convened. The Senate Commerce Subcommittee held multiple sessions of testimony. Haugen testified publicly. Internal documents were entered into the congressional record. Legislative proposals — including versions of the Kids Online Safety Act — were advanced. Media coverage was extensive and sustained. The disclosure became a reference point in policy discussions about platform regulation, adolescent safety, and corporate accountability.

The disclosure also produced a corporate response. Facebook rebranded as Meta in October 2021 — a decision that had been in preparation before the disclosure but whose timing coincided with the peak of public attention to the leaked documents. The company announced initiatives related to teen safety on Instagram, including optional features allowing teens to take breaks from the platform and providing parental oversight tools. The company paused the development of "Instagram Kids," a version of the platform designed for children under 13 that had been in development at the time of the disclosure.

What the disclosure did not produce was a structural modification to Instagram's core architecture. The engagement-optimized feed remained engagement-optimized. The algorithmic ranking system continued to surface content based on predicted engagement rather than predicted welfare impact. The visual architecture of the platform — the image-centric feed that constitutes the Comparison Engine analyzed in SG-002 — remained unchanged in its fundamental design. Like counts, while made optionally hideable, were not removed. The content recommendation algorithm, while subjected to incremental adjustments, was not redesigned around welfare metrics. The core product — the product the internal research had identified as the vector of harm — continued to operate on the same design principles it had operated on before the disclosure.

The legislative response, as of this writing, has not produced federal legislation specifically addressing the mechanisms the internal research identified. The Kids Online Safety Act has been introduced in multiple sessions of Congress. It has not been enacted. Proposed regulatory frameworks for algorithmic accountability, age-appropriate design codes, and platform duty-of-care standards have been debated. They have not been implemented at the federal level. Some state-level legislation has advanced. The overall regulatory architecture governing platform interactions with minors in the United States remains substantially similar to the architecture that existed before the disclosure.

The Research Suppression Event, in this sense, produced a public relations crisis but not a product modification. The internal research became public. The organizational routing became visible. The gap between institutional knowledge and public communication was documented. And the product that the research identified as the source of harm continued to operate on the design principles the research had measured and the organization had declined to modify. The disclosure changed the public's knowledge of what the company knew. It did not change the product that the company built.

This outcome is itself a structural finding. The Research Suppression Event is not only the routing of research from product teams to legal teams within the organization. It is the broader architectural condition in which even the public disclosure of that routing — even the full transparency of the internal record — does not produce the product modification the research indicated. The suppression is not merely organizational. It is institutional. The regulatory, legislative, and market conditions under which the platform operates are themselves designed — through lobbying, through the political economy of platform regulation, through the Section 230 architecture analyzed in the Political Economy series — to ensure that institutional knowledge of product harm does not translate into mandatory product modification.

Named Condition · ICS-2026-SG-001
The Research Suppression Event
"The organizational architecture in which internal research documenting product harm is routed through legal and executive review functions rather than to product teams with design authority — ensuring that institutional knowledge of harm is converted into legal liability management rather than product remediation. The Research Suppression Event is not a description of deception; it is a description of organizational design. The research was conducted. The conclusions were reached. The routing decision — to legal, not to product — is the event."

References

  1. Haugen, F. (2021, October 5). Testimony before the U.S. Senate Commerce Subcommittee on Consumer Protection, Product Safety, and Data Security.
  2. Wells, G., Horwitz, J., & Seetharaman, D. (2021, September 13–17). The Facebook Files. The Wall Street Journal. [Six-part investigative series based on the Haugen disclosure documents]
  3. Facebook Internal Research. (2019). "We make body image issues worse for one in three teen girls." [Internal presentation slide disclosed through Haugen documents; cited in WSJ Facebook Files Part 1]
  4. Haugen, F. (2021, October 25). Securities and Exchange Commission complaint filing. [Filed via legal counsel; alleges Facebook misled investors about platform risks]
  5. ICS-2026 Saga VII: The Archive. cognitivesovereignty.institute. [Tobacco industry parallel: Research Suppression Event pattern documented across industries]
Series Hub · SG
The Instagram Files
Series overview and all five papers in the Instagram Files.
Next · SG-002
Upward Social Comparison at Scale
The comparison engine: how Instagram's design produces systematic upward social comparison at industrial scale.