"The research showed the harm. The research was routed to legal. This is not a story about what a company didn't know. It is a story about what a company knew and how the organization was designed to prevent that knowledge from becoming action."
In September 2021, the Wall Street Journal began publishing internal Facebook research disclosed by Frances Haugen, who identified herself publicly the following month. The research showed that Instagram made body image issues worse for one in three teenage girls; that the platform was a source of anxiety, depression, and social comparison for adolescent users; and that research reaching these conclusions had been routed to legal and executive functions rather than to product teams with the authority to act on it.
The Instagram Files are not the beginning of this story; the developmental vulnerability documented in the Developmental Record (DN series) predates any platform. They are, however, the canonical instance of Platform Research Suppression: the organizational architecture in which institutional knowledge of harm does not translate into an institutional obligation to remediate it. Routing research through legal review rather than product review is itself a structural choice, one that insulates the revenue function from the welfare function.
This series documents the case in its specifics: the research findings, the organizational routing, the public denials, and the Foregone Remediation catalog. It then connects the case to the epidemiological record, which the case corroborates but does not, by itself, establish.