ICS-2026-ET-005 · The EdTech Capture · Saga IX

What Ethical EdTech Would Require

The Classroom Covenant is not aspirational. It is derivable from the documented harms, existing regulatory precedents, and the fiduciary obligation that schools owe to the children in their care.

Named condition: The Classroom Covenant · Saga IX · 17 min read · Open Access · CC BY-SA 4.0
5 core conditions of the Classroom Covenant — each derived from a documented harm
3 existing regulatory frameworks (HIPAA, GLBA, fiduciary duty) that achieve comparable protection in other domains
$0 additional per-student cost of implementing attention architecture impact assessments (the data already exists)

The Derivation

The Classroom Covenant is not a policy proposal assembled from first principles. It is a derivation. Each of its five conditions responds to a specific harm documented in the preceding papers of this series, and each is supported by an existing regulatory precedent in another domain where comparable fiduciary relationships require comparable protections. The derivation is the argument: the conditions are not aspirational because they are already achieved elsewhere, and they are not arbitrary because each one maps to a documented failure.

The mapping is as follows. Data minimization responds to the Data Collection Event documented in ET-002, where EdTech platforms collect behavioral data far exceeding what any educational function requires. The advertising and monetization prohibition responds to the data monetization pathways documented in ET-002, where student behavioral data enters commercial data markets through secondary sharing arrangements. The attention architecture impact assessment responds to the Learning Loss Metric documented in ET-003, where engagement-optimized educational technology produces cognitive development costs that the current evaluation framework does not measure. Teacher training and institutional capacity respond to the Trust Arbitrage documented in ET-001, where schools lack the institutional capacity to evaluate the products they admit into the classroom. The regulatory framework responds to the Educational Privacy Failure documented in ET-004, where FERPA and COPPA provide structurally inadequate protections for student data in the digital context.

The derivation establishes two things simultaneously. First, that each condition is necessary — removing any one of the five conditions leaves the corresponding documented harm unaddressed. Second, that each condition is achievable — the regulatory mechanism required to implement each condition already exists in another domain where comparable protections are provided to populations with comparable vulnerabilities. The Classroom Covenant does not require the invention of new regulatory instruments. It requires the application of existing regulatory instruments to a population that currently receives weaker protection than any other vulnerable population in the American legal system.

Condition 1: Data Minimization

Educational technology may collect only the data required for its stated educational function. The standard is functional necessity: each category of data collected must serve a specific, identified educational purpose, and the vendor must demonstrate that the educational function cannot be delivered without the data in question. Data that would not appear in a traditional educational record — behavioral telemetry, device interaction patterns, attention metrics, engagement scores, social graph data, environmental data captured through device sensors — may not be collected without specific, informed parental consent obtained directly from the parent, not delegated through the school's institutional authority.

The regulatory precedent is HIPAA's minimum necessary standard. Under HIPAA, covered entities must make reasonable efforts to limit the use and disclosure of protected health information to the minimum necessary to accomplish the intended purpose. A hospital does not share a patient's complete medical record with a billing department that needs only the diagnosis code and procedure date. The minimum necessary standard does not prevent healthcare delivery. It prevents the accumulation of data beyond what the specific function requires. Applied to EdTech, the minimum necessary standard would prohibit the collection of millisecond-level click telemetry by a platform whose educational function is to present reading assignments and record comprehension scores. The platform needs the comprehension scores. It does not need the click telemetry. The distinction between what the function requires and what the platform collects is the data minimization gap, and the gap is where the commercial value of student data is generated.

Data minimization also requires prohibiting third-party sharing of student data. When a student's behavioral data leaves the EdTech platform and enters the broader data ecosystem — through data brokerage, advertising partnerships, analytics sharing, or AI training datasets — the educational purpose that justified the collection no longer applies. The data has left the educational context entirely. No legitimate educational function is served by a third-party advertising network possessing a thirteen-year-old's attention pattern data from a Tuesday afternoon math lesson. The prohibition on third-party sharing is not a restriction on educational technology. It is a restriction on the use of educational technology as an access channel for commercial data extraction.

The implementation mechanism is straightforward. Procurement contracts between school districts and EdTech vendors must specify, at the data-field level, what categories of data the platform is authorized to collect. The vendor must certify that each authorized data category serves a specific educational function identified in the contract. Data categories not specified in the contract may not be collected. Audit provisions must allow independent verification of compliance. The contractual specificity required is no greater than what HIPAA business associate agreements already require in the healthcare context — a context in which thousands of technology vendors successfully operate under data minimization constraints without ceasing to function.
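The data-field-level contract check described above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical contract format and invented field names (`comprehension_score`, `click_telemetry_ms`, and so on); it is not a real procurement standard, only the shape of the audit logic.

```python
# Sketch of a data-minimization audit: compare the data categories a
# platform actually collects against the categories authorized, at the
# field level, in the district's procurement contract. The contract
# format and field names are hypothetical illustrations.

AUTHORIZED = {
    # data category -> educational function it serves (per the contract)
    "comprehension_score": "measure reading comprehension",
    "assignment_completion": "track assignment status",
}

def audit(observed_fields: set[str]) -> list[str]:
    """Return the data categories collected without contractual authorization."""
    return sorted(f for f in observed_fields if f not in AUTHORIZED)

# Every returned field is a data-minimization violation: it was collected,
# but it maps to no educational function identified in the contract.
violations = audit({"comprehension_score", "click_telemetry_ms", "device_motion"})
```

The point of the sketch is that the audit is mechanical once the contract specifies categories at the field level: anything observed but not enumerated is, by construction, a violation.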

Condition 2: Advertising and Monetization Prohibition

No behavioral advertising may target students under 18 using data collected through educational technology, under any circumstances, through any mechanism, regardless of the contractual designation of the vendor or the consent framework under which the data was collected. No student behavioral data collected through educational technology may be monetized through data brokerage, sold to third parties, used to train commercial AI models, or incorporated into predictive analytics products that serve non-educational purposes. The prohibition is absolute because the harm it addresses is categorical: the use of compulsory educational infrastructure as an access channel for the attention economy's data extraction apparatus.

The prohibition does not require novel legal reasoning. It requires the extension of an existing principle: that certain relationships are fiduciary in nature and that fiduciary relationships impose obligations on the party with power that limit what they may do with the information the relationship generates. A financial advisor may not use a client's financial data to trade against the client's interest. A physician may not sell a patient's medical data to a pharmaceutical company for marketing purposes. An attorney may not monetize the contents of privileged communications. In each case, the fiduciary relationship generates information, and the fiduciary obligation limits the uses to which that information may be put. The school-student relationship generates information — behavioral data, cognitive performance data, attention data, social data — and the fiduciary obligation that schools owe to students should limit the uses to which that information may be put in precisely the same way.

The advertising prohibition addresses the structural incentive that drives the data practices documented in ET-002. EdTech companies collect data in excess of educational necessity because that excess data has commercial value in the advertising ecosystem. Remove the commercial value — prohibit the monetization pathway — and the incentive to collect excess data collapses. Data minimization (Condition 1) restricts what can be collected. The advertising prohibition restricts why it would be collected. Together, the two conditions eliminate the economic logic that drives the current data regime. An EdTech company that cannot monetize student data through advertising or data brokerage has no financial reason to collect data beyond what its educational function requires. The conditions are complementary: each reinforces the other, and both are necessary because a data minimization requirement without a monetization prohibition leaves the incentive intact, while a monetization prohibition without data minimization leaves the collection apparatus in place.

Standard Objection

The conditions of the Classroom Covenant would eliminate most EdTech products from the market. Schools would lose access to tools that genuinely improve educational outcomes. The perfect is the enemy of the good.

If a product cannot function without collecting data that exceeds its educational purpose, targeting students with behavioral advertising, or producing attention architecture effects that the evaluation framework does not measure, the product's inability to meet the Covenant conditions is itself diagnostic. The Covenant does not require the elimination of EdTech — it requires the elimination of the specific practices documented as harmful. Products that genuinely improve educational outcomes without producing the documented harms will meet the Covenant. Products that cannot will not. The selection mechanism is the point. The current market includes products whose revenue model depends on data extraction rather than educational effectiveness. Those products would not survive the Covenant's conditions. Products whose revenue model depends on providing educational value sufficient to justify their cost to school districts — the products that would exist in any market where schools paid for educational outcomes rather than for access to student data — would survive. The Covenant does not restrict the EdTech market. It restructures the market's selection mechanism so that educational value, rather than data extraction capacity, determines which products survive.

Condition 3: Attention Architecture Impact Assessment

Before deployment in a school setting, educational technology must undergo a mandatory assessment of its effects on sustained attention, deep reading capacity, executive function development, and screen time displacement of other educational activities. The assessment framework must measure the cognitive development outcomes documented in ET-003 — not only engagement metrics and test scores, which the current evaluation framework already captures, but the Learning Loss Metric outcomes that the current framework systematically excludes: the capacity for sustained focus on a single task, the ability to engage in deep reading of extended texts, the development of executive function capacities including impulse control, working memory, and cognitive flexibility, and the displacement of educational activities (hands-on learning, collaborative discussion, unstructured play, physical activity) whose developmental value is documented but whose absence from the digital platform's data stream makes them invisible to the evaluation framework.

The assessment does not require new data. The data already exists. Educational technology platforms record, in granular detail, how students interact with their products — session duration, task-switching frequency, engagement patterns, completion rates, time-on-task distributions. The research literature on attention development, executive function, and screen time effects provides the interpretive framework for evaluating what those interaction patterns indicate about cognitive development outcomes. The cost of the assessment is not the cost of generating new data. It is the cost of analyzing existing data against developmental benchmarks that the current evaluation framework does not apply. The additional per-student cost is zero because the data is already collected. The cost is institutional: it requires school districts to evaluate EdTech products against criteria that the products' manufacturers have no incentive to highlight.
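The claim that the assessment requires only analysis of existing data can be made concrete. The sketch below derives two attention-relevant indicators (task-switch count and mean sustained-focus interval) from the kind of timestamped interaction events platforms already record. The event format and the benchmark threshold are illustrative assumptions, not values drawn from the research literature.

```python
# Sketch: deriving attention indicators from telemetry a platform
# already collects. Events are (timestamp_seconds, task_id) pairs,
# sorted by time. Format and benchmark are illustrative assumptions.

def sustained_focus_stats(events):
    """Return (task_switches, mean_focus_interval_seconds)."""
    switches, intervals = 0, []
    start = events[0][0]
    for (t0, task_a), (t1, task_b) in zip(events, events[1:]):
        if task_a != task_b:          # a task switch ends the focus interval
            switches += 1
            intervals.append(t1 - start)
            start = t1
    intervals.append(events[-1][0] - start)  # close the final interval
    return switches, sum(intervals) / len(intervals)

# A ten-minute session: reading, a brief quiz interruption, reading again.
session = [(0, "read"), (300, "read"), (310, "quiz"), (330, "read"), (600, "read")]
switches, mean_focus = sustained_focus_stats(session)

FOCUS_BENCHMARK_S = 240  # hypothetical developmental benchmark, for illustration
flagged = mean_focus < FOCUS_BENCHMARK_S
```

Nothing here generates new data: the indicators are arithmetic over events the platform logs for its own purposes. The institutional cost is choosing the benchmarks and requiring the comparison.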

The assessment framework should be structured as a pre-deployment requirement in the procurement process. Before a school district adopts an EdTech platform, the vendor must submit an attention architecture impact assessment that documents the platform's design choices and their expected effects on sustained attention and executive function development. The assessment must disclose the platform's use of variable reinforcement schedules, notification systems, gamification mechanics, autoplay features, infinite scroll implementations, and any other design elements that the research literature identifies as attention-fragmenting. The school district's procurement decision must include review of the assessment by personnel trained in developmental psychology and attention research — not only in pedagogical use of the technology, which is the current standard. The assessment does not prohibit specific design features. It requires their disclosure and evaluation, so that schools make deployment decisions with knowledge of the attention architecture effects that the current procurement process does not examine.

The regulatory precedent is the environmental impact assessment required under the National Environmental Policy Act (NEPA). Before a federal agency undertakes a major action with potential environmental effects, NEPA requires an environmental impact statement documenting the expected effects and alternatives considered. The requirement does not prohibit actions with environmental effects. It requires that the effects be documented, evaluated, and considered in the decision-making process. An attention architecture impact assessment applies the same principle: the effects must be documented and considered, so that the deployment decision is informed by the full scope of consequences rather than only the consequences the vendor's marketing materials choose to present.

Condition 4: Teacher Training and Institutional Capacity

The Trust Arbitrage documented in ET-001 succeeds because schools lack the institutional capacity to evaluate what they are admitting into the classroom. Teachers are trained in the pedagogical use of EdTech — how to integrate the platform into lesson plans, how to interpret the platform's analytics dashboard, how to manage a classroom in which students are using devices. Teachers are not trained in the attention architecture effects of the products they deploy, the scope of data collection those products conduct, or the behavioral modification mechanisms those products employ. The training gap is not an oversight. It reflects the structural interests of the EdTech industry, which benefits from institutional customers who evaluate products on pedagogical features and cost rather than on data practices and cognitive effects.

Condition 4 requires that teacher training programs — both pre-service preparation programs and in-service professional development — include substantive instruction in three domains that are currently absent or marginal in the standard curriculum. First, attention architecture: the design mechanisms through which digital products capture and sustain user attention, including variable reinforcement schedules, notification systems, social comparison features, gamification elements, and infinite content streams. Teachers who deploy these products must understand how the products are designed to function not only as educational tools but as attention-capture systems whose design reflects the economic incentives of the attention economy. Second, data collection and privacy: the scope and nature of the data that EdTech platforms collect, the contractual mechanisms through which that data is shared with third parties, and the difference between the data protections that FERPA and COPPA provide and the data protections that the school-student fiduciary relationship requires. Third, behavioral modification: the mechanisms through which educational technology shapes student behavior — not only the intended pedagogical behavior (completing assignments, demonstrating mastery) but also the unintended behavioral effects (preference for short-form content, resistance to sustained focus, dependence on external reward signals) that the research literature documents.

The institutional capacity requirement extends beyond individual teacher training to the procurement and governance systems through which school districts select and deploy EdTech products. School districts need personnel with the technical expertise to evaluate EdTech data practices, review privacy policies and terms of service, negotiate contractual data protections, and monitor vendor compliance. Currently, these functions are performed — to the extent they are performed at all — by general procurement staff or IT departments whose expertise is in network infrastructure and device management rather than in data privacy, behavioral architecture, or developmental psychology. The Trust Arbitrage exploits this capacity gap: the school serves as the trusted intermediary through which EdTech enters children's lives, but the school lacks the capacity to evaluate what it is intermediating. Closing the capacity gap does not require every school district to employ a full-time data privacy specialist. It requires that school districts have access to the expertise needed to evaluate the products they deploy — through regional consortia, state education department support services, or independent evaluation organizations — and that procurement decisions incorporate that expertise as a mandatory component rather than an optional supplement.

The regulatory precedent is the informed consent requirement in medical practice. A physician who prescribes a medication must understand the medication's effects, side effects, contraindications, and interactions — not only its intended therapeutic benefit. The standard of care requires that the prescribing physician possess sufficient knowledge to evaluate whether the medication is appropriate for the specific patient and to communicate the relevant risks and benefits. A teacher who deploys an EdTech product is, in a meaningful sense, prescribing a cognitive intervention for a developing brain. The current standard requires only that the teacher know how to use the product. The Covenant requires that the teacher understand what the product does — to attention, to data, to behavior — not only what the product is designed to teach.

Condition 5: Regulatory Framework and Enforcement

The Educational Privacy Failure documented in ET-004 is not a failure of will or resources. It is a failure of statutory design. FERPA cannot protect student data in the digital context because FERPA was not designed for the digital context. COPPA cannot protect student data in the educational context because COPPA's school exception delegates consent to the institution that deploys the technology. The combination of the two statutes produces a regulatory environment in which EdTech companies can collect comprehensive behavioral data on children through compulsory educational infrastructure without individual parental consent, without meaningful regulatory oversight, and without any enforcement mechanism that parents can invoke when their children's data is misused.

Condition 5 specifies the regulatory framework necessary to close the structural gap. The framework treats the school-student data relationship as a fiduciary relationship requiring protections comparable to those that HIPAA provides for healthcare data. The specific elements are drawn from existing regulatory models that have been implemented, tested, and sustained in other domains.

Individual right of action. Parents must have the legal right to bring private actions against institutions and vendors that violate student data protection requirements. The absence of a private right of action under FERPA — confirmed by the Supreme Court in Gonzaga University v. Doe — is the single most consequential structural defect in the current framework. Without a private right of action, enforcement depends entirely on the Family Policy Compliance Office, which cannot impose fines, cannot bring lawsuits, and has never withdrawn federal funding for a data privacy violation. A private right of action shifts the enforcement burden from an under-resourced federal office to the parents whose children's data is at stake. The mechanism is not novel: HIPAA provides for state attorney general enforcement; the Fair Credit Reporting Act provides for individual private actions; the Telephone Consumer Protection Act provides for statutory damages that incentivize compliance without requiring proof of quantifiable harm. Each of these frameworks treats the protected party's ability to enforce the law as essential to the law's protective function.

Meaningful enforcement authority with intermediate sanctions. A dedicated regulatory body — whether a new office, an expanded FPCO, or a designated division within the FTC — must have the authority to investigate student data practices, impose civil monetary penalties for violations, issue cease-and-desist orders, and require remediation. The sanction structure must include intermediate penalties proportional to the violation — not only the nuclear option of withdrawing federal education funding, but fines scaled to the severity and scope of the violation, injunctive relief to halt ongoing data practices, and mandatory disclosure requirements. The HHS Office for Civil Rights provides the model: tiered penalties from $100 to $50,000 per violation, with annual caps adjusted for willfulness, enforced by a dedicated office with investigatory authority and the technical expertise to evaluate the data practices of a technology industry.
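The intermediate-sanction structure described above is simple enough to state as arithmetic. The sketch below loosely mirrors HIPAA's tiered civil penalties: the per-violation amounts echo the $100 to $50,000 range cited in the text, while the culpability labels and the annual-cap value are illustrative assumptions, not a proposed statutory schedule.

```python
# Sketch of a tiered penalty schedule with an annual cap, loosely
# modeled on HIPAA's intermediate sanctions. Tier labels and the cap
# are illustrative assumptions.

TIERS = {  # culpability level -> per-violation penalty (USD)
    "unknowing": 100,
    "reasonable_cause": 1_000,
    "willful_corrected": 10_000,
    "willful_uncorrected": 50_000,
}
ANNUAL_CAP = 1_500_000  # illustrative annual cap per violation type

def penalty(level: str, violation_count: int) -> int:
    """Total fine: per-violation amount scaled by count, capped annually."""
    return min(TIERS[level] * violation_count, ANNUAL_CAP)

# 200 student records exposed through reasonably avoidable error:
fine_a = penalty("reasonable_cause", 200)
# Willful, uncorrected exposure of 5,000 records hits the annual cap:
fine_b = penalty("willful_uncorrected", 5000)
```

The design point is proportionality: the fine scales with both culpability and scope, so the regulator has options between a warning letter and the nuclear option of withdrawing federal funding.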

Purpose limitation and data minimization as legal requirements. The data minimization standard and advertising prohibition specified in Conditions 1 and 2 must be statutory requirements enforceable through the regulatory framework, not voluntary commitments that vendors can revise through unilateral modification of their terms of service. Statutory data minimization, purpose limitation, and a prohibition on secondary monetization transform the Covenant's conditions from contractual provisions that individual school districts may or may not negotiate into legal requirements that apply to every EdTech vendor operating in the educational market. The GLBA Safeguards Rule provides the precedent: financial institutions are legally required to implement comprehensive information security programs, and the requirement is enforced through regulatory examination and penalty authority. The same mechanism, applied to EdTech, would require vendors to demonstrate compliance with data minimization and purpose limitation standards as a condition of operating in the educational market.

Mandatory data breach notification with developmental impact assessment. When a breach of student data occurs, the regulatory framework must require notification not only to the institution and the regulatory body, but to the affected parents, with a specific assessment of the developmental and privacy implications of the exposed data categories. A breach of a thirteen-year-old's behavioral telemetry data — attention patterns, engagement characteristics, cognitive performance indicators, emotional response data — has implications for the child that a breach of an adult's email address does not. The notification framework must reflect the specific vulnerability of the affected population, with remediation requirements proportional to the nature of the data exposed and the developmental stage of the children affected.

The regulatory framework described here is not speculative. Every element — private right of action, intermediate enforcement sanctions, purpose limitation, data minimization, breach notification — exists in current law as applied to healthcare data, financial data, or consumer data. The framework does not require the invention of new regulatory instruments. It requires the application of existing instruments to the population that most needs their protection and currently least receives it. Children are compelled by law to attend educational institutions. Those institutions deploy technology that collects comprehensive behavioral data. The data receives less legal protection than an adult's financial transactions, less protection than a patient's blood pressure readings, less protection than a consumer's browsing history under the California Consumer Privacy Act. The Classroom Covenant does not propose a new standard. It proposes the standard that already exists for every other fiduciary relationship in which a vulnerable population's data is held by an institution with power over that population. The only question is why children in schools have been exempted from the protections that the legal system already provides to everyone else.

Named Condition · ICS-2026-ET-005
The Classroom Covenant
"The minimum conditions under which educational technology can be deployed in schools without producing the documented harms of the current regime: data minimization aligned with educational purpose only; prohibition on behavioral advertising targeting of students under 18; mandatory attention architecture impact assessment as a procurement condition; teacher training that includes attention effects alongside pedagogical use; and regulatory frameworks modeled on existing fiduciary protections (HIPAA, financial fiduciary duty) that treat the school-student relationship as requiring specific data and behavioral safeguards proportional to the vulnerability of the population being served. The Classroom Covenant is not a wish list — each condition is derivable from a documented harm and supported by an existing regulatory precedent in another domain where professional fiduciary relationships require comparable protection."
Previous · ET-004
The FERPA Gap
The Educational Privacy Failure: a 1974 law designed for paper records in filing cabinets does not protect children in the era of cloud-based behavioral analytics.
Next · I9-001
The Children — What We Owe the Youngest Cohort
The Saga IX synthesis: four series, twenty-one papers, one developmental obligation. What the complete record requires.

References

Internal: This paper is part of The EdTech Capture (ET series), Saga IX. It draws on and contributes to the argument documented across 22 papers in 5 series.

External references for this paper are in development. The Institute’s reference program is adding formal academic citations across the corpus. Priority papers (P0/P1) have complete references sections.