The Short Version
COPPA is a federal law restricting how online services collect and use personal information from children under 13. As of April 22, 2026, the FTC's updated COPPA Rule expanded what counts as personal information, tightened parental consent requirements, and added new data retention obligations. If your district uses any AI tool with students under 13, COPPA applies, and the new rules have raised the bar.
What COPPA Actually Covers
COPPA applies to operators of websites and online services that are either directed at children under 13 or have actual knowledge that they are collecting personal information from children under 13. Most AI tools used in schools fall into one or both categories.
What counts as personal information under the updated rule is broader than most people assume. It includes the obvious — names, email addresses, phone numbers — but also persistent identifiers like device IDs and IP addresses, photos and audio recordings, geolocation data, and information collected through AI interactions that could identify a specific child.
For school-deployed AI tools, this matters because student interactions with an AI — their questions, their writing, their responses — can constitute personal information if they contain or could be linked to identifying details.
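As a rough illustration of why free-text interactions fall in scope, consider scanning a transcript for obvious identifiers. This is a deliberately naive sketch: the `scan_for_identifiers` helper and its two regex patterns are hypothetical examples, and whether data could be linked to a specific child is ultimately a legal determination, not a pattern match.

```python
import re

# Deliberately naive illustration: real identifier detection needs far more
# than two regexes, and "linkable to a specific child" is a legal test,
# not a pattern match.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def scan_for_identifiers(transcript: str) -> dict:
    """Return any obvious identifiers found in an AI chat transcript."""
    hits = {}
    for label, pattern in PATTERNS.items():
        matches = pattern.findall(transcript)
        if matches:
            hits[label] = matches
    return hits

print(scan_for_identifiers(
    "I'm in Ms. Rivera's class, reach me at ava.k@example.com or 555-867-5309."
))
# -> {'email': ['ava.k@example.com'], 'phone': ['555-867-5309']}
```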
What Changed on April 22, 2026
The 2026 updates to the COPPA Rule (published April 22, 2025, with compliance required one year later, on April 22, 2026) made four changes that directly affect how schools evaluate AI tools.
Stronger limits on data use
Operators can no longer use personal information collected from children for purposes unrelated to the service the child is using. This directly constrains whether an AI vendor can use student interaction data to train or improve its models, even under vague consent language. Districts should verify that any AI tool's DPA explicitly prohibits using student data for model training.
Tighter consent mechanisms
The updated rule narrows what counts as valid parental consent and subjects school-authorization pathways to additional scrutiny. Districts relying on broad, general consent forms should verify those forms meet the current standard, particularly for under-13 students.
New data retention and deletion requirements
Operators must retain children's personal information only as long as necessary to fulfill the purpose for which it was collected, and must have a documented deletion process. For AI tools, student conversation logs and interaction histories cannot be held indefinitely. Your DPA should specify retention periods and what happens to data when a student leaves the district.
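To make the retention obligation operational, a district could periodically audit exported interaction logs against the retention period in its executed DPA. The sketch below is a minimal illustration under assumed conventions: the `RETENTION` window, the `flag_expired_logs` helper, and the record fields (`student_id`, `created_at`) are hypothetical and do not reflect any vendor's actual export format.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window. The real period must come from the
# executed DPA, not from this sketch.
RETENTION = timedelta(days=365)

def flag_expired_logs(logs, now):
    """Return log records held longer than the DPA's retention period.

    `logs` is an iterable of dicts with a `student_id` and an ISO-8601
    `created_at` timestamp -- an illustrative export format.
    """
    return [
        record for record in logs
        if now - datetime.fromisoformat(record["created_at"]) > RETENTION
    ]

sample = [
    {"student_id": "s-001", "created_at": "2024-09-01T14:00:00+00:00"},
    {"student_id": "s-002", "created_at": "2026-03-15T09:30:00+00:00"},
]
audit_date = datetime(2026, 4, 22, tzinfo=timezone.utc)
for record in flag_expired_logs(sample, audit_date):
    # Deletion itself must follow the documented process in the DPA.
    print(f"Past retention window, schedule deletion: {record['student_id']}")
# -> Past retention window, schedule deletion: s-001
```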
Push notification and advertising restrictions
The rule adds explicit restrictions on push notifications to children and tightens prohibitions on behavioral advertising. For most school-deployed AI tools this is lower risk, but it's worth confirming for any consumer-facing tool that teachers or students might access outside the school environment.
The School Consent Exception — And Its Limits
COPPA includes a school authorization pathway that allows schools to provide consent on behalf of parents for tools used for educational purposes. This is how most districts handle COPPA compliance for edtech tools — and it works, but it comes with conditions that are easy to misapply.
The school can provide consent only for the educational purpose. If a vendor uses student data for anything beyond what the school authorized — including improving their product — the school's consent doesn't cover it. The district needs to confirm in the executed DPA exactly what the vendor is and isn't doing with student data.
Four Things to Verify Before Approving Any AI Tool
For any AI tool used with students under 13, districts should verify these four things before approving deployment. A sketch of how the answers might be recorded as a machine-readable governance record follows the checklist.
District Pre-Approval Checklist
Does the DPA explicitly address COPPA, not just FERPA? Many DPAs are written primarily around FERPA and treat COPPA as an afterthought. The 2026 updates make that gap riskier.
Does the DPA prohibit use of student data for model training or product improvement? Vague language like "we may use data to improve our services" is not COPPA-compliant for under-13 students under the updated rule.
What are the data retention terms? The DPA should specify how long student data is retained, what triggers deletion, and what happens to conversation logs when a student or district offboards.
Is your school consent pathway documented? If relying on the school authorization exception rather than individual parental consent, that decision should be documented in your district's AI governance records — not just assumed.
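One way to satisfy the documentation point is to capture all four answers in a single record per tool, as sketched below. The `CoppaPreApprovalRecord` class and its field names are hypothetical; your district's governance system will have its own schema.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class CoppaPreApprovalRecord:
    """One governance-record entry per tool, mirroring the four-item checklist.

    Field names are illustrative; adapt them to however your district
    actually stores AI governance records.
    """
    tool_name: str
    dpa_addresses_coppa: bool         # COPPA named explicitly, not just FERPA
    dpa_prohibits_model_training: bool
    retention_terms_documented: bool  # retention period and deletion triggers in the DPA
    consent_pathway: str              # "school_authorization" or "parental_consent"
    reviewed_by: str
    review_date: str

record = CoppaPreApprovalRecord(
    tool_name="ExampleTutorAI",       # hypothetical tool
    dpa_addresses_coppa=True,
    dpa_prohibits_model_training=True,
    retention_terms_documented=True,
    consent_pathway="school_authorization",
    reviewed_by="District data privacy officer",
    review_date="2026-05-01",
)

# Any False answer should hold approval until the DPA is amended.
ready = all([
    record.dpa_addresses_coppa,
    record.dpa_prohibits_model_training,
    record.retention_terms_documented,
])
print(json.dumps(asdict(record), indent=2))
print("Approve" if ready else "Hold for DPA review")
```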
How This Applies to Specific AI Tools
Not all AI tools carry equal COPPA risk. The risk profile depends heavily on how the tool is deployed, what data it collects, and whether a DPA has been executed.
Purpose-built education tools (e.g., SchoolAI, Khanmigo)
These tools were designed with school compliance in mind and typically have DPAs available, SDPC agreements signed, and explicit commitments against using student data for training. COPPA risk is lower, but it is not zero. The key questions are whether your district has actually executed the DPA (not merely confirmed that one exists) and whether the retention terms match your district's obligations.
General-purpose AI tools (e.g., ChatGPT, Gemini)
Consumer-facing AI tools carry significantly higher COPPA risk with under-13 students. Most were not designed for the school consent pathway, their default data practices are built around consumer use, and their DPAs (where they exist) are newer and less tested in the K-12 context. Districts should apply additional scrutiny before deploying these with elementary students.
K12SafeList provides independent research, not legal advice. COPPA compliance determinations require review of your specific contracts, your district's specific student population, and the specific features you're enabling. This article explains the framework; your district's legal counsel should review the specifics before you finalize any tool approval. It reflects the COPPA Rule as amended effective April 22, 2026 (16 CFR Part 312).