The Short Version

FERPA — the Family Educational Rights and Privacy Act — restricts who can access student education records and under what conditions. It does not automatically block AI tools from being used in schools, but it does require that any vendor handling student records have a signed data processing agreement establishing it as a "school official." That one requirement is the most common compliance gap districts miss with AI tools.

What FERPA Is

FERPA is a federal law enacted in 1974 that gives parents (and students over 18) rights over their education records, and restricts schools from sharing those records without consent. It applies to any school that receives federal funding — which is virtually every public school district in the country.

FERPA does not prohibit technology. It does not say districts can't use AI tools. What it requires is that when student education records are shared with a third party — including an AI vendor — that sharing happens under specific legal conditions.

Key Term
Education Records

Under FERPA, "education records" are records that are directly related to a student and maintained by a school or by a party acting on the school's behalf. This is broad by design. It includes grades and transcripts, enrollment information, disciplinary records, IEP and special education records, and — critically for AI tools — any student data that an AI system processes or stores on behalf of the school.

The Two FERPA Concepts That Matter Most for AI

Concept 1

The "School Official" Exception

FERPA allows schools to share student records with third parties — including vendors — without individual parental consent, as long as the vendor is acting as a "school official" with a "legitimate educational interest." This is the legal mechanism that makes edtech possible. But it requires a signed agreement establishing the vendor in that role.

Concept 2

The Data Processing Agreement

The signed agreement — a DPA — is what creates the school official relationship. It must specify what data the vendor can access, how it can be used, and how it will be protected. Without a signed DPA, there is no legal basis for sharing student records with a vendor, regardless of their privacy policy or public commitments.

These two concepts explain why "this vendor has a privacy policy" is not sufficient. A privacy policy is a unilateral statement. A DPA is a bilateral contract that creates legal obligations. FERPA requires the contract.

What FERPA Does Not Cover

Understanding FERPA's limits is as important as understanding its requirements. Districts sometimes over-apply FERPA in ways that create friction without adding protection, or under-apply it in ways that create real risk.

FERPA does not apply to de-identified data

Information from which all personally identifiable information has been removed is not subject to FERPA. Some AI vendors aggregate anonymized interaction data for product improvement. Whether specific data is truly de-identified — and whether the re-identification risk is low enough — is a factual question that your DPA should address.

FERPA does not replace teacher judgment

FERPA governs records, not conversations. A teacher discussing a student's progress with a colleague is not a FERPA violation. What creates FERPA exposure is when that conversation involves inputting specific student records — names combined with grades, IEP details, disciplinary history — into an AI tool that has not been established as a school official through a signed DPA.

FERPA does not certify compliance

No vendor can be "FERPA certified." The Department of Education does not certify vendors, does not audit their compliance, and does not issue certifications. Any vendor claiming FERPA certification is misrepresenting the law. What vendors can have is a signed DPA, participation in the Student Data Privacy Consortium (SDPC), and a documented commitment to the school official obligations — but those are contractual commitments, not government certifications.

How FERPA Applies to AI Tools Specifically

The proliferation of AI tools in schools creates FERPA exposure in ways that weren't contemplated when the law was written. The core question for any AI tool is: does it process or store student education records? If yes, it needs a signed DPA before student use.

Scenario | FERPA Risk | What's Required
Teacher uses AI to draft lesson plans (no student data) | Low | DPA recommended; no student data involved
Teacher pastes student names + grades into AI tool | High | Signed DPA required before this is permissible
Students interact directly with AI platform | High | Signed DPA required; COPPA review also needed for under-13
AI tool analyzes student performance data | High | Signed DPA required; verify data use restrictions
AI tool trained on student interaction data | Very High | Prohibited without explicit authorization in DPA
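The triage logic in the table above can be sketched as a small helper. This is an illustrative sketch only: the function name, boolean flags, and return strings are assumptions introduced here to mirror the table's rows, not part of any official FERPA guidance.

```python
# Hypothetical triage sketch mirroring the risk table above.
# Flags, labels, and function name are illustrative assumptions.

def ferpa_triage(handles_student_records: bool,
                 students_use_directly: bool,
                 under_13_users: bool,
                 trains_on_student_data: bool) -> tuple:
    """Return (risk_level, required_action) for a proposed AI-tool use."""
    if not handles_student_records:
        # Lesson planning, brainstorming, etc. with no student data
        return ("Low", "DPA recommended; no student data involved")
    if trains_on_student_data:
        # Training on student data needs explicit DPA authorization
        return ("Very High", "Prohibited without explicit authorization in DPA")
    if students_use_directly and under_13_users:
        # Direct student use under 13 triggers COPPA review as well
        return ("High", "Signed DPA required; COPPA review also needed for under-13")
    return ("High", "Signed DPA required before student data is shared")

# Example: elementary students interacting directly with an AI platform
risk, action = ferpa_triage(True, True, True, False)
print(risk, "-", action)
```

The key design point the table encodes: the presence of student education records, not the tool's category, is what escalates the requirement from "recommended" to "required."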

The FERPA Questions That Actually Matter

When evaluating an AI tool for FERPA compliance, most districts focus on whether a DPA exists. That's necessary but not sufficient. These are the questions that determine your actual exposure.

District Evaluation Checklist


Have you executed the DPA — or just seen that one exists? A DPA available on a vendor's website is not a DPA in effect for your district. It must be signed by an authorized district representative before any student data is shared.


Does the DPA prohibit the vendor from using student data for AI model training? This is the most commonly missing provision in AI tool agreements. It must be explicit — not implied by vague language about "educational purposes."


Does the DPA define the scope of "legitimate educational interest"? The school official exception requires that the vendor's access be limited to what's needed for the service. A DPA that grants broad data access without defining the purpose may not satisfy this requirement.


What are the data return and deletion obligations? When a contract ends, what happens to student data? The DPA must address this. Student records that persist at a vendor after contract termination remain your district's legal responsibility.


Does the DPA address subprocessors? AI tools often use third-party services — cloud providers, model APIs, analytics vendors. Any subprocessor that touches student data needs to be covered by the same obligations in the DPA.
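The five checklist questions above can be tracked as a simple review record. This is a minimal sketch under stated assumptions: the class, field names, and method are hypothetical conveniences for internal tracking, not a standard schema or a legal instrument.

```python
# Minimal sketch of a DPA review record covering the checklist above.
# Field names are illustrative assumptions, not a standard or legal schema.
from dataclasses import dataclass, fields

@dataclass
class DPAReview:
    executed_by_district: bool          # signed by an authorized district rep?
    prohibits_model_training: bool      # explicit no-training clause present?
    defines_educational_interest: bool  # scope of vendor data access defined?
    covers_return_and_deletion: bool    # data handling at contract end addressed?
    covers_subprocessors: bool          # cloud/model/analytics vendors bound too?

    def open_items(self) -> list:
        """Return the names of any checklist items still unresolved."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

review = DPAReview(True, False, True, True, False)
print(review.open_items())  # items needing follow-up before approval
```

A record like this makes the gap concrete: a DPA can exist and be signed while still leaving open items, such as a missing no-training clause, that determine the district's actual exposure.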

FERPA and COPPA: When Both Apply

FERPA and COPPA are separate laws with different requirements, and both can apply to the same tool. FERPA governs student education records at any age. COPPA imposes additional requirements specifically for children under 13. When you're deploying an AI tool with elementary students, you need to verify compliance with both.

The practical implication: a DPA that satisfies FERPA may not be sufficient for under-13 use if it doesn't also address COPPA's parental consent requirements and the new 2026 data use restrictions. Review both frameworks, or have your district's legal counsel review both.

New in 2026: Updated COPPA rules took effect April 22, 2026, adding stricter data use limitations and retention requirements for under-13 students. See our COPPA guide for the specific changes.

K12SafeList provides independent research, not legal advice. This article explains the FERPA framework as it applies to AI tools in K-12 schools — it does not constitute legal advice for any specific district situation. No edtech product can be "FERPA certified." Always verify current vendor documentation and consult your district's legal counsel before finalizing procurement decisions. Reflects FERPA as codified at 20 U.S.C. § 1232g and 34 CFR Part 99.