
Legal Considerations for AI in Education — FERPA, COPPA, and GDPR

EduGenius Team · 16 min read


Most school leaders who adopt AI tools are not intentionally ignoring privacy law — they simply don't know which laws apply, what those laws require, or where the line falls between legally safe and legally risky AI use. The legal landscape for AI in education involves at least three major federal/international frameworks (FERPA, COPPA, GDPR), a growing patchwork of state student privacy laws, and a collection of vendor agreements that most administrators sign without fully reading.

A 2024 Center for Democracy & Technology (CDT) survey found that 42% of school districts using AI tools had not executed Data Processing Agreements with their AI vendors, and 31% of administrators could not identify which federal law governs student data privacy. These gaps don't exist because administrators don't care — they exist because the legal framework is genuinely complex and the guidance is written for lawyers, not principals.

This guide translates privacy law into school leader language. It tells you what each law requires, when it applies to AI tools, what compliance looks like in practice, and where the most common legal mistakes occur.

Important disclaimer: This article provides general educational information about privacy law. It is not legal advice. Consult your district's legal counsel for guidance specific to your situation, jurisdiction, and planned AI implementations.


FERPA: The Foundation

What FERPA Covers

The Family Educational Rights and Privacy Act (1974) governs access to and disclosure of student education records in schools that receive federal funding — which is virtually all public schools and many private schools.

| FERPA Element | What It Means | How It Applies to AI |
|---|---|---|
| Education records | Records directly related to a student that are maintained by the school or an agent of the school | AI tools that store student work, grades, behavior data, or identifiable information create education records |
| Disclosure restrictions | Schools cannot disclose education records without parent consent (with specific exceptions) | Sharing student data with an AI vendor is a disclosure — it requires legal authorization |
| School official exception | Schools can share education records with "school officials" (including contractors) who have "legitimate educational interest" | AI vendors can qualify as school officials through a DPA — this is the primary legal mechanism for AI tool access |
| Directory information | Basic information (name, address, grade level, etc.) that schools can share with fewer restrictions | AI tools that access only directory information have a lower FERPA burden, but parents can opt out of directory disclosure |
| Parent rights | Parents can inspect records, request amendments, and control disclosure (rights transfer to students at 18) | Parents can ask which AI tools access their child's records — schools should be able to answer |

When AI Tools Trigger FERPA

Does the AI tool access or store student education records?

Education records include:
✓ Student names linked to academic information
✓ Grades, test scores, assessment results
✓ Student work samples (essays, assignments)
✓ Attendance records
✓ Disciplinary records
✓ IEP or 504 documentation
✓ Student-generated content in the AI tool
✓ AI-generated analysis of individual student data

NOT education records:
✗ Teacher-generated content with no student data
✗ De-identified aggregate data (no reasonable basis
  to identify individuals)
✗ Directory information (with proper notice)
✗ Teacher's personal notes not shared with others

IF the AI tool accesses education records:
→ FERPA applies
→ DPA required (school official exception)
→ Vendor must have legitimate educational interest
→ Vendor must be under school's direct control
  regarding data use
→ Data use limited to authorized purpose
→ Re-disclosure prohibited

IF the AI tool does NOT access education records:
→ FERPA does not apply to that specific use
→ Other laws may still apply (COPPA, state laws)
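The decision flow above can be sketched as a small triage helper. This is an illustrative sketch only — the names (`AIToolProfile`, `ferpa_triage`) and the two yes/no inputs are assumptions for this example, not a real compliance API, and the output is a starting point for counsel review, not a legal determination.

```python
# Hypothetical FERPA triage sketch mirroring the decision flow above.
# AIToolProfile and ferpa_triage are illustrative names, not a real API.
from dataclasses import dataclass


@dataclass
class AIToolProfile:
    name: str
    accesses_education_records: bool  # names + grades, work samples, IEPs, etc.
    dpa_executed: bool                # signed Data Processing Agreement on file


def ferpa_triage(tool: AIToolProfile) -> str:
    """Return a plain-language FERPA status for an AI tool (not legal advice)."""
    if not tool.accesses_education_records:
        return "FERPA not triggered for this use; check COPPA and state law separately."
    if not tool.dpa_executed:
        return "STOP: FERPA applies and no DPA is on file. Do not deploy."
    return "FERPA applies; DPA on file. Verify purpose limitation and re-disclosure terms."


result = ferpa_triage(
    AIToolProfile("EssayHelper", accesses_education_records=True, dpa_executed=False)
)
print(result)  # flags the missing DPA before deployment
```

A district tech office could run a profile like this for every tool in its inventory; the point is that the DPA question is asked before deployment, not after.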

A Data Processing Agreement is the legal instrument that makes AI vendor access to student records FERPA-compliant. Without a DPA, sharing student data with an AI vendor is a FERPA violation — even if the vendor has excellent security practices.

Essential DPA provisions for AI tools:

| Provision | What It Should Say | Red Flag If Missing |
|---|---|---|
| Purpose limitation | "Vendor processes data solely for the educational purpose authorized by the school" | Vendor can use data for any purpose |
| No secondary use | "Vendor shall not use student data for targeted advertising, marketing, or model training" | Student data may train AI models |
| Re-disclosure prohibition | "Vendor shall not disclose student data to third parties except as required by law" | Data may be shared with partners or affiliates |
| Data deletion | "Upon contract termination, vendor shall delete all student data within 30 days and provide certification of deletion" | Data may persist indefinitely after contract ends |
| Security obligations | "Vendor shall implement and maintain reasonable security measures" (specify encryption, access controls) | No security commitments |
| Breach notification | "Vendor shall notify school within 48 hours of any data breach" | No breach notification obligation |
| Audit rights | "School reserves the right to audit vendor's data practices" | No accountability mechanism |

COPPA: When Students Are Under 13

What COPPA Requires

The Children's Online Privacy Protection Act (1998, updated 2013) regulates the online collection of personal information from children under 13. Unlike FERPA, COPPA applies to the operator of the online service — the AI vendor — not the school directly. But schools get involved because they can consent on behalf of parents in educational contexts.

| COPPA Element | What It Means | AI Implication |
|---|---|---|
| Applies to children under 13 | If students under 13 interact with the AI tool, COPPA applies | Elementary and middle school AI tools almost always trigger COPPA |
| Verifiable parental consent | Vendors must obtain consent before collecting personal information from children | Schools can consent on behalf of parents for educational use — but only for educational use |
| School consent limitation | Schools can consent only for use "for the benefit of the school" — not for commercial purposes | AI vendors cannot use under-13 data collected under school consent for commercial purposes (advertising, model training, profiling) |
| Minimal collection | Vendors must collect only information "reasonably necessary" for the activity | AI tools should not require unnecessary personal information from young students |
| Parental access | Parents can review and request deletion of their child's information | Schools and vendors must be able to respond to parent requests |

COPPA in Practice for AI Tools

Is the AI tool used BY STUDENTS (not just by teachers)?
├─ NO → COPPA does not apply (teacher-facing tools
│         don't trigger COPPA because teachers are adults)
└─ YES → Continue

Are any students under 13?
├─ NO → COPPA does not apply
└─ YES → Continue

Does the tool collect personal information?
(Name, email, username, photo, voice, persistent
identifier, geolocation, or any combination that
could identify a child)
├─ NO → COPPA may not apply (rare — most tools
│         collect some identifier)
└─ YES → COPPA APPLIES

SCHOOL MUST:
1. Verify the vendor is COPPA-compliant
2. Provide consent on behalf of parents (for
   educational use only)
3. Ensure vendor uses data only for educational
   purpose — NOT for commercial purposes
4. Notify parents that the school has consented
5. Maintain ability to review and delete student
   data upon parent request
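The three questions in the tree above are conjunctive: COPPA is in scope only when all three answers are yes. A minimal sketch of that logic, with made-up parameter names:

```python
# Minimal sketch of the COPPA decision tree above.
# Parameter names are illustrative; answers come from your tool review.
def coppa_applies(student_facing: bool,
                  any_students_under_13: bool,
                  collects_personal_info: bool) -> bool:
    """COPPA is in scope only if all three branches answer yes."""
    return student_facing and any_students_under_13 and collects_personal_info


# Teacher-only lesson-planning tool: no child data collected, COPPA not triggered
teacher_tool = coppa_applies(False, True, True)
# Elementary-school chatbot collecting usernames: COPPA applies
student_chatbot = coppa_applies(True, True, True)
print(teacher_tool, student_chatbot)
```

Note that a "no" here ends the COPPA analysis only — FERPA and state law still need their own review.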

Practical tip for schools: If your AI tool is used only by teachers — for lesson planning, content generation, assessment creation — COPPA does not apply because no child's personal information is being collected. This is one reason that teacher-facing AI tools like EduGenius carry lower legal complexity than student-facing AI tools — the teacher interacts with the AI, and students interact with the teacher-reviewed output.


GDPR: International and EU-Connected Schools

When GDPR Applies to Schools

The General Data Protection Regulation applies to schools in two situations:

  1. Schools located in the EU/EEA: GDPR is the primary data protection framework
  2. Schools outside the EU that process data of EU residents: If your school enrolls students who are EU citizens or uses AI tools operated by EU-based companies

| GDPR Principle | What It Requires | School Application |
|---|---|---|
| Lawful basis | Must have a legal basis for data processing (consent, legitimate interest, legal obligation, etc.) | Schools typically rely on "legitimate interest" or "public interest" for educational data processing |
| Purpose limitation | Data collected for one purpose cannot be used for another | AI vendors cannot repurpose student data collected for education |
| Data minimization | Collect only what's necessary | AI tools should not require excessive personal data |
| Right to erasure | Data subjects can request data deletion | Parents/students can request deletion of data from AI tools |
| Data protection impact assessment (DPIA) | Required for "high-risk" processing — which AI often qualifies as | Schools using AI for profiling, automated decision-making, or large-scale data processing should conduct DPIAs |
| International transfers | Data transfers outside the EU require specific legal mechanisms | Using a US-based AI tool from an EU school requires Standard Contractual Clauses or an equivalent mechanism |

For U.S. schools: GDPR typically doesn't apply unless you enroll international students from the EU or use EU-based AI vendors. However, GDPR's principles — purpose limitation, data minimization, deletion rights — represent best practices that strengthen privacy regardless of legal obligation.


State Student Privacy Laws

The Patchwork Problem

Beyond FERPA and COPPA, approximately 40 U.S. states have enacted their own student data privacy laws, creating a complicated patchwork:

| State | Key Law | Notable Requirement Beyond FERPA |
|---|---|---|
| California | SOPIPA (2014) | Prohibits targeted advertising based on student data; prohibits selling student data; requires deletion upon request |
| New York | Education Law 2-d (2014, updated 2020) | Requires DPAs for all third-party vendors; parents' bill of rights; data inventory reporting |
| Illinois | SOPPA (amendments effective 2021) | Requires breach notification within 30 days; prohibits targeted advertising; requires data governance plan |
| Colorado | Student Data Transparency and Security Act (2016) | Requires publicly available data inventory; transparency portal |
| Connecticut | Student Privacy Act (2022) | Prohibits AI-generated profiles used for non-educational purposes; requires operator compliance |
| Texas | Student Privacy Act (2017, updated 2023) | Prohibits selling student data; requires annual security audit; operator certifications |

Action item: Check your state's specific student data privacy requirements. Many states participate in the Student Data Privacy Consortium (SDPC), which provides standardized DPA templates that address both FERPA and state-specific requirements. Using the SDPC DPA template for your state simplifies compliance.


Common Legal Risk Scenarios

| Risk Scenario | Legal Issue | Prevention |
|---|---|---|
| Teacher pastes student names/grades into ChatGPT | FERPA violation — disclosure of education records to an unauthorized party | Clear policy: no student PII in general-purpose AI tools without a DPA |
| AI tool uses student data to train models | FERPA (purpose limitation), COPPA (commercial use), state laws (selling/commercial use) | DPA provision prohibiting model training; verify in terms of service |
| AI tool makes automated decisions about students (grading, placement, discipline) | Due process concerns; potential IDEA/Section 504 violations for students with disabilities | All AI-assisted decisions reviewed by humans; never delegate protected decisions to AI |
| Student-facing AI collects voice data | COPPA (personal information from under-13 users); state biometric laws (Illinois BIPA, Texas CUBI) | Verify what data the AI collects; obtain appropriate consent; prefer text-only interaction |
| AI-generated reports shared with parents contain errors | Privacy (if the wrong student's data is included); liability (if recommendations rest on incorrect analysis) | Human review of all AI-generated parent-facing output before distribution |
| Data breach at AI vendor | FERPA breach notification; state breach notification laws; potential liability | DPA with breach notification clause; vendor security requirements; cyber insurance |

Compliance Checklist for School Leaders

AI TOOL LEGAL COMPLIANCE CHECKLIST

Before deploying any AI tool:

□ FERPA
  □ Determine if tool accesses education records
  □ If yes: execute DPA with vendor
  □ Verify DPA includes: purpose limitation, no
    model training, breach notification, deletion
    clause, re-disclosure prohibition
  □ Document legitimate educational interest

□ COPPA (if student-facing, students under 13)
  □ Verify vendor is COPPA-compliant
  □ Provide school consent for educational use
  □ Verify vendor does not use data commercially
  □ Notify parents about tool and school consent

□ STATE LAW
  □ Check state student privacy requirements
  □ Use state-specific DPA template if available
  □ Verify any state-specific prohibitions
    (targeted advertising, data selling, biometric
    data)
  □ Comply with state breach notification
    requirements

□ GENERAL
  □ Terms of Service reviewed (not just clicked
    through)
  □ Privacy policy reviewed — specifically for
    AI model training and data sharing
  □ Tool added to data governance inventory
  □ Staff trained on appropriate use
  □ Parent communication about tool deployment

Key Takeaways

  1. FERPA applies whenever an AI tool accesses student education records; a Data Processing Agreement under the school official exception is the legal mechanism that makes vendor access compliant.
  2. COPPA applies when students under 13 use an AI tool that collects personal information; schools can consent on parents' behalf, but only for educational use.
  3. GDPR directly binds EU-connected schools, but its principles (purpose limitation, data minimization, deletion rights) are best practice everywhere.
  4. State student privacy laws add requirements beyond FERPA; SDPC templates simplify state-specific DPA compliance.
  5. Keep humans in the loop: never delegate protected decisions to AI, and never share student identifiers with tools that lack a DPA.


Frequently Asked Questions

Do I need a DPA if teachers use ChatGPT only for their own lesson planning (no student data)?

If teachers genuinely never input student names, grades, work samples, or any personally identifiable information, FERPA is not triggered for that specific use. However, "never" is hard to enforce, and the line between "no student data" and "let me just ask about my struggling reader in 3rd period" is easily crossed. The safest approach is to either (1) execute a DPA with OpenAI/Anthropic/Google for enterprise educational use, or (2) issue a clear policy prohibiting any student identifiers in general-purpose AI prompts. Option 1 protects against accidental slips. Option 2 relies on teacher discipline. Most districts benefit from pursuing both.

Can parents opt their child out of AI completely?

FERPA gives parents the right to control disclosure of education records. If an AI tool accesses their child's records, parents can object to that specific disclosure — and the school generally must honor the objection. For AI tools used by teachers that don't access student records (teacher-facing content generation), there's no student record disclosure to opt out of, and the generated materials are functionally identical to any other instructional resource. Schools should be prepared to offer reasonable accommodations while explaining which AI uses involve student data and which don't — the distinction usually resolves most parent concerns.

What liability does the school have if an AI vendor has a data breach?

The school retains responsibility for the student data it shares, even when the breach occurs at the vendor. FERPA holds the school (the educational agency) accountable for protecting student records. This is why DPA provisions matter enormously: breach notification requirements, security standards, and liability allocation in the DPA determine how much of the practical burden falls on the vendor vs. the school. Cyber insurance for educational institutions increasingly covers third-party vendor breach costs — check whether your district's insurance policy covers this scenario. The best protection is choosing vendors with strong security practices, executing comprehensive DPAs, and minimizing the data shared to what's strictly necessary.

Is de-identified data safe to use with any AI tool?

FERPA does not cover data that has been properly de-identified — meaning that there is "no reasonable basis" to identify individual students from the data. But de-identification is harder than it sounds. Removing names and ID numbers is not sufficient if the combination of remaining data points (grade, gender, race, specific course, specific scores) could identify a student — which it often can in small schools or small classes. FERPA's de-identification standard (34 CFR § 99.31(b)) requires removing all personally identifiable information, including indirect identifiers, and making a reasonable determination that a student cannot be identified given other reasonably available information; the familiar "safe harbor" checklist of 18 specific identifiers comes from HIPAA, not FERPA. Aggregate data (school-level averages, grade-level distributions) is generally safe. Individual-level data with names removed may not be.

#FERPA-AI#COPPA-compliance#GDPR-education#AI-education-law#student-privacy