
Comparing AI Education Tool Privacy Policies — What Parents Should Know

EduGenius Team · 17 min read


Your child's school uses an AI tutoring platform. Your child inputs homework problems, personal notes, learning struggles, and study habits into this platform daily. Where does that data go? Who can see it? Is it used to train AI models that other companies access? Could your child's academic struggles become part of a dataset sold to advertisers, insurance companies, or future employers?

These aren't hypothetical concerns. A 2024 report from the Future of Privacy Forum found that 89% of K-12 schools use AI-enabled educational technology—but only 34% have reviewed the privacy policies of these tools. The gap between adoption and oversight means millions of students are using AI platforms whose data practices their parents and schools haven't examined.

This guide translates AI education tool privacy policies from legal language into practical understanding. We compare the data practices of major tools, explain the laws that are supposed to protect students, and provide a concrete checklist for parents evaluating whether their child's school AI tools are safe. For the broader AI tool landscape, see The Definitive Guide to AI Education Tools in 2026.


The Laws: What's Supposed to Protect Student Data

Key Privacy Laws for K-12

| Law | Protects | Requires | Enforced By |
| --- | --- | --- | --- |
| FERPA | Education records at schools receiving federal funding | Written consent for disclosure; right to review records | US Department of Education |
| COPPA | Children under 13 online | Verifiable parental consent for data collection | FTC (Federal Trade Commission) |
| State Laws | Varies by state (e.g., SOPIPA in CA) | Varies: many prohibit selling student data | State attorneys general |
| GDPR | EU/UK students (and global impact) | Explicit consent, data minimization, right to erasure | EU data protection authorities |

What FERPA Actually Requires

FERPA (Family Educational Rights and Privacy Act) protects "education records"—any records directly related to a student and maintained by an educational institution. When a school adopts an AI tool, that tool becomes a "school official" under FERPA if the school designates it as one in its data processing agreement. This designation allows the school to share student data with the tool without individual parent consent—but the tool must:

  1. Use the data only for the educational purpose specified
  2. Not disclose the data to unauthorized third parties
  3. Maintain data security standards
  4. Comply with data retention and deletion policies

The FERPA gap: FERPA was written in 1974—long before AI. The law doesn't specifically address AI model training, algorithmic profiling, or generative AI that could reproduce student information in outputs. Schools and vendors interpret FERPA's requirements for these scenarios inconsistently.

What COPPA Actually Requires

COPPA (Children's Online Privacy Protection Act) applies to commercial websites and apps directed at children under 13. It requires verifiable parental consent before collecting personal information from children. Schools can provide consent on behalf of parents for educational tools—but only when the data is used exclusively for educational purposes.

The COPPA gap: When a school consents to data collection on behalf of parents, parents may not know what tools are collecting data or what data is being collected. Schools are supposed to inform parents, but the 2024 Future of Privacy Forum report found that only 52% of districts actively notify parents about specific AI tools used in classrooms.


Privacy Comparison: Major AI Education Tools

Tier 1: Education-First Tools (Purpose-Built for Schools)

| Tool | FERPA Compliant | COPPA Compliant | Signs DPAs | Data Used for AI Training | Data Retention | Student Data Sold |
| --- | --- | --- | --- | --- | --- | --- |
| Khan Academy | Yes | Yes | Yes | No | Account lifetime | No |
| IXL | Yes | Yes | Yes | No | Account lifetime | No |
| MagicSchool | Yes | Yes | Yes | No | As specified in DPA | No |
| EduGenius | Yes | Yes | Available | No | User-controlled | No |
| Newsela | Yes | Yes | Yes | No | Account lifetime | No |

What "Tier 1" means: These tools were designed for educational use and have privacy practices built into their architecture. They generally:

  • Sign data processing agreements (DPAs) with school districts
  • Don't use student data to train AI models
  • Don't sell or share student data with advertisers
  • Provide data deletion upon request
  • Have designated privacy contacts for school administrators

EduGenius privacy approach: EduGenius processes user inputs to generate educational content but doesn't retain student work, personal information, or generated content for AI training purposes. Content generation happens in real-time, and users maintain control over their data. The platform doesn't require student accounts (teachers generate content on behalf of students), which minimizes student data collection entirely.

Tier 2: General AI Tools Used in Education

| Tool | FERPA Compliant | COPPA Compliant | Signs DPAs | Data Used for AI Training | Data Retention | Student Data Sold |
| --- | --- | --- | --- | --- | --- | --- |
| ChatGPT (OpenAI) | Enterprise/Edu: Yes; Free: No | Enterprise/Edu: Yes; Free: No | Enterprise/Edu: Yes | Free: Yes; Enterprise: No | 30 days (API); varies (consumer) | No |
| Google Gemini | Workspace Edu: Yes; Consumer: No | Workspace Edu: Yes; Consumer: No | Workspace Edu: Yes | Consumer: Yes; Workspace: No | Varies by product | No (but ad-related for consumer) |
| Claude (Anthropic) | Enterprise/Edu: Yes; Free: No | Enterprise: Yes; Free: No | Enterprise: Yes | Free: May; Enterprise: No | 90 days (API); varies (consumer) | No |
| Microsoft Copilot | M365 Edu: Yes; Consumer: No | M365 Edu: Yes; Consumer: No | M365 Edu: Yes | M365 Edu: No; Consumer: Varies | Per M365 policies | No |

The critical distinction: Each of these tools has two versions with radically different privacy postures:

  1. Consumer/free versions: Data may be used to train AI models, content filtering is less restrictive, FERPA/COPPA protections may not apply, and no DPA is available.
  2. Enterprise/education versions: Data is not used for training, FERPA/COPPA compliance is verified, DPAs are available, and content filtering is enhanced.

If a school directs students to use the free version of ChatGPT, Claude, or Gemini, it may be violating FERPA and COPPA. The educational protections only apply to the enterprise/education versions with signed agreements.

Tier 3: Tools with Privacy Concerns

| Tool | Concern | Risk Level |
| --- | --- | --- |
| Character.AI | Collects extensive conversation data; age verification challenges | High |
| Snapchat My AI | Social media integration; ad-targeted data | High |
| Grammarly (Free) | Retains all text input; AI training use | Medium |
| Quillbot (Free) | Data retention and training policies evolving | Medium |

Recommendation: Tools in Tier 3 should not be used for educational purposes without thorough privacy review by the school district's data privacy officer.


What Data Do AI Education Tools Actually Collect?

Data Types by Category

| Data Category | Examples | FERPA Protected? | Risk Level |
| --- | --- | --- | --- |
| Account data | Name, email, grade, school | Yes | Low (if properly secured) |
| Performance data | Scores, completion rates, time on task | Yes | Medium |
| Content data | Student writing, answers, uploaded work | Yes | High |
| Behavioral data | Click patterns, feature usage, session duration | Varies | Medium |
| Interaction data | AI chat logs, questions asked, prompts submitted | Yes (if identifiable) | High |
| Device data | IP address, device type, browser, OS | Varies | Low-Medium |

The highest-risk category: Interaction data—the questions students ask AI tools, the problems they input, the conversations they have with AI tutors. This data reveals academic struggles, personal concerns, learning disabilities, and interests in ways that traditional educational data does not. A student typing "I don't understand anything in math and I'm going to fail" into an AI tutor is generating emotionally sensitive data that goes well beyond test scores. See AI Tools for After-School and Enrichment Programs for privacy considerations in non-school settings where protections may be weaker.


The AI Training Question

Do Education AI Tools Use Student Data to Train Models?

This is the question that concerns parents most—and the answer varies by tool:

Tools that DO NOT use student data for AI training:

  • Khan Academy: Explicitly excludes user data from model training
  • IXL: No AI training use
  • MagicSchool: No AI training use
  • EduGenius: No AI training use; content generated on-demand without retention
  • Newsela: No AI training use

Tools where it depends on the version:

  • ChatGPT: Free/Plus versions use conversations for training (opt-out available); Enterprise/Team/Edu versions do not
  • Google Gemini: Consumer version data may improve Google products; Workspace for Education data is excluded
  • Claude: Consumer conversations may be used; Enterprise/API data is not
  • Grammarly: Free version text is used; Business/Enterprise versions provide different terms

What "used for AI training" means practically: If student data trains an AI model, that data becomes embedded in the model's parameters. It can't be individually deleted afterward—it's mathematically woven into the model's weights. This means the "right to erasure" (deleting your data upon request) is technically impossible once data has been used for training. The data exists in a form that can't be extracted or removed.
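A toy illustration of why trained-in data can't be deleted, using a made-up one-parameter "model" (nothing here reflects any real vendor's training pipeline):

```python
# Illustrative sketch: a single weight trained on two data points.
# After training, both points are blended into one number -- there is
# no separable copy of either point left to delete.
def train(points, steps=100, lr=0.1):
    w = 0.0
    for _ in range(steps):
        for x in points:
            w -= lr * (w - x)  # gradient step pulling w toward each point
    return w

w_all = train([3.0, 7.0])  # trained on two "students"
w_one = train([3.0])       # trained without the second student
# w_all differs from w_one, but w_all contains no extractable record of
# 7.0 that could be erased -- the only fix is retraining from scratch.
```

Real models have billions of weights instead of one, but the principle is the same: honoring a deletion request after training would require retraining the entire model without that data.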


Parent Checklist: Evaluating Your Child's School AI Tools

Use this checklist when your child's school introduces a new AI tool:

Essential Questions (Ask the School)

  • What is the tool? Get the specific name and version (free vs. paid matters enormously)
  • Is there a signed DPA (Data Processing Agreement)? If the school can't produce one, the tool may not have FERPA protections
  • What data does it collect? Request the specific data categories (see table above)
  • Is student data used for AI training? This is a yes/no question with major implications
  • What is the data retention period? How long is data stored, and is it deleted when the student leaves?
  • Can parents request data deletion? Both FERPA and many state laws provide this right
  • Was the tool reviewed by the district's privacy officer? If no privacy review was conducted, the tool shouldn't be deployed

Red Flags (Watch For)

  • Tool requires student email addresses for accounts (increases data footprint)
  • No DPA exists between the school/district and the vendor
  • Privacy policy mentions "improving our services" with user data (code for AI training)
  • Tool is a consumer product without an education-specific version
  • Data retention period is "indefinite" or "as long as the account exists"
  • No clear data deletion process documented

Green Flags (Good Signs)

  • District-signed DPA with specific data use limitations
  • Tool designed specifically for K-12 education
  • "No AI training with student data" explicitly stated in policy
  • Data retention limited to academic year or specific period
  • SOC 2 Type II or equivalent security certification
  • Student Privacy Pledge signatory (Future of Privacy Forum)

For detailed guidance on evaluating tool quality beyond privacy, see AI Content Generators That Export to Multiple Formats.
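For families or PTAs tracking several tools at once, the red and green flags above can be reduced to a simple scoring sketch. The flag names, weighting, and thresholds below are illustrative inventions, not part of FERPA, COPPA, or any official rubric:

```python
# Hypothetical rubric: count green flags minus red flags, with two
# automatic disqualifiers drawn from the checklist above.
GREEN_FLAGS = {"signed_dpa", "education_specific", "no_ai_training",
               "limited_retention", "soc2_certified", "privacy_pledge"}
RED_FLAGS = {"requires_student_email", "no_dpa", "trains_on_user_data",
             "consumer_only", "indefinite_retention", "no_deletion_process"}

def evaluate_tool(attributes):
    """Return (score, verdict) for the observed attributes of a tool."""
    score = sum(a in attributes for a in GREEN_FLAGS) \
          - sum(a in attributes for a in RED_FLAGS)
    # Missing DPA or training on user data overrides any point total.
    if "no_dpa" in attributes or "trains_on_user_data" in attributes:
        return score, "do not deploy without district privacy review"
    return score, "acceptable" if score >= 3 else "needs review"

print(evaluate_tool({"signed_dpa", "no_ai_training", "limited_retention"}))
```

The design point is the override: some flags (no DPA, training on student data) should disqualify a tool outright rather than merely lower its score.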


State-Level Privacy Protections

States with Strong Student Data Privacy Laws

| State | Key Law | Notable Provision |
| --- | --- | --- |
| California | SOPIPA (2014) + CCPA/CPRA | Prohibits using student data for advertising; extends adult privacy rights to minors |
| New York | Education Law 2-d (2020) | Requires parent notification of all third-party data sharing; DPA registry |
| Illinois | SOPPA (2021) | Requires data breach notification within 30 days; operator transparency |
| Colorado | Student Data Transparency Act | Annual data inventory; prohibits commercial use |
| Connecticut | Student Data Privacy Act | Prohibits targeted advertising; security requirement |

Parent action: Check your state's student data privacy law. Many states provide additional protections beyond FERPA—including mandatory parent notification, data breach requirements, and advertising prohibitions that FERPA doesn't address.

Resource: The Student Privacy Compass (studentprivacycompass.org) from the Future of Privacy Forum maintains a state-by-state guide to student data privacy legislation.


Pro Tips for Parents

  1. Ask "which version?" not "which tool?": The privacy difference between ChatGPT Free and ChatGPT for Education is enormous. When your school says "we use ChatGPT," ask specifically whether it's the enterprise/education version with a signed DPA or the free consumer version.

  2. Request your child's data annually: FERPA gives parents the right to inspect education records. This includes data held by AI tools when the school has designated them as school officials. Request a data export from the school annually to understand what's being collected and stored.

  3. Opt out when possible—but understand the trade-offs: FERPA allows parents to opt out of directory information sharing. Some AI tools allow usage opt-out. But opting out may mean your child can't use the tool at all—which can create educational disadvantage if the tool is integral to classroom instruction. When opting out, ensure the school provides an alternative that doesn't disadvantage your child academically.

  4. Teach your child data awareness: Students who understand that AI tools record their inputs are more cautious about what they share. Teach children: don't share personal information (address, phone number, family details) with AI tools. Don't input other students' information. If an AI tool asks for information that feels personal, ask a teacher first. See AI-Assisted Report Card and Progress Report Writing Tools for how student data flows into reporting systems.


What to Avoid

Pitfall 1: Assuming School-Approved Means Privacy-Reviewed

Many schools adopt AI tools without formal privacy review—especially when individual teachers adopt free tools for their classrooms. "School-approved" may mean an administrator said "sure, try it" without consulting the district's data privacy officer. Ask specifically whether a privacy review was conducted, not just whether the tool was "approved."

Pitfall 2: Ignoring Terms of Service Updates

AI companies update their terms of service frequently—sometimes changing data practices significantly. A tool that didn't use student data for training in 2024 might start doing so in 2025 via a terms of service update that nobody read. Schools should review AI tool privacy policies annually and parents should ask whether this review occurs. See How AI Is Transforming Daily Lesson Planning for K–9 Teachers for how teachers should manage these policy changes.

Pitfall 3: Conflating Privacy with Security

A tool can be secure (data encrypted, servers protected from breaches) but not private (data used for AI training, shared with third parties, retained indefinitely). Security protects data from unauthorized access; privacy controls how data is used by authorized parties. Both matter, but parents typically focus on security (will data be hacked?) while the more common risk is privacy (will data be used in ways I didn't consent to?).

Pitfall 4: Blocking All AI Tools

Some parents respond to privacy concerns by requesting their children be excluded from all AI tools. While understandable, this can create significant educational disadvantage as AI tools become more integrated into instruction. A more effective approach: require the school to use AI tools that meet specific privacy standards (signed DPA, no AI training, defined retention period) rather than prohibiting AI tool use entirely. See AI Tutoring Platforms for Students — Personalized Learning at Scale for the educational value these tools provide when privacy is properly managed.


Key Takeaways

  • 89% of K-12 schools use AI-enabled education technology, but only 34% have reviewed these tools' privacy policies (Future of Privacy Forum, 2024).
  • The version matters enormously: free/consumer AI tools have radically different privacy postures than enterprise/education versions of the same tools.
  • FERPA and COPPA provide foundational protections, but were written before AI and don't specifically address model training, algorithmic profiling, or generative AI data use.
  • Education-first tools (Khan Academy, IXL, MagicSchool, EduGenius) have the strongest privacy defaults — they don't use student data for AI training and support school data processing agreements.
  • Data used for AI training can't be individually deleted afterward — once embedded in model parameters, it's permanent. Ask whether tools use student data for training before allowing access.
  • Use the parent checklist to evaluate every AI tool your child's school introduces: signed DPA, specified data types, no AI training use, defined retention period, and deletion process.
  • Check state laws: many states provide protections beyond FERPA, including advertising prohibitions, breach notification requirements, and mandatory parent disclosure.
  • Teach children data awareness: students who understand AI tools record their inputs make better decisions about what information to share.

Frequently Asked Questions

Can I see what data AI tools have collected about my child?

Under FERPA, parents have the right to inspect education records. If the school has designated an AI tool as a "school official" through a DPA, the data held by that tool is considered an education record. Request a data export through the school (not directly from the vendor). Some tools (Khan Academy, IXL) also provide parent-accessible dashboards showing collected data.

What happens to my child's data when they leave the school?

This depends on the DPA. Good DPAs specify that student data must be deleted within 30-90 days of the school's request after a student leaves. Without a DPA, data retention is governed solely by the vendor's terms of service—which may allow indefinite retention. Ask your school what data deletion procedures are triggered when students graduate or transfer.

Should I be worried about AI tutoring conversations being stored?

Yes, if the tool is a consumer version without education privacy protections. AI tutoring conversations can reveal learning struggles, emotional state, family situations, and personal interests. If these conversations are stored indefinitely, used for AI training, or accessible to unauthorized parties, the risk is significant. Use only education-version AI tutoring tools with explicit conversation data policies.

How is this different from data collected by textbooks and worksheets?

Paper worksheets generate no data trail. Digital tools generate extensive behavioral data: how long students spend on each question, which answers they change, what time of day they work, how they interact with help features. This behavioral data paints a detailed portrait of learning patterns, work habits, and cognitive development that paper materials never captured. The volume and granularity of AI tool data is qualitatively different from traditional educational data collection.

