
How to Conduct an AI Readiness Assessment for Your School

EduGenius Team · 16 min read


Schools that rush into AI adoption without understanding their starting point tend to fail in predictable ways: they buy tools their infrastructure can't support, roll out AI without policies that address inevitable academic integrity questions, train teachers who aren't interested while ignoring teachers who are ready, and allocate budget without knowing their actual ongoing costs. A 2024 ISTE report found that schools completing a structured readiness assessment before AI implementation were 3.2 times more likely to report sustained adoption after two years compared to schools that adopted tools opportunistically.

Readiness assessment isn't about determining whether your school is "ready enough" for AI — every school is ready for some level of AI adoption. The purpose is to understand where you are strong, where you have gaps, and what sequence of actions will produce the most progress with the least friction. A school with excellent infrastructure but no AI policy needs different first steps than a school with strong teacher interest but unreliable Wi-Fi.

This guide provides a five-domain readiness framework, a self-assessment rubric you can use immediately, interpretation guidance for your results, and a gap-closing action plan.


The Five Domains of AI Readiness

| Domain | What It Covers | Why It Matters |
|---|---|---|
| 1. Infrastructure | Internet bandwidth, Wi-Fi coverage, device availability, SIS/LMS capabilities | AI tools require reliable connectivity and adequate devices; infrastructure failures create frustration that kills adoption |
| 2. Policy | Acceptable use policy, data privacy framework, academic integrity guidelines, procurement procedures | AI without policy creates confusion, inequity, and legal risk; teachers won't commit to tools that might be prohibited next semester |
| 3. People | Staff AI literacy, teacher comfort level, IT support capacity, administrative knowledge | AI tools are only as effective as the people using them; skill gaps create dependence on a few champions rather than sustainable adoption |
| 4. Culture | Innovation norms, risk tolerance, collaboration practices, attitude toward change | Schools with fear-based cultures suppress AI experimentation; schools with trust-based cultures enable organic adoption |
| 5. Budget | Current edtech spending, available funding streams, sustainability planning, total cost of ownership awareness | AI adoption without budget clarity leads to subscription sprawl, funding cliffs, and eventual tool abandonment |

AI Readiness Self-Assessment Rubric

Score each item 1-4 based on your school's current state. Be honest — an inflated assessment produces a useless action plan.

Domain 1: Infrastructure

| Item | 1 — Not Ready | 2 — Developing | 3 — Ready | 4 — Advanced |
|---|---|---|---|---|
| Internet bandwidth | Frequent slowdowns; streaming/video unreliable | Adequate for basic use; struggles during peak demand | Reliable for all standard uses; supports simultaneous cloud tools | High-bandwidth; no performance issues even at full capacity |
| Wi-Fi coverage | Dead zones common; teachers report frequent disconnections | Coverage in most areas; some classrooms unreliable | Full building coverage; reliable in all learning spaces | Full coverage plus outdoor areas; documented heat maps; monitored |
| Student devices | Shared devices; less than 1:2 ratio | 1:2 or better; students share during some activities | 1:1 or near-1:1; devices reliably available | 1:1 with take-home; devices current; replacement cycle planned |
| LMS/SIS integration | No LMS or SIS, or systems don't integrate with third-party tools | LMS/SIS available but limited integration; manual data transfer common | LMS/SIS with SSO; some third-party integrations active | LMS/SIS with comprehensive API; SSO for all tools; automated data flows |

Domain 1 Score: _ / 16

Domain 2: Policy

| Item | 1 — Not Ready | 2 — Developing | 3 — Ready | 4 — Advanced |
|---|---|---|---|---|
| AI acceptable use policy | No policy exists; AI use is unaddressed | Informal guidance (verbal or email) but no formal policy | Written AUP covering staff and student AI use; board-approved | Comprehensive AUP with grade-level guidelines, review schedule, and incident protocol |
| Data privacy framework | No DPA process; privacy decisions made ad hoc | DPA template exists but not consistently applied | DPA required for all tools; FERPA compliance documented | DPA + state law compliance; privacy review board; vendor audit process |
| Academic integrity | Traditional honor code only; AI not addressed | General statement about AI and integrity; no specific guidelines | Assignment categories (prohibited/assisted/integrated); grade-level guidance | Comprehensive integrity framework with detection alternatives, response protocol, and teaching components |
| Procurement process | No formal process; individuals purchase tools independently | Centralized purchasing for major items; small purchases untracked | All edtech purchases require approval and privacy review | Formal approval workflow with duplicate check, privacy review, instructional review, and budget verification |

Domain 2 Score: _ / 16

Domain 3: People

| Item | 1 — Not Ready | 2 — Developing | 3 — Ready | 4 — Advanced |
|---|---|---|---|---|
| Staff AI literacy | Most staff have not used AI tools; limited understanding of capabilities/limitations | Some staff have explored AI independently; awareness is inconsistent | Majority of staff have used AI tools; understand basic capabilities and limitations | Staff proficient in AI use; can evaluate AI output quality; understand bias and ethics |
| Teacher comfort level | Widespread anxiety or resistance to AI | Mixed — some enthusiastic, many apprehensive | Most teachers willing to try AI with support; concerns are specific and addressable | Teachers actively seek AI applications; peer mentoring happening organically |
| IT support capacity | IT is reactive; no capacity for new tool support | IT can support basic tool deployment; limited training capacity | IT can manage tool deployment, SSO configuration, and basic troubleshooting support | IT proactively evaluates tools; provides teacher support; monitors usage; manages integrations |
| Administrative knowledge | Administration has limited AI understanding; delegates all technology decisions | Administration is aware of AI trends but not engaged in planning | Administration engaged in AI planning; understands policy and budget implications | Administration leads AI strategy; can articulate vision, evaluate impact, and communicate with community |

Domain 3 Score: _ / 16

Domain 4: Culture

| Item | 1 — Not Ready | 2 — Developing | 3 — Ready | 4 — Advanced |
|---|---|---|---|---|
| Innovation norms | New ideas are met with skepticism; "we've always done it this way" is common | Innovation happens in isolated classrooms; not systematic | School encourages experimentation; structured sharing opportunities exist | Innovation is expected and supported; time, recognition, and resources provided |
| Risk tolerance | Failure is penalized; teachers avoid trying new approaches | Individual risk-taking tolerated but not encouraged; no safety net | Structured risk-taking supported; pilot programs with clear evaluation criteria | Failure is treated as data; rapid iteration is normal; learning from failure is celebrated |
| Collaboration | Teachers work in isolation; limited professional sharing | Some collaboration (grade-level teams or departments) | Regular collaborative structures (PLCs, peer observation, shared planning) | Deep collaboration including co-teaching, shared resource creation, and cross-school learning |
| Change attitude | Change fatigue is strong; initiative overload | Mixed — some openness, some resistance; depends on the specific change | Generally positive attitude toward purposeful change; questions are constructive | Staff actively seeks improvement; distinguishes between productive and performative change |

Domain 4 Score: _ / 16

Domain 5: Budget

| Item | 1 — Not Ready | 2 — Developing | 3 — Ready | 4 — Advanced |
|---|---|---|---|---|
| Edtech spending awareness | No central tracking of edtech spending; unknown total cost | Some tracking exists but incomplete; significant untracked spending | Comprehensive edtech inventory with costs, renewal dates, and usage data | Real-time subscription dashboard; annual ROI review; consolidation strategy |
| Funding source identification | Only local funds considered; federal funding streams not explored for AI | Awareness of federal funding options but not utilized for AI tools | Active use of Title I/II/IV-A or other federal funds for technology | Multiple funding streams strategically aligned to AI priorities; sustainability planned |
| Total cost awareness | Only subscription cost considered; training, support, and opportunity costs ignored | Subscription + some implementation costs considered | Full cost model: subscription + training + support + migration + opportunity cost | TCO analysis standard practice; includes multi-year projections and exit costs |
| Sustainability planning | No plan for funding beyond current year or grant period | General awareness of funding cliff risk; no specific plan | Sustainability funding identified for key tools before purchase commitment | Multi-year budget projection; transition plan for every grant-funded tool; contingency reserves |

Domain 5 Score: _ / 16


Interpreting Your Results

Overall Score: _ / 80

| Score Range | Readiness Level | What It Means |
|---|---|---|
| 60-80 | Advanced | Your school is well-positioned for ambitious AI adoption. Focus on optimization, advanced applications, and sharing your practices with other schools |
| 45-59 | Ready | Solid foundation exists. Address specific gaps (low-scoring items) before scaling AI use. You can adopt selectively while building capacity |
| 30-44 | Developing | Significant work needed in multiple domains. Start with foundational actions — policy, basic PD, infrastructure upgrades — before committing to AI tool purchases |
| 16-29 | Early Stage | Major foundational investments needed. Focus on the lowest-scoring domain first. AI tool adoption should wait until infrastructure and policy foundations are established |
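If you want to automate the tally (for example, across several schools in a district), the score-to-level mapping above reduces to a simple lookup. A minimal sketch in Python; the function name is illustrative, not part of any EduGenius tooling:

```python
def readiness_level(total_score: int) -> str:
    """Map a total readiness score (16 items scored 1-4) to the level in the table above."""
    if not 16 <= total_score <= 80:
        raise ValueError("Total must be 16-80 (16 rubric items, each scored 1-4).")
    if total_score >= 60:
        return "Advanced"
    if total_score >= 45:
        return "Ready"
    if total_score >= 30:
        return "Developing"
    return "Early Stage"

print(readiness_level(52))  # → Ready
```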

Domain-Level Analysis

The total score tells you your overall readiness. The domain scores tell you where to focus:

| Domain Score | Interpretation | Priority |
|---|---|---|
| 13-16 | Strong in this domain; maintain and optimize | Low — no immediate action needed |
| 9-12 | Adequate with areas for improvement | Medium — address specific low-scoring items |
| 5-8 | Significant gaps; this domain will limit AI success | High — address before expanding AI adoption |
| 4 | Critical gaps; this domain will block AI adoption | Urgent — prioritize above all other domains |
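The same table can drive a prioritized worklist across the five domains. A hedged sketch (the example scores and all names are illustrative):

```python
def domain_priority(score: int) -> str:
    """Map a domain score (4 items scored 1-4) to the priority band in the table above."""
    if score >= 13:
        return "Low"
    if score >= 9:
        return "Medium"
    if score >= 5:
        return "High"
    return "Urgent"  # a score of 4 means every item scored 1: critical gaps

# Hypothetical domain scores for one school
domains = {"Infrastructure": 11, "Policy": 6, "People": 8, "Culture": 10, "Budget": 7}

# Sort highest-priority domains first
rank = {"Urgent": 0, "High": 1, "Medium": 2, "Low": 3}
for name, score in sorted(domains.items(), key=lambda kv: rank[domain_priority(kv[1])]):
    print(f"{name}: {score}/16 -> {domain_priority(score)} priority")
```

With these example scores, Policy, People, and Budget surface first, matching the common profile described below.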

The Most Common Readiness Profile

Based on ISTE survey data (2024) and CoSN infrastructure reports, the most common profile for U.S. schools is:

  • Infrastructure: 10-12 (Adequate — most schools have basic connectivity and devices)
  • Policy: 5-8 (Significant gaps — most schools lack AI-specific policy)
  • People: 7-10 (Developing — mixed comfort levels, limited systematic PD)
  • Culture: 8-11 (Developing to Adequate — depends heavily on school leadership)
  • Budget: 5-8 (Significant gaps — most schools don't track edtech spending comprehensively)

This means Policy and Budget are typically the binding constraints — not infrastructure or teacher skill. Schools often invest in devices and PD while neglecting the governance and financial frameworks that sustain AI adoption.


Gap-Closing Action Plan

Priority Sequencing

IF Infrastructure Score < 8:
  → Fix infrastructure FIRST
  → AI tools can't work without reliable connectivity

ELSE IF Policy Score < 8:
  → Build policy framework NEXT
  → Teachers won't adopt tools without clear guidelines

ELSE IF People Score < 8:
  → Invest in PD and support
  → Skill building requires time; start early

ELSE IF Budget Score < 8:
  → Build financial framework
  → Prevents subscription sprawl and funding cliffs

ELSE IF Culture Score < 8:
  → Culture work is ongoing
  → Addressed through leadership behavior more than programs
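The sequencing above is a first-match rule: walk the domains in a fixed order and act on the first one below the threshold. A minimal Python sketch (the threshold of 8 and the domain order come directly from the sequence above; the function name is illustrative):

```python
# Fixed sequence from the priority logic above: (domain, recommended action)
SEQUENCE = [
    ("Infrastructure", "Fix infrastructure first: AI tools can't work without reliable connectivity"),
    ("Policy", "Build the policy framework: teachers won't adopt tools without clear guidelines"),
    ("People", "Invest in PD and support: skill building requires time, so start early"),
    ("Budget", "Build the financial framework: prevents subscription sprawl and funding cliffs"),
    ("Culture", "Ongoing culture work: addressed through leadership behavior more than programs"),
]

def first_priority(scores: dict) -> str:
    """Return the action for the first domain scoring below 8, in sequence order."""
    for domain, action in SEQUENCE:
        if scores[domain] < 8:
            return f"{domain} -> {action}"
    return "No domain below 8: adopt selectively while closing item-level gaps"

print(first_priority(
    {"Infrastructure": 11, "Policy": 6, "People": 9, "Budget": 7, "Culture": 10}
))  # Policy is the first domain below 8, so it wins even though Budget is also low
```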

Quick Wins by Domain

| Domain | Quick Win (Achievable in 30 Days) | Impact |
|---|---|---|
| Infrastructure | Map Wi-Fi coverage; identify and resolve dead zones | Removes the #1 teacher frustration with technology |
| Policy | Draft a Version 1.0 AI AUP (even a 1-page interim guideline) | Removes ambiguity; gives teachers permission to experiment within boundaries |
| People | Host a single 90-minute AI awareness workshop | Shifts conversation from fear to informed curiosity |
| Culture | Principal publicly tries an AI tool and shares the experience (including mistakes) | Signals that experimentation is safe and valued |
| Budget | Compile a complete edtech subscription inventory | Reveals consolidation opportunities and unknown spending |

How to Administer the Assessment

Who Should Complete It

The assessment is most valuable when completed by a small, cross-functional team — not by one person alone. The ideal completion team mirrors the AI committee structure:

  • Principal or assistant principal (administrative perspective)
  • IT coordinator (infrastructure and technical perspective)
  • 2-3 teachers (classroom reality perspective)
  • Curriculum coordinator (instructional perspective)

Process: Each team member completes the rubric independently, then the group meets to compare scores, discuss disagreements, and arrive at consensus scores. Disagreements are often the most valuable part — they reveal assumptions and blind spots.
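To make those disagreements visible before the consensus meeting, a quick spread check over the independent scores works well. A sketch assuming scores are collected per rater per rubric item (the data and names are illustrative):

```python
# Four raters' independent scores (1-4) for the same rubric items -- example data
ratings = {
    "Internet bandwidth": [3, 3, 2, 3],
    "AI acceptable use policy": [1, 3, 2, 1],
    "Teacher comfort level": [2, 2, 3, 2],
}

def discussion_items(ratings: dict, spread: int = 2) -> list:
    """Flag items where raters disagree by `spread` or more points --
    these are the items most worth discussing in the consensus meeting."""
    return [item for item, scores in ratings.items() if max(scores) - min(scores) >= spread]

print(discussion_items(ratings))  # → ['AI acceptable use policy']
```

A two-point spread on a four-point scale usually signals that raters are using different definitions of the item, which is exactly the assumption-surfacing conversation the process is designed to produce.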

When to Administer

  • Initial assessment: Before any formal AI adoption planning
  • Annual reassessment: Each spring, to track progress and reprioritize
  • Post-implementation check: 6 months after major AI tool deployments, to identify emerging gaps

Key Takeaways

  • Readiness assessment prevents the predictable failures of rushing AI adoption — tools without infrastructure, adoption without policy, spending without sustainability. Schools with structured assessments are 3.2x more likely to sustain adoption after 2 years (ISTE, 2024). See AI for School Leaders — A Strategic Guide to Transforming Education Administration for strategic context.
  • Five domains determine readiness: infrastructure, policy, people, culture, and budget. Score each on a 1-4 rubric across 4 items per domain. Your total (out of 80) tells you overall readiness; domain scores tell you where to focus. Most schools score lowest on policy and budget.
  • Policy and budget are typically the binding constraints, not infrastructure. Most U.S. schools have adequate connectivity and devices. What they lack is AI-specific policy (acceptable use, academic integrity, data privacy) and financial frameworks (subscription tracking, sustainability planning). Fix these first. See Building a Culture of Innovation — Leading AI Adoption in Schools for culture-building.
  • Complete the assessment as a cross-functional team. Individual assessments reveal one perspective; group discussion reveals blind spots. Principal, IT, teachers, and curriculum coordinator should each score independently, then discuss disagreements. See Building an AI Committee — Who Should Lead Your School's AI Strategy? for team structure.
  • Address the lowest domain first. Infrastructure below 8? Fix connectivity before buying tools. Policy below 8? Write the AUP before launching PD. Budget below 8? Inventory your subscriptions before adding new ones. See Managing AI Tool Subscriptions Across a District for subscription management.
  • Quick wins build momentum. A 30-day action in each domain — map Wi-Fi, draft interim AUP, host one workshop, share a principal's AI experiment, compile subscription list — creates visible progress before the assessment even produces a formal plan.

See Creating AI Usage Reports for Stakeholders and Parents for communicating progress. See AI for Student Information Systems (SIS) and Administrative Tasks for SIS integration. See Best AI Content Generation Tools for Educators — Head-to-Head Comparison for evaluating tools — platforms like EduGenius that require only a web browser and internet connection have low infrastructure barriers, making them accessible even to schools in early readiness stages.


Frequently Asked Questions

Our infrastructure score is high but our policy score is very low. Should we start adopting AI tools anyway?

You can begin limited, structured piloting — but not broad adoption. Without policy, you'll face confusion about what's allowed, inconsistent practices across classrooms, potential privacy violations, and academic integrity questions you can't answer. Start a controlled pilot with clear interim guidelines: 3-5 volunteer teachers, one or two approved tools, explicit expectations documented in writing. Use the pilot period (60-90 days) to draft formal policy informed by real experience. This approach lets you build policy from practice rather than theory, which typically produces more practical and enforceable guidelines.

How do we address the "culture" domain when the problem is leadership, not teachers?

Culture scores below 8 almost always trace to leadership behaviors, not teacher attitudes. If leadership penalizes failure, resists change, or doesn't model technology use, no amount of PD will improve the culture score. The most effective intervention is principal behavior change — publicly experimenting with AI, sharing mistakes, protecting time for teacher exploration, and celebrating process over product. If the barrier is above the principal (superintendent or board resistance), the principal can still create a micro-culture of innovation within their building by using their discretionary authority to protect experimentation. Culture change starts at the top of whatever scope you control.

Should we share our readiness scores publicly (with staff, parents, board)?

Share selectively and strategically. Share with staff — transparency builds trust and invites contribution. Share domain-level results with your board — "we're strong in infrastructure but need investment in policy and professional development" — to justify resource requests. Be cautious about sharing specific item-level scores publicly; low scores can be taken out of context. Frame results as "baseline measurements that inform our improvement plan" rather than "grades." The assessment is a diagnostic tool, not a report card. The value is in trajectory — how scores improve over time — not in any single measurement.

How long does the full assessment process take?

Individual completion of the rubric takes 20-30 minutes. The group discussion and consensus-building meeting takes 60-90 minutes. The gap-closing action plan can be drafted in the same meeting or a follow-up 60-minute session. Total investment: 2-3 hours for a cross-functional team of 5-6 people. The most common mistake is spending too long on the assessment and too little on the action plan. Set a firm time limit for the discussion phase and ensure at least 30 minutes for action planning. Perfect scores don't exist and aren't the goal; directional accuracy with clear next steps is the goal.

#AI-readiness #technology-assessment #digital-readiness #school-planning #AI-adoption