The 10x School: What It Would Take to Build One

The Problem

Within one hour of learning something new, people forget 50% of it. Within 24 hours, 70%. Within a week, they retain only 25% [1]. This is Ebbinghaus’s forgetting curve, replicated consistently for over a century.
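The forgetting curve can be sketched as simple exponential decay. This is a minimal illustration, not a fitted model: a single fixed stability constant cannot reproduce all three cited figures at once, because real retention curves flatten as memories consolidate and are reviewed (stability grows over time).

```python
import math

def retention(hours: float, stability: float = 1.0) -> float:
    """Ebbinghaus-style exponential decay: R = e^(-t/S).

    `stability` (S) is an illustrative free parameter; larger S
    means slower forgetting. Values here are for demonstration,
    not fitted to the original data.
    """
    return math.exp(-hours / stability)

# Choose S so that roughly half the material survives the first hour.
S = 1 / math.log(2)  # half-life of one hour
print(round(retention(1, S), 2))  # 0.5: half forgotten within the hour
```

Spaced review works by repeatedly interrupting this decay before retention falls too far, effectively increasing S with each successful recall.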

Traditional education ignores it completely. Students cram, pass exams, forget almost everything.

The result: 40% of employers don’t believe graduates are career-ready [2]. There’s a 30-point gap between how competent students think they are and how competent employers find them [3]. Companies spend $3.4 billion annually on remedial training for entry-level employees who can’t apply what they supposedly learned [2].

This isn’t a mystery. We know what works.


The Solutions Exist — In Isolation

Research has proven these approaches work:

Personalized tutoring: Bloom’s 2 Sigma research showed tutored students perform two standard deviations above conventional students — the 98th percentile [5]. AI tutoring now matches human tutor effectiveness [5].

Spaced repetition: Distributing practice over time improves retention by 200% compared to cramming [4]. Graham Nuthall found that encountering content three times predicts with 80-85% accuracy whether a student learns it [1].

Adaptive learning: A 2024 meta-analysis found personalized adaptive learning produces moderate-to-strong effects on achievement, with 59% of studies showing improved performance [6].

Active learning: A meta-analysis across 225 STEM courses found active learning reduced failure rates by 55% and increased exam scores by half a standard deviation [8].

Knowledge graphs: Mapping concepts as nodes and prerequisites as edges enables precise diagnosis — finding which foundational concept is broken, not just which question was wrong [7].

Explorable explanations: Bret Victor’s concept of text as “an environment to think in” rather than “information to be consumed” [9]. Nicky Case demonstrated this approach works: interactive simulations let learners discover cause-and-effect relationships themselves rather than being told conclusions [10].

Brilliant does interactive problems. Khan Academy does videos. Duolingo does gamification. Anki does spaced repetition. Nicky Case’s explorables prove concepts through play [10]. Each works in isolation.

Here’s what’s remarkable: Khan Academy once had both knowledge maps and spaced repetition. Skills would visibly “fade” over time, and an interactive map showed how concepts connected. Users could see at a glance what needed reinforcement. Then they removed both features in favor of linear course progression [12]. The company that came closest to integration walked it back. As one critic noted: “The concept behind Khan Academy is that if you get the right answer ten times in a row, you got the concept. But that’s not the entire truth — you got it at that moment, but it’s far from being in your long-term memory” [13].

Even Khanmigo, their promising AI tutor now reaching 700,000 students [14], exists as a separate layer — not deeply integrated with spaced repetition or knowledge graph diagnosis. The pieces remain siloed.

Nobody combines them.


The Real Insight: Compound, Don't Isolate

The insight isn’t that these techniques work. Everyone knows that.

The insight is that combining them should produce compound gains — and nobody has seriously tried.

Spaced repetition is more powerful when the system knows your knowledge graph and can identify exactly what to reinforce. Adaptive learning is more powerful when it’s organized around a goal you actually care about. AI tutoring is more powerful when it has your full learning history and can connect today’s confusion to yesterday’s gap. Live sessions are more powerful when the app has already surfaced what each student struggles with. Explorable explanations are more powerful when they’re sequenced by prerequisite knowledge and reinforced over time.

Each innovation amplifies the others. Yet the EdTech industry treats them as separate products competing for the same market, not as components of a single system.

Someone should build that system.


Why Previous Attempts Failed

Five patterns killed earlier efforts:

1. Scale-or-die pressure. VC-backed EdTech must grow fast. This forces premature scaling before the product works. MOOCs optimized for enrollment (millions of signups!) rather than completion (5% finish rates). When you need hockey-stick growth, you can’t afford to iterate slowly on a complex integrated system.

2. App-only thinking. Digital learning scaled, but it’s lonely. No accountability. No peer motivation. No live discussion that deepens understanding. Completion rates reflect this. Meanwhile, cohort-based courses (Maven, bootcamps) have engagement but don’t personalize. The industry split into two camps instead of combining both.

3. Integration was genuinely hard. Mapping goals to skills to prerequisite concepts used to require armies of curriculum designers. Building adaptive content required expensive manual tagging. The coordination cost of combining multiple systems exceeded what most teams could afford.

4. Passive content masquerading as active. As Nicky Case points out, not all “hands-on” learning is equal: “In high school, I did lots of those chemistry ‘experiments’ where the teacher just tells you exactly what to do and exactly what result to expect. I remember setting my hand on fire. I don’t remember any actual chemistry” [10]. Following instructions isn’t learning. Thinking is. Most digital courses are just video lectures with quizzes — passive consumption with a thin interactive veneer.

5. Even successful platforms retreated from integration. Khan Academy is the most telling example. They built knowledge maps showing concept relationships. They built spaced repetition where skills visibly aged. Users loved these features — called them “literally the only thing that made Khan Academy stand out” [12]. Then Khan Academy removed them, concluding that “linear course progression works best for most students.” The organization with the most resources and best intentions still couldn’t sustain the integrated approach. If they couldn’t make it work, what would it take?


Why Now Is Different

AI makes integration tractable.

The same technology creating demand for AI education also makes building this system feasible. Mapping goals → skills → prerequisites no longer requires manual curriculum design. AI can generate practice problems at the right difficulty. AI can explain concepts multiple ways based on what worked before. AI can analyze response patterns to detect confusion. The integration problem that was prohibitively expensive is now solvable.

Live experiences create a forcing function.

Previous attempts relied purely on apps — and apps alone can’t sustain engagement. But live sessions (workshops, cohort calls, peer projects) create accountability and motivation that digital-only platforms lack. More importantly, live experiences provide rapid feedback on what’s working. When you see students confused in real time, you know immediately what to fix. The app + live combination isn’t just better for learners — it’s better for iteration.

AI learning is the perfect test domain.

The demand is real and growing. The learners are motivated adults who chose to be there, not captive students. The content is structured enough for knowledge graphs. The goals are concrete: build a model, deploy an agent, understand how transformers work.

Crucially, the knowledge domain is small enough to test properly. Unlike “teach all of mathematics” or “teach all of history,” AI/ML has a bounded set of core concepts. A system can map the full prerequisite graph, measure retention across every node, and validate whether the compound approach actually works — all within a tractable scope. Start narrow, prove it works, then expand.

Independent builders have a path.

Crowdfunding and app distribution mean someone could build this without VC pressure. Start with a small cohort. Measure what actually happens. Adjust. Repeat. This is the opposite of the MOOC playbook — and that’s exactly what’s needed.


What a 10x School Would Look Like

A system that integrates all these innovations, organized around goals, combining digital and live learning, built on explorable explanations rather than passive content.

Goal-First Architecture

The system starts with one question: What do you want to be able to do?

Build a recommendation system. Fine-tune a language model. Understand how attention mechanisms work. From this goal, it maps required skills, identifies gaps, and generates a personalized path.

Instead of grades, students see goal proximity: “You’re 65% of the way to deploying your first agent.” Video game psychology applied to learning. Games show “3 skills until you can fly,” not “you scored 78% on level 3.”
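Goal proximity falls out naturally from a prerequisite graph: take the set of every concept the goal transitively depends on, and report the mastered fraction. A minimal sketch — the graph, concept names, and mastery set below are hypothetical:

```python
def prerequisite_closure(goal, prereqs):
    """All concepts needed for `goal`, found by walking prerequisite
    edges backwards. Graph and names are illustrative."""
    seen, stack = set(), [goal]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(prereqs.get(node, []))
    return seen

def goal_proximity(goal, prereqs, mastered):
    """Fraction of the goal's prerequisite closure already mastered."""
    needed = prerequisite_closure(goal, prereqs)
    return len(needed & mastered) / len(needed)

# Hypothetical prerequisite graph for "deploy an agent".
prereqs = {
    "deploy_agent": ["llm_apis", "prompting"],
    "llm_apis": ["python_basics"],
    "prompting": [],
    "python_basics": [],
}
mastered = {"python_basics", "prompting"}
print(f"{goal_proximity('deploy_agent', prereqs, mastered):.0%}")  # → 50%
```

The same closure also yields the "3 skills until you can fly" framing: the unmastered nodes in the closure are exactly what stands between the student and the goal.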

AI-Powered Personalization

Every student gets a tutor that adapts in real time. The system detects confusion from response patterns, generates practice at the right difficulty, explains concepts multiple ways until one clicks. This is Bloom’s 2 Sigma finding, finally scalable.

Spaced Repetition Built In

The forgetting curve destroys learning without reinforcement. The system schedules review at optimal intervals automatically. Short daily sessions maintain everything learned. Students stop losing what they worked to gain.
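The scheduling rule can be as simple as an expanding interval in the Leitner/SM-2 spirit. This is a sketch under assumed parameters — the doubling factor and one-day floor are illustrative choices, not a production algorithm:

```python
from datetime import date, timedelta

def next_review(last_interval_days: int, recalled: bool) -> int:
    """Expanding-interval rule: double the gap on a successful
    recall, reset to one day on a lapse. The factor of 2 and the
    1-day floor are illustrative assumptions."""
    if not recalled:
        return 1  # lapse: start over tomorrow
    return max(1, last_interval_days * 2)

# Simulate a card recalled successfully four times in a row.
interval, due = 1, date.today()
for _ in range(4):
    interval = next_review(interval, recalled=True)
    due += timedelta(days=interval)
print(interval)  # 16: intervals grew 2, 4, 8, 16 days
```

Each successful recall pushes the next review further out, so total review time per item shrinks even as retention holds.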

Nicky Case’s interactive comic “How To Remember Anything Forever-ish” [11] demonstrates this beautifully — teaching spaced repetition through an explorable explanation that embeds the very technique it’s teaching. That’s the kind of integration a 10x school needs: not just using spaced repetition, but making students understand why it works through direct experience.

Knowledge Graph Diagnosis

When a student struggles, the system traces back through prerequisites to find what’s actually broken. Not “you got question 7 wrong” but “your confusion about backpropagation stems from a gap in chain rule understanding.” Remediation becomes precise instead of guesswork.
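The diagnosis step is a walk back through prerequisite edges to the deepest unmastered concepts — the likely root causes. A minimal sketch; the graph and mastery data mirror the backpropagation example above but are otherwise hypothetical:

```python
def diagnose(failed_concept, prereqs, mastered):
    """Trace back through prerequisite edges and return the deepest
    unmastered concepts. Graph and names are illustrative."""
    roots = set()

    def visit(node):
        broken_below = False
        for dep in prereqs.get(node, []):
            if dep not in mastered:
                broken_below = True
                visit(dep)
        # A node is a root cause only if it is unmastered AND all
        # of its own prerequisites are intact.
        if not broken_below and node not in mastered:
            roots.add(node)

    visit(failed_concept)
    return roots

prereqs = {
    "backpropagation": ["chain_rule", "gradients"],
    "chain_rule": ["derivatives"],
    "gradients": ["derivatives"],
    "derivatives": [],
}
mastered = {"derivatives", "gradients"}
print(diagnose("backpropagation", prereqs, mastered))  # → {'chain_rule'}
```

Note that "backpropagation" itself is not reported: the student's failure there is a symptom, and remediation targets the chain rule gap underneath it.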

Explorable Explanations, Not Passive Content

Bret Victor’s core insight: the goal is “to change people’s relationship with text. People currently think of text as information to be consumed. I want text to be used as an environment to think in” [9].

A 10x school wouldn’t use videos and articles. It would use interactive simulations where learners discover principles by manipulating variables and observing results. The same approach Nicky Case used to teach game theory in “The Evolution of Trust” [10] or segregation dynamics in “Parable of the Polygons” — making the abstract concrete through play.

This means:

  • See, Model, Apply: Learners generate their own data points and form patterns, rather than receiving pre-made models [10]
  • Start small, build big: Isolate mechanics before combining them, like teaching jumping separately from 4D movement [10]
  • Prove it to yourself: Learners aren’t told counterintuitive conclusions — they discover them through exploration [10]

Nobel Laureate Carl Wieman compared lectures to “the pedagogical equivalent of bloodletting” [10]. A 10x school would be the pedagogical equivalent of what Case envisions: “kids getting exercise by racing along the beach, immersed in sunlight and fresh air, building tiny castles on the shore of a deep, vast, and beautiful ocean” [10].

App + Live, Not App vs. Live

This is where a 10x school would diverge from most EdTech thinking.

Apps excel at personalization, spaced repetition, adapting to schedules, meeting learners where they are. But apps are lonely. Completion rates prove it.

Live experiences excel at motivation, accountability, peer learning, and the kind of discussion that deepens understanding. But live doesn’t personalize.

The answer isn’t choosing one. It’s using both for what each does best.

The app handles daily practice, spaced repetition, and adaptive explorable content. Live sessions handle discussion, peer teaching, accountability, and the motivation that comes from learning alongside others. The app informs the live sessions (here’s what this cohort struggles with). The live sessions reinforce the app (you committed publicly to finishing this module).

Adapts to Life

Students specify availability: 30 minutes on a commute, an hour after work. They specify energy: low-energy sessions get light review, high-energy sessions get challenging new material. Learning fits into life instead of demanding life reorganize around it.
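A session planner along these lines could be as simple as a rule over available minutes and self-reported energy. The thresholds, labels, and split below are illustrative assumptions, not a prescription:

```python
def plan_session(minutes_available: int, energy: str, due_reviews: int):
    """Pick session content from available time and self-reported
    energy. Thresholds and labels are illustrative assumptions."""
    if energy == "low" or minutes_available < 20:
        # Light review only: work through spaced-repetition items due today,
        # budgeting roughly one minute per item.
        return {"mode": "review", "items": min(due_reviews, minutes_available)}
    # High energy and enough time: new material, with a slice kept for review.
    review_share = minutes_available // 3
    return {
        "mode": "new_material",
        "review_items": min(due_reviews, review_share),
        "new_minutes": minutes_available - review_share,
    }

print(plan_session(30, "low", 12))   # commute: light review only
print(plan_session(60, "high", 12))  # evening: new material plus review
```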


How to Measure Success

If the goal is “deploy a working ML model,” success isn’t passing a quiz. It’s deploying a working ML model.

Goal achievement rate. What percentage of students accomplish what they set out to do? Traditional schools don’t track this. A 10x school would.

Retention at six months. Not “can you pass a test next week” but “can you still apply this six months later?” The forgetting curve predicts ~25% retention with traditional methods [1]. With integrated spaced repetition, 80%+ should be achievable.

Employer/peer validation. For career-oriented goals, external validation matters. Can graduates actually do the thing in a real context?

Completion rate. MOOCs hover around 5%. Cohort courses do better. A compound system combining the engagement of live with the flexibility of apps should beat both.

The results should be published — what works, what doesn’t, what surprises. An experiment, not a sales pitch.


The Bet

The hypothesis: integrating these approaches produces compound gains that exceed the sum of parts.

The test: do students achieve their goals faster, retain more, and apply knowledge better than existing alternatives?

The domain: AI learning, where demand is high, learners are motivated, and the content fits the approach.

The method: start small, measure rigorously, iterate, scale only what works.

If the compound effect is real, this changes how education should work. If it’s not, we’ll learn something important about why these innovations don’t combine the way theory predicts.

Someone should find out.


References

[1] Ebbinghaus, H. “Memory: A Contribution to Experimental Psychology” (1885). Modern replications: https://www.digitaled.com/resources/blog/the-forgetting-curve-why-retention-fails-without-reinforcement/

[2] Cengage Group. “2025 Graduate Employability Report.” https://www.cengagegroup.com/news/press-releases/2025/cengage-group-2025-employability-report/

[3] NACE. “The Gap in Perceptions of New Grads’ Competency Proficiency.” https://www.naceweb.org/career-readiness/competencies/the-gap-in-perceptions-of-new-grads-competency-proficiency-and-resources-to-shrink-it

[4] Kang, S.H.K. “Spaced Repetition Promotes Efficient and Effective Learning.” PMC. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8759977/

[5] Bloom, B.S. “The 2 Sigma Problem” (1984). AI replication: https://arxiv.org/html/2404.02798v2

[6] Heliyon. “Personalized adaptive learning in higher education: A scoping review.” https://pmc.ncbi.nlm.nih.gov/articles/PMC11544060/

[7] PMC. “Knowledge graph construction and application in education.” https://pmc.ncbi.nlm.nih.gov/articles/PMC10847940/

[8] Freeman, S. et al. “Active learning increases student performance in science, engineering, and mathematics.” PNAS (2014). https://www.pnas.org/doi/10.1073/pnas.1319030111

[9] Victor, B. “Explorable Explanations.” https://worrydream.com/ExplorableExplanations/

[10] Case, N. “I Do And I Understand.” https://blog.ncase.me/i-do-and-i-understand/ and “Explorable Explanations.” https://blog.ncase.me/explorable-explanations/

[11] Case, N. “How To Remember Anything Forever-ish.” https://ncase.me/remember/

[12] Khan Academy Help Center. “Please bring back comprehensive spaced repetition.” https://support.khanacademy.org/hc/en-us/community/posts/360077091931-Please-bring-back-comprehensive-spaced-repetition

[13] Big Think. “Why Spaced Repetition is the Missed Opportunity in Education Today.” https://bigthink.com/guest-thinkers/why-spaced-repetition-is-the-missed-opportunity-in-education-today/

[14] Khan Academy / Khanmigo. “Khan Academy rolls out AI-powered teaching tools.” https://www.khanmigo.ai/