The Brain Doesn't Learn Faster With AI: Why Neuroscientists Say Slow Learning Wins
Artificial intelligence tools like ChatGPT are built for productivity, not learning, and using them to accelerate student work may actually undermine how brains develop critical thinking skills. While technology companies market AI as a way to speed up education, neuroscientist Jared Cooney Horvath argues in his book "The Digital Delusion" that this distinction matters enormously, especially during adolescence, when the brain undergoes its most significant development since the first two years of life.
Why Do Teachers Fear AI More Than Job Loss?
When educators talk about AI in schools, the conversation rarely focuses on whether robots will replace them. Instead, 84 percent of teachers worry that AI could erode students' critical thinking and foundational skills. A survey of more than 1,000 college faculty found that 90 percent believe AI will diminish students' ability to think critically. The real fear isn't obsolescence; it's learning loss.
Classroom teachers express the most anxiety about AI, while administrators and college faculty tend to be more optimistic. This gap reflects how each group experiences the technology. For teachers working directly with students, AI shows up as a tool that students might use to bypass thinking. For administrators, it appears as a productivity solution for paperwork.
"Teachers don't see AI as a job threat. They see it as a learning threat when students use it to bypass thinking," according to survey findings on educator attitudes toward artificial intelligence.
Jotform Education Survey, 2026
How Does the Brain Actually Learn Best?
Decades of neuroscience research reveal that learning requires effort, attention, and productive struggle. When students use AI to offload cognitive tasks, they miss the mental work that builds lasting knowledge. Horvath compares this to watching someone else lift weights at the gym instead of lifting them yourself: the task gets done, but you don't develop strength.
Research consistently shows that certain low-tech practices outperform digital alternatives. Reading on paper produces better comprehension and retention than reading on screens, a phenomenon called the "screen inferiority effect." Paper provides spatial anchors that help readers locate and remember information, while scrolling online leaves words without fixed locations, encouraging skimming rather than deep processing.
Handwriting also outperforms typing for learning. Because handwriting is slower than typing, students who handwrite notes must summarize and synthesize what they hear rather than transcribe it word-for-word, processing the information more deeply. Brain imaging shows that handwriting activates the same neural circuits used to decode text, strengthening both writing and reading skills simultaneously.
- Paper Reading: Students comprehend and retain more information when reading on paper compared to screens, due to spatial anchors that help memory encoding
- Handwritten Notes: Handwriting produces stronger learning outcomes than typing because the slower pace forces cognitive processing and synthesis of information
- Productive Struggle: Learning that is slowed down rather than accelerated tends to be more lasting and transferable to new situations
- Foundational Knowledge: Higher-order thinking is built upon lower-order knowledge; students cannot bypass memorizing facts and building basic skills with technology shortcuts
What Happens When AI Allows Students to Skip the Hard Parts?
One of the most problematic claims from technology companies is that students no longer need to memorize facts because knowledge is available at their fingertips. Instead, companies argue, students should learn to write effective AI prompts and edit the output. Horvath calls this argument "nonsense." Higher-order thinking depends on foundational knowledge: a student cannot evaluate AI-generated research without understanding the subject matter well enough to judge accuracy and relevance.
When AI completes tasks quickly, it prevents students from developing the underlying skills needed to perform those tasks independently. This "cognitive offloading" is efficient for experts who already possess deep knowledge, but for novices and adolescents, it short-circuits the learning process. Students need to put their brains on the treadmill, learning how to gather sources, evaluate information, and construct arguments themselves.
Educators surveyed identified the tasks most likely to be automated by AI: administrative work (named by over 50 percent of respondents), lesson planning (49 percent), and grading (29 percent). Notably, far fewer educators see AI replacing the human core of teaching, such as student support and advising (14 percent) or content delivery (21 percent). The pattern is clear: teachers want AI to handle busywork so they can focus on relationships and real learning.
How Should Schools Actually Use AI Without Compromising Learning?
The solution isn't to ban AI from schools. Instead, educators and neuroscientists recommend a thoughtful approach that protects the slow, effortful process of learning while using AI strategically for administrative tasks. Schools need clear policies, better training, and firm boundaries for responsible use rather than more AI hype.
Practical recommendations include prioritizing paper reading to foster comprehension and analysis, emphasizing handwriting for note-taking and early drafting to strengthen cognition and literacy, designing assignments that engage students in the full learning process (researching, organizing, drafting, revising) rather than allowing shortcuts, and being cautious about permitting students to use AI when it allows them to offload tasks they need to learn.
Some educators are finding success by using AI as a thinking tool rather than a shortcut. This might mean having students use AI to generate initial ideas and then requiring them to critique, revise, and improve the output, or using AI to create practice problems that students must solve themselves. The key is keeping students in the driver's seat, challenged to think, rather than letting them become passive consumers of AI-generated content.
Jonathan Haidt, author of "The Anxious Generation," notes that Horvath's work is "not anti-tech, but pro-learning." Schools use technology thoughtfully when it puts kids in the driver's seat, challenging them to code, join robotics teams, or create documentaries. But when students become passive consumers of technology, growth is inhibited.
The goal of education is not to train students to use technology. It is to teach them how to think. As Horvath reminds educators, "effort isn't the enemy of learning; it's the secret ingredient." Producing tomorrow's leaders requires protecting the slow process of learning, one class and one conversation at a time.