The right tool for the job
In which I stop debunking and start building
I’ve spent the last couple of Fridays reading the references. I looked at the Kirschner, Sweller, and Clark paper and found that its research base—for math at least—was narrow, its terminology slippery, and the three National Academy of Sciences reports it cited in support actually contradicted its thesis. I looked at Alfieri et al. and found a careful meta-analysis whose strongest finding was for guided discovery—an article widely cited in arguments against discovery learning that turns out, on a close reading, to support it. That has been fun, and I think it was worth doing, but debunking is easy and has its limits. At some point you have to stop saying what’s wrong with other people’s arguments and start trying to figure out where they are right.
So let me try.
I want to start by noticing that a lot of disagreements in this area come from different sides focusing on different problems. Often, both problems deserve solving. Everyone agrees that students should develop fluency with basic number facts so that working memory is freed up for reasoning with the facts rather than burdened with finding them. Everyone agrees that students should understand important mathematical concepts—what a fraction is, why the addition algorithm works, what an equation says. And everyone agrees that students should be able to apply concepts and procedures to solve problems with something like real-world complexity. These are genuinely different learning targets. What is involved in learning each one is different, and we should not expect a single way of teaching and learning to be best for all of them. We don’t drive nails with screwdrivers or pound screws with hammers.
So the question I want to ask is not whether explicit instruction is the right way or the wrong way, but what it is good for. And then, once we’ve answered that, what about everything else students need to learn?
What is explicit instruction, exactly?
To answer this question I went looking for an article that gives a definition and found “Explicit Instruction: Historical and Contemporary Contexts” by Hughes, Morris, Therrien, and Benson, published in 2017 in Learning Disabilities Research & Practice. The authors searched the literature for “explicit instruction,” “explicit teaching,” “explicit direct instruction,” and “learning disabilities,” and assembled what they found into five pillars:
Segment complex skills
Draw student attention to important features of the content through modeling/think-alouds
Promote successful engagement by using systematically faded supports/prompts
Provide opportunities for students to respond and receive feedback
Create purposeful practice opportunities
The first and the last of these are sound principles for any approach to teaching; you would find them in a well-designed curriculum supporting a guided discovery approach. The middle three are where you see the specific shape of the model.
Think-alouds are the teacher performing the expert reasoning out loud while the student watches and listens. Compare that with the elicited explanations supported by Alfieri et al., where the student is articulating the reasoning. Both make thinking visible, but the direction is reversed. “Faded supports” presupposes that the teacher started by showing the student how to do it: you fade from full support to no support, which means you began with full support. In guided discovery the student starts with partial knowledge and builds toward fuller understanding, a different trajectory. And the language in pillar 4 is telling—providing opportunities for students to respond is not quite the same as inviting students into a conversation where information flows in several directions.
Put the five pillars together and what you have is a recognizable instructional model, a descendant of the traditional I-do-we-do-you-do approach. It is a model with a long tradition and it is worth asking when it works.
What explicit instruction is good for
There are some pretty clear answers to this in the research literature.
Hughes et al. has a section called “Is Explicit Instruction Effective?” and the first paragraph of that section cites literature reviews that “identified explicit instruction as effective for teaching students with LD [learning disabilities] in the areas of math, reading, and writing.” Of the studies in this group, only one is specifically about math: “Mathematics Interventions for Children with Special Educational Needs” by Kroesbergen and Van Luit, a 2003 meta-analysis of 58 studies of math interventions for elementary students with special educational needs. It’s a careful piece of work and it finds that direct instruction is effective for basic computational skills with students who have special needs. The sample sizes are small and the advantage does not extend to problem solving—and interestingly, a third approach the authors call “self-instruction” has the highest overall effect size.1 But the finding for direct instruction and basic computational skills is real.
The second citation is a 2009 IES Practice Guide by Gersten et al., Assisting Students Struggling with Mathematics: Response to Intervention for Elementary and Middle Schools. IES Practice Guides are careful surveys of the literature, with a high bar for accepting studies and an explicit grading scheme for the strength of evidence behind each recommendation. This one recommends explicit and systematic instruction and gives that recommendation the highest evidence rating. And it is refreshingly clear about its own scope: the recommendation is limited to Tier 2 and Tier 3 support, and “the guide does not make recommendations for general classroom mathematics instruction” (p. 5).
So explicit instruction has evidence behind it for a specific job: securing fluency with foundational procedural skills, especially for students who are struggling and need focused support. That is not a small job. Fact fluency and procedural fluency matter, working memory is a real constraint, and a student who has to reconstruct how to add two fractions from first principles every time is not going to have much capacity left over for thinking about what the answer means. People in cognitive load theory have been saying this for years—I agree with them.
What about everything else?
The trouble comes when the same model is promoted as the right approach for every learning target in mathematics. Hughes et al.’s next paragraph does exactly that, expanding the claim from struggling students studying arithmetic to math students in general, and bringing out the big guns: four more IES Practice Guides. So I read each of them, looking for what they actually recommend.
Siegler et al. (2010), Developing Effective Fractions Instruction for Kindergarten Through 8th Grade
Woodward et al. (2012), Improving Mathematical Problem Solving in Grades 4 Through 8
Frye et al. (2013), Teaching Math to Young Children
Star et al. (2015), Teaching Strategies for Improving Algebra Knowledge in Middle and High School Students
A pattern emerges here: each guide takes on a different learning target, and each picks tools that fit the target.
Siegler et al. (2010) is aimed at conceptual understanding of fractions. Its five recommendations are about building on informal understanding, helping students see fractions as numbers, understanding why procedures make sense, developing conceptual understanding of ratios and proportions before exposing students to cross-multiplication, and emphasizing fractions in teacher professional development. The emphasis is thoroughly conceptual, because the job is conceptual. The only mention of direct instruction in the guide is in describing a teaching approach “designed to address concerns about the limitations of direct instruction.”
Woodward et al. (2012) is aimed at one of the learning targets I mentioned above: applying concepts and procedures to problems with real complexity. Its recommendations include explicit teacher modeling and think-alouds, guided questioning, engaging students in conversations about their own thinking, and exposing students to multiple strategies. The guide is explicit that no single instructional philosophy owns this territory:
When developing recommendations, the panel incorporated several effective instructional practices, including explicit teacher modeling and instruction, guided questions, and efforts to engage students in conversations about their thinking and problem solving. The panel believes it is important to include the variety of ways problem solving can be taught.
Frye et al. (2013), on mathematics for young children, does not recommend an instructional model at all. Its recommendations are about content areas, progress monitoring, helping children describe their world mathematically, and setting aside daily time for math. Where explicit instruction appears, it is in an appendix that found positive effects for semi-structured discovery learning and for structured discovery learning paired with explicit instruction on patterns and relations. Explicit instruction shows up as a partner to structured discovery, not as a standalone approach.
Star et al. (2015), on algebra—which I discussed in an earlier post—recommends using solved problems to analyze reasoning, teaching students to use algebraic structure, and having students choose among alternative strategies. The emphasis is on developing strategic thinking. The guide allows that explicit instruction may be necessary for some students, but adds a caution: “it is important to distinguish between providing explicit instruction and teaching only a single solution strategy and asking students to memorize the steps of that strategy.”
None of these four guides recommends explicit instruction as a general approach to teaching mathematics. I don’t think the authors were worrying about labels; they were just picking the right tools for the job. Conceptual understanding of fractions is a different target from securing fluency with single-digit facts, and problem solving with real-world complexity is different from either, and the research—when you read it—tells you as much.
The mechanism underneath
There is a nice confirmation of this from inside the cognitive load research itself. Here is part of the abstract from “The Expertise Reversal Effect” by Kalyuga, Ayres, Chandler, and Sweller (2003), the same Sweller as in Kirschner et al.
When new information is presented to learners, it must be processed in a severely limited working memory. Learning reduces working memory limitations by enabling the use of schemas, stored in long-term memory, to process information more efficiently. Several instructional techniques have been designed to facilitate schema construction and automation by reducing working memory load. Recently, however, strong evidence has emerged that the effectiveness of these techniques depends very much on levels of learner expertise. Instructional techniques that are highly effective with inexperienced learners can lose their effectiveness and even have negative consequences when used with more experienced learners.
In other words, the same cognitive science that tells us why foundational fluency matters also tells us that the techniques best suited to building that fluency are not the techniques best suited to every later stage of learning. Different learning targets need different tools, and the people who built the theory already know this. The trouble only starts when citations of the theory lose nuance and inflate its results.2
Where this leaves me
So here is the position I arrived at, which I don’t think is going to surprise anyone. Fluency with basic facts and foundational procedures matters, it reduces the mental effort required to execute basic steps, and the evidence is pretty good that structured, explicit practice—especially for students who need focused support—is a reasonable way to get there. Conceptual understanding matters, and the evidence there points to making connections among ideas, facts, and procedures explicit while students work on problems within reach but not yet understood. Application to complex problems matters, and that is a different job again, with its own pedagogy, and the field’s best guidance on it draws from several traditions rather than picking one.
None of these is in conflict with the others. The mistake is thinking that one pedagogy wins all the time, or that a finding about one target is a verdict about the others. If I have a quarrel with anyone, it isn’t with the researchers who did the careful work; it’s with the way their work gets used in broader arguments, where the narrow finding becomes a broad slogan and the careful caveats fall away.
So let’s can the slogans and start building something that works for students and teachers in real classrooms. There is plenty to work on—about what guided discovery actually looks like in a classroom, about how the expertise reversal effect plays out as students move through a curriculum, about what making connections explicit looks like in practice—and I intend to take up some of it on future Fridays. I invite you to join me and bring your friends, especially the ones who disagree with you. Which learning targets do you think your own teaching, or your own children’s schooling, handles best, and which worst? What does the right tool for the job look like to you? Let’s build this thing together.
I’m not sure what self-instruction is. From the references it seems to involve teaching students a strategy and then having them guide themselves through it with self-monitoring and self-regulation. The fact that it outperformed direct instruction is notable because it’s the approach that gives students the most agency. But these are weeds into which I will not wade further.
A case in point is Richard Mayer’s beautifully clear 2004 paper “Should There Be a Three-Strikes Rule Against Pure Discovery Learning?”. Mayer’s answer is yes: three decades of research show that pure discovery doesn’t work. I don’t disagree. Nobody I know advocates for pure discovery. What’s striking is the caveat Mayer offers: “Nothing in this article should be construed as arguing against the view of learning as knowledge construction or against using hands-on inquiry or group discussion that promotes the process of knowledge construction in learners.” Indeed, some of the studies he describes conclude that guided discovery is superior to both pure discovery and to “expository methods, in which the student is given the problem along with the correct answer.” And yet Mayer’s article is routinely cited in arguments against exactly the view of learning Mayer says he is not arguing against. Exhibit A for losing nuance and inflating results.



Bill, this is a synthesis I’ve been waiting for—and I find myself in genuine agreement. What strikes me most is how naturally you’ve sorted the guides by learning target, showing how each one reaches for a different set of tools without any apparent anxiety about the label.
The way I hear you presenting this: when a lesson is aimed at conceptual understanding, the most effective path begins with why — why does this work mathematically — then builds toward how we generalize the procedure, and closes with when this technique is most useful or appropriate, perhaps set alongside another approach. The direction of thinking matters: student articulation first, not teacher performance first.
When the focus shifts to applications and messy real-world modeling, the teacher’s toolbox gets leveraged differently. Expert demonstration sets a frame of reference. Guided inquiry leads students down more productive paths than leaving them entirely to their own devices. And there has to be structured time for reflective iteration — refining and improving models through feedback. The right close for a modeling unit might be a gallery walk, a display of the various methods different groups employed, with a teacher-led comparison that helps students see the landscape.
For the youngest learners, developing fluency to offload working memory burden — and clear the way for conceptual understanding — is best accomplished through structured and semi-structured explorations intertwined with explicit instruction. The explicit interventions aren’t carrying the content alone; they’re there to emphasize, revoice, and sharpen the precision of patterns students have already been guided to notice.
And then there’s algebra — the symbolizing of patterns into a language of structure. A research-backed entry point here is the targeted examination of a solution as a lesson for learning. The analysis of reasoning builds deeper understanding of structure, and the close of a lesson in this territory ought to center on the strategic selection of strategies for mathematical problem solving.
None of these are in tension. They’re just different jobs, calling for different tools. I’m glad you’re building this thing in public — looking forward to the next Fridays.
Thank you for taking the time to write these posts and delve into the research. All of this is so important. Discussing the different types of mathematical learning and the approaches that work in each case is extremely practical. In my 10th grade math class I use a wide variety of instructional approaches depending on the content, where we are in a unit, student mastery data, and the nature of the mathematics involved. Most of the time students are constructing meaning by working through problems and discussing them in a guided format. In my support block at the end of the school day, where I work with 10–15 students who are struggling to access the curriculum, I lean heavily on explicit instruction. Teachers need to know how to leverage a variety of instructional approaches and when each one is appropriate, but that will never happen if there are entrenched viewpoints that only one type of instruction is “right”.