
Self-Explanation as a Learning Tool
Why this skill matters for you
You read a paragraph, solve a step, or watch one minute of a lesson. Then you pause and write two short sentences that say what changed and why it makes sense. That tiny habit—called self-explanation—turns exposure into learning you can use later.
Learners, teachers, and trainers can adopt it without new software or long lectures. You need short prompts, a notebook or text box, and a plan to practice in small chunks. A well-built routine supports deeper reasoning, better transfer to new problems, and stronger recall on tests.
What self-explanation is—and what it is not
Self-explanation is you generating brief why/how statements during study. You connect a step to a rule, name a cause that links two ideas, or justify an operation in your own words. It is not a summary, a rephrasing, or a highlight. Classic worked-example research showed that students who produced principle-based explanations solved new problems more effectively than peers who only read the steps.
Simple prompts help you stay on track:
- Which principle or rule supports this step?
- What causes the change from A to B?
- How is this case different from the last one?
These prompts keep the focus on justification, not paraphrase.
How self-explanation works: three mechanisms
Constructive engagement (ICAP)
Learning behaviors fall into four broad modes: Passive, Active, Constructive, and Interactive. Self-explanation sits in the Constructive band. You generate new inferences and link prior knowledge with the current step, which predicts stronger learning than passive or purely active behaviors.
Bridging missing links
Texts, diagrams, and videos often leave connections implicit. When you create a short explanation, you build the bridge yourself. Studies with low-cohesion texts show better comprehension when readers receive training to generate inferences during reading.
Error checking and model repair
Writing a reason exposes gaps. You notice when a step lacks a rule, or when a diagram suggests a relation you cannot explain. Research links the quality of explanations—naming rules, stating relations, and flagging uncertainty—to larger learning gains.
What the evidence says
- Overall effect: A meta-analysis across dozens of studies reported small-to-moderate gains from induced self-explanation. This pattern holds across tasks and subject areas.
- Time-matched comparisons: When groups receive equal study time, self-explanation still helps, which points to benefits beyond simple time on task.
- Mathematics: Syntheses in math show improvements in both conceptual and procedural outcomes when learners explain steps and principles linked to worked examples.
- Reading: Self-Explanation Reading Training (SERT) teaches readers to build inferences and question the text. Gains appear most clearly on science passages with lower cohesion.
- Multimedia/video: Pauses with short written explanations raise test performance compared with watching straight through. Reviews of video learning confirm this pattern.
- Strategy ecosystem: Reports that rank study methods for durability place self-explanation alongside practice testing and spaced review, which suggests using them together.
Where self-explanation helps most
Worked examples
Prompted explanations during example study improve transfer, especially when examples vary across surface details but share the same principle. Gains increase when prompts are present early and then faded.
Reading of science and technical texts
SERT combines self-explanation with a small set of reading strategies. Readers learn to ask “What inference links these sentences?” and “What else would make this claim hold?” Studies show better comprehension for texts that hide links between ideas.
Short online lessons
Brief pause-points with two-sentence explanations support selection, organization, and integration of new information. This improves both immediate understanding and application to new problems.
Limits and conditions you should know
- Guidance helps with complex materials. When materials include graphs, formulas, and text, assisting prompts outperform broad “Explain this” prompts. Learners integrate multiple representations more effectively with targeted questions or sentence stems.
- Prior knowledge shapes outcomes. Novices gain from explicit scaffolds and short chunks. Unguided explanation can slip into paraphrase or repeat an error. The ICAP model predicts stronger results when activities push you into constructive territory.
- Text cohesion matters. Low-cohesion texts invite inference making; training helps. For dense materials, shorter segments reduce overload.
A quick routine you can start today (3–2–1 cycle)
Step 1: Three prompts—pick two.
- Which rule or definition supports this step?
- What causes what here?
- How does this case differ from the previous one?
Step 2: Two sentences.
Write a crisp why/how that ties the step to the rule or relation.
Step 3: One check.
Compare your note with the source or solution. Mark “?” for anything that still feels shaky.
Short cycles fit exam prep, problem sets, or textbook study. Research on pause-and-explain shows benefits even when the writing is brief.
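If you keep your notes digitally, a small script can enforce the cycle. The following is a minimal sketch, assuming a command-line workflow; the prompt wording mirrors the steps above, and the log file name (self_explanations.jsonl) is illustrative.

```python
import json
import random
from datetime import date

PROMPTS = [
    "Which rule or definition supports this step?",
    "What causes what here?",
    "How does this case differ from the previous one?",
]

def run_cycle(topic: str, log_path: str = "self_explanations.jsonl") -> None:
    """One 3-2-1 cycle: pick two prompts, collect two sentences, record one check."""
    chosen = random.sample(PROMPTS, 2)            # Step 1: three prompts, pick two
    notes = [input(f"{p}\n> ") for p in chosen]   # Step 2: two short sentences
    # Step 3: one check against the source; "y" flags the item with "?" for later review
    shaky = input("Compared with the source. Still shaky? (y/n) ").strip().lower() == "y"
    entry = {"date": str(date.today()), "topic": topic,
             "prompts": chosen, "notes": notes, "flag": "?" if shaky else ""}
    with open(log_path, "a") as f:
        f.write(json.dumps(entry) + "\n")

# Example: run_cycle("worked example: quadratic formula")
```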
Classroom playbook for teachers and trainers
Prompts that work
- Rule/Principle: Name the principle used and state why it applies.
- Mechanism: Describe the process that links A to B.
- Constraint: What remains constant here? Why?
- Prediction: If X doubled, what would change? Why?
Studies with multiple representations show stronger outcomes with assisting prompts than with open prompts.
From full support to independence (fading)
- Week 1: Provide full stems under each example step.
- Week 2: Provide partial stems and ask learners to complete them.
- Week 3: Ask learners to write their own stems before solving.
Example variability—different surface features, same underlying rule—encourages transfer.
Intelligent tutor add-on
A geometry tutor that required students to name the rule for each step improved understanding beyond problem solving alone. That small field for typed explanations acted like a prompt and a check.
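The same idea needs very little technology. Below is a console stand-in, not the tutor's actual interface: for each solution step it refuses to continue until a rule is typed, which is what makes the field act as both prompt and check. The function name and step format are assumptions for illustration.

```python
def explain_each_step(steps: list[str]) -> list[dict]:
    """Console stand-in for a 'name the rule' field: one typed rule per step."""
    responses = []
    for i, step in enumerate(steps, start=1):
        print(f"Step {i}: {step}")
        rule = ""
        while not rule.strip():          # the empty-answer loop is the prompt and the check
            rule = input("Which rule or theorem justifies this step? ")
        responses.append({"step": step, "rule": rule})
    return responses

# Example: explain_each_step(["Angle ABD + Angle DBC = Angle ABC"])
```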
Self-explanation for reading: SERT in practice
SERT trains readers to generate inferences, ask bridging questions, and explain claims with reasons. It works well for science passages that leave links implicit. Try this sequence during a page of text:
- Highlight the sentence that introduces a claim.
- Ask, “What assumption or mechanism connects this claim to the previous idea?”
- Write a two-sentence explanation, then note any “?” for follow-up.
Controlled studies report higher comprehension scores after SERT training than after unstructured reading practice.
Video learning: where to place pauses
Short videos benefit from planned stop points every 60–120 seconds. At each pause:
- Show one prompt on screen.
- Ask for a two-sentence why/how in a text box or notebook.
- Resume only after writing.
This structure improves test performance compared with uninterrupted viewing. It also helps manage transient information in videos.
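If you want to plan stop points before pressing play, a small helper can print the timestamps. This is a sketch; the 90-second default is just one choice inside the 60–120 second band, not a prescription.

```python
def pause_points(duration_s: int, interval_s: int = 90) -> list[str]:
    """Return mm:ss timestamps at which to pause and write a two-sentence explanation."""
    if not 60 <= interval_s <= 120:
        raise ValueError("keep pauses within the 60-120 second band")
    return [f"{t // 60:02d}:{t % 60:02d}" for t in range(interval_s, duration_s, interval_s)]

# Example: pause_points(6 * 60) -> ['01:30', '03:00', '04:30'] for a six-minute video
```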
Measuring explanation quality (simple rubric)
A fast 0–3 rubric supports feedback and self-monitoring:
- 0 — Paraphrase only. Restates with no rule or relation.
- 1 — Label. Names a term without a link.
- 2 — Rule + relation. States the principle and shows how it applies.
- 3 — Rule + relation + check. Adds a quick self-check or a “what-if.”
Higher-quality explanations correlate with stronger gains in worked-example studies and classroom applications.
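If you grade explanations in a script or spreadsheet, the rubric reduces to three yes/no checks. The mapping below is one way to operationalize the levels, assumed for illustration rather than a validated instrument.

```python
def rubric_score(names_rule: bool, shows_relation: bool, adds_check: bool) -> int:
    """0 = paraphrase only, 1 = label, 2 = rule + relation, 3 = rule + relation + check."""
    if names_rule and shows_relation and adds_check:
        return 3
    if names_rule and shows_relation:
        return 2
    if names_rule:
        return 1
    return 0

# Example: rubric_score(names_rule=True, shows_relation=True, adds_check=False) -> 2
```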
Pairing with other high-yield strategies
Self-explanation works best beside methods with strong evidence:
- Practice testing. After a short quiz, add one sentence that explains each correct answer.
- Spaced review. Return two days later and repeat only for items you flagged with “?”.
- Interleaving. Mix problems that look different but share a principle, then explain the rule that unites them.
Comprehensive reviews list these methods among the most effective for durable learning.
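The spaced-review step pairs naturally with the log from the 3–2–1 sketch earlier. Assuming that same JSON-lines format, a filter like this surfaces only the flagged items that are at least two days old.

```python
import json
from datetime import date, timedelta

def due_for_review(log_path: str = "self_explanations.jsonl", gap_days: int = 2) -> list[dict]:
    """Return flagged ('?') entries written at least gap_days ago."""
    cutoff = date.today() - timedelta(days=gap_days)
    with open(log_path) as f:
        entries = [json.loads(line) for line in f]
    return [e for e in entries
            if e["flag"] == "?" and date.fromisoformat(e["date"]) <= cutoff]
```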
Age-group and context adjustments
- Primary and lower-secondary: Use concrete stems and labeled visuals. Keep chunks short. Provide full prompts at first.
- Upper-secondary and university: Combine worked examples, assisting prompts, and gradual fading. Ask for short predictions during labs or problem sets.
- Professional training: Tie explanations to standards, procedures, or code policies. Ask for the rule and the risk if someone ignores it.
These patterns align with ICAP’s view of constructive activity and with evidence from classrooms and tutoring systems.
Common Mistakes—and fixes that work
- Paraphrase without reasons. Require a named rule or mechanism in each note.
- Overload from multiple representations. Use assisting prompts and smaller segments.
- Low-cohesion texts with no training. Teach SERT-style prompts before independent reading.
- Time bloat. Cap each explanation at two sentences; use the 3–2–1 cycle. Evidence shows benefits even with brief writing.
A 10-minute starter plan for learners
- Pick a topic segment that takes five to seven minutes.
- After each paragraph, step, or minute of video, answer two prompts and write two sentences.
- Add one quick check against the text or solution.
- End with two retrieval questions of your own and write one sentence per answer that explains why it holds.
- Two days later, review only the items you flagged with “?”.
This plan builds a durable habit with minimal overhead. The cycle draws on research that favors short, frequent generative actions.
A 30-minute model lesson for teachers
- Warm-up (5 min): One worked example with three assisting prompts; collect two-sentence explanations.
- Practice (15 min): Two more examples with partial stems; then two problems without stems.
- Wrap-up (10 min): One minute of retrieval, two “what-if” explanations, and one student exemplar shared on screen using the 0–3 rubric.
This sequence blends assisted explanation, fading, and example variability, which together support transfer.
Final Thought
Short, frequent explanations create the links that textbooks and videos often leave implied. You read or watch a small chunk, you write two sentences that say why/how, and you do a quick check. That habit scales from a single learner to a whole class, from print to video, and from school to workplace training. The research base is broad, the routines are simple, and the payoffs are practical.
FAQs
Is self-explanation the same as summarizing?
No. Summaries restate content. Self-explanations justify steps or claims with rules, mechanisms, or cause-effect links, which supports transfer to new problems.
How often should I pause during a video lesson?
Every 60–120 seconds works well for dense content. Write two sentences at each pause, then continue.
I’m new to the topic. Will this slow me down?
Short prompts add minutes, not hours. Even in time-matched comparisons, learners who explain still gain more than those who only read or watch.
Which prompts work best for math and science?
Rule-naming and mechanism prompts: Which theorem? What invariant holds? What process produces the change? These outperform generic “Explain this” prompts.
Can technology help without a full platform change?
Yes. A simple text box that asks for the rule at each step improved results in a geometry tutor. Any LMS discussion field or form can play this role.