As the executive director of Read Charlotte, one of my jobs is to seek out the best interventions to support early literacy in Mecklenburg County. In 2016, our team spent hundreds of hours reviewing literacy intervention studies to find what works. Let me share two surprising facts that we learned.
First, many of the literacy interventions used in our schools have not been rigorously evaluated. We don’t have enough or the right kind of information to know how much impact they make and for whom.
Second, the impact of literacy interventions that have been rigorously evaluated is lower than you’d expect. The typical evidence-based reading intervention improves reading achievement for a mere three out of 100 children.
What is an evidence-based intervention?
To begin, let’s clear up some vocabulary. An “intervention” is a program with specified goals, objectives, and structured components (e.g., a defined curriculum, an explicit number of program hours or “dosage,” and an optimal length of time) to ensure the program is implemented with fidelity to its model.
Many of the interventions used in schools today are “research-based.” They were developed using insights from educational research into how children learn to read. But for an intervention to be “evidence-based,” outcomes for students who received it must be rigorously compared with outcomes for a similar group of students who did not.
The Every Student Succeeds Act (ESSA) defines four tiers of evidence for educational interventions. (At Read Charlotte, we prefer the top two tiers.) Not all interventions have been evaluated rigorously enough to be called “evidence-based.”
I want to be clear that we cannot say that these interventions are ineffective. We don’t have enough information to determine the level of impact they make for students in our schools. And it’s not enough simply to say that a program is “evidence-based.”
We also need to ask: “What is the evidence?” How many children who get an intervention will likely have improved outcomes compared to usual practice (usually regular classroom instruction or no intervention at all)? The answer can vary greatly by intervention.
Here is what we’ve learned about some popular literacy programs used in our schools.
Some “interventions” are actually instructional strategies
Some “interventions” used in schools are in fact instructional strategies that should complement, but not replace, targeted literacy interventions.
Florida Center for Reading Research. Many educators use the research-based literacy activities from the Florida Center for Reading Research. These are great resources for educators; however, these activities are instructional strategies, not targeted reading interventions.
Guided reading. The researchers behind guided reading are clear that it is not an intervention, but rather a framework for classroom instruction. (You can read more from them on this topic here.) Also, a word of caution: if the text level chosen for small group reading instruction is too low, it may inhibit opportunity for rich classroom discussion, vocabulary building, and writing development that students need to become proficient readers.
Some interventions don’t have clear evidence (yet)
We consulted over two dozen databases to review literacy interventions. For this list of interventions frequently used in our schools, I refer to two – the What Works Clearinghouse and Evidence for ESSA. We can’t say for certain how effective these interventions are without a rigorous comparison of results between a group of students who received the intervention and a similar group who did not.
Great Leaps. This intervention is not included in the What Works Clearinghouse database. The Evidence for ESSA database says it could not find any rigorous studies of the impact of this program.
i-Ready Online Student Instruction. A linking study shows a very high correlation between the i-Ready diagnostic and the North Carolina EOG for reading. However, we couldn’t find any rigorous studies of the efficacy of the i-Ready online student lessons.
Letterland. No studies of Letterland meet the What Works Clearinghouse rigorous standards for an evidence-based intervention. (However, at least one North Carolina school district compared their results using Letterland with a similar North Carolina school district.)
Orton-Gillingham. So far, no studies of unbranded (general approach) multisensory Orton-Gillingham meet the What Works Clearinghouse standards for an evidence-based intervention. We do, however, have rigorous evidence for branded versions of Orton-Gillingham like Wilson Reading (see below).
Some interventions do have evidence (and here’s what it is)
The following interventions have been rigorously evaluated and shown to have impact on specific literacy outcomes. To make it easier to evaluate their impact, we ask how many students out of 100 will likely have improved outcomes. (For an explanation of how we compute this, click here.)
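One common way to translate a study’s reported effect size into an “out of 100 students” figure is an improvement-index calculation: assume outcomes are roughly normally distributed and ask how many percentile points the average treated student gains over the control-group average. The sketch below (in Python, as an illustration; it may not match Read Charlotte’s exact method) shows that conversion:

```python
from statistics import NormalDist

def students_per_100(effect_size: float) -> int:
    """Approximate how many students out of 100 we'd expect to have
    improved outcomes, given a standardized effect size (Cohen's d or
    Hedges' g). This is an 'improvement index': the percentile gain of
    the average treated student over the control-group average."""
    percentile_of_avg_treated = NormalDist().cdf(effect_size)  # vs. control mean
    return round((percentile_of_avg_treated - 0.5) * 100)

# For example, an effect size of 0.25 maps to about 10 out of 100
# students, and an effect size of 0.8 to about 29 out of 100.
print(students_per_100(0.25))  # 10
print(students_per_100(0.8))   # 29
```

Under this framing, a “small” effect size in a research paper translates into a concrete number of children helped, which is easier to weigh against cost and effort.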
Based upon this 2012 publication from the U.S. Department of Education, we are able to say that the typical elementary school literacy intervention will improve broad reading outcomes (e.g., fluency, comprehension, reading achievement) for about three out of 100 students and will improve early foundational reading skill outcomes (e.g., letter knowledge, phonemic awareness, phonics) for about seven out of 100 children.
KPALS. On average, we’d expect KPALS to improve reading achievement outcomes for about 14 out of 100 students.
Leveled Literacy Intervention. On average, we’d expect LLI to improve reading achievement outcomes for about 11 out of 100 students.
Read Naturally. On average, we’d expect Read Naturally to improve phonemic awareness outcomes for about nine out of 100 students.
Reading Recovery. On average, we’d expect Reading Recovery to improve reading achievement for about 29 out of 100 students.
Sound Partners. On average, we’d expect Sound Partners to improve phonemic awareness outcomes for about 22 out of 100 students.
Wilson Reading System. On average, we’d expect Wilson Reading to improve phonemic awareness outcomes for about 13 out of 100 students.
The low numbers in this list are a sobering reminder that many other factors affect children’s reading achievement: limited book access at home, food insecurity, housing instability, and adverse childhood experiences, to name a few. These interventions focus strictly on building literacy skills and have shown they can make an impact despite these other factors. Moreover, we believe it is possible to amplify the impact of any one intervention by intentionally stacking and aligning it with other proven interventions. However, achieving these numbers requires high-quality program implementation.
Two North Carolina-based interventions you should know about
Here are two evidence-based literacy interventions developed by researchers in our state that you should know about.
HELPS. Developed by John Begeny, a professor at North Carolina State University. Included in the 2016 What Works Clearinghouse practice guide for K-3 foundational literacy skills. Based upon four rigorous studies, on average we’d expect HELPS to improve fluency and comprehension outcomes for about 35 out of 100 students.
Targeted Reading Intervention. Developed by Lynne Vernon-Feagans, a professor at the University of North Carolina at Chapel Hill. Using both the What Works Clearinghouse and Evidence for ESSA reviews, on average we’d expect TRI to improve phonics outcomes for about 20 out of 100 students.
Choosing the right interventions for your students
If they are doing their job, literacy interventions should help students accelerate mastery of targeted skills so that they can more fully benefit from Tier 1 whole class instruction. This starts with selecting the right literacy interventions for your students. (Keep in mind that there are also evidence-based practices – strategies or procedures – guided by specific principles that can flexibly be used in classroom instruction. Peer tutoring is one such example.)
I propose there are eight questions we all should ask when making this decision:
- Does this intervention target the specific literacy skill(s) I’m concerned about?
- Is it evidence-based (ESSA levels) and, if so, what is the evidence?
- How does this compare to the “typical” evidence-based intervention?
- Will this work for my targeted population?
- What will it require to implement this intervention with fidelity?
- What will it require to implement this intervention at scale (if necessary)?
- Do I believe the “return on investment” (impact relative to cost) is worthwhile?
- Would I put my own child in this intervention?
Just because an intervention is not yet proven does not mean it’s not valuable. It just needs to be studied. I believe there are two opportunities for us to consider in North Carolina.
First, can we work together across the state to rigorously evaluate interventions to better understand their impact for students in our schools? Second, can we work together to adapt and improve evidence-based interventions so that they can benefit more students across our state? Greater coordination across school districts on literacy interventions can help us do more faster.