Over the span of our collective years in education, we have heard many times that it takes a minimum of three years to get significant results from initiatives and reforms, with the best results occurring after the five-year mark. Yet, despite these numbers, we have witnessed a multitude of programs come and go in short order, usually in far less than five years. Just when educators truly understand and start seeing the payoff from an initiative, reform, or assessment, the next new “thing” replaces it. The new initiative is typically billed as the “golden ticket” that will close achievement gaps and create the conditions for all students to succeed. Meanwhile, the previous “golden ticket” is relegated to the dustbin of forgotten work.
One exception to this dizzying array of change in elementary education here in North Carolina is the use of the mCLASS assessment tool. mCLASS is an online platform that allows teachers to conduct running records and capture other key literacy information for students in grades K-3. The assessment is completed one-on-one and measures comprehension, fluency, and phonemic awareness skills.
The full implementation of mCLASS statewide just hit the five-year mark, which has allowed teachers and district leaders to begin to feel comfortable that this is an initiative here to stay. The consistency of using a specific assessment tool instead of jumping from one tool to the next is starting to pay off for teachers and students. The example of mCLASS has positive implications for education leaders becoming more open to giving initiatives the needed time to take root, blossom, and thrive.
Boosting Self-Efficacy (Elaine Miles)
The use of a formative assessment tool such as mCLASS is nothing new to most educators who have been teaching for a while. When I began teaching 19 years ago, the only assessment tools I had available were the formative assessments I created myself or purchased from a teacher resource supplier. Back then, reading instruction was done as a whole group, and I had very little sense of my students as individual readers. Then in 2000, the district I was teaching in moved toward a balanced literacy approach.
As a key part of that approach, we were taught how to conduct running records to gain formative assessment data about our students. I can remember how overwhelming this was as a classroom teacher. However, after a few years of using the tool, I became more knowledgeable about my students’ specific needs as readers.
While learning a new tool was overwhelming, sticking with an assessment for several years made me a better reading teacher and allowed my students to truly show growth as readers. One success story that stands out to me is a student who came to my fifth-grade class a full year below grade level. Once I conducted a running record that let me analyze the specific errors the student was making, I was able to teach him several strategies for self-correcting.
This concentrated work on a specific area allowed me to target instruction more closely, resulting in the student reaching grade-level expectations in comprehension and fluency. He learned to make meaning from reading by accurately and consistently applying word attack strategies.
This experience helped me to easily learn and adapt to mCLASS when I was part of the pilot for the assessment in 2008. Because I already had prior knowledge of conducting running records, using mCLASS to capture information about my students as readers seemed a more efficient and streamlined way to collect the data. As with any new digital learning initiative, there were “bugs” in the system to work out, but after a year of using the tool, I was able to overcome the digital learning curve and really analyze and use the data to guide my instruction. Consistently using mCLASS has contributed to my knowledge base as a reading specialist and has allowed me to work with the teachers I now coach to look at their student data in a more focused way.
The Impact of mCLASS on PLC Work (Mary Alicia Lyons)
As grade level teams examined their beginning-of-the-year mCLASS assessment data, they looked at the specific skills each student needed to learn next. Some children were struggling with decoding, some were disfluent, and some needed to develop their writing about reading skills. Within writing about reading, teachers broke the student needs down further. Some students could answer the question but did not provide enough details, some answered one part of the question but not the other, while others had not answered the question at all. This deeper analysis of students’ strengths and needs supported teachers in creating targeted goals for each student.
As we analyzed the data, I thought back to when the mCLASS assessment was rolled out statewide. In the first couple of years of our implementation, the data we gathered was used in a superficial way. Scores were taken at face value without drilling down into what caused children to score the way they did. We were not yet able to use the assessment information expertly to plan for targeted instruction. With several years of using the tool now under our belts, we are able to use mCLASS data to plan whole and small group reading instruction that is tightly aligned to the needs of our learners.
Getting past the nuts and bolts
Administrators sometimes underestimate the amount of energy and time that goes into learning a new assessment tool. When we started with mCLASS, for example, we provided training for each grade level on how to administer the different assessments. Kits had to be assembled, and copies of assessments had to be made. There were learning curves on how to sign in, which assessments needed to be given, how to score writing about reading papers, and a myriad of other things.
When assessments change frequently, more energy is expended on “getting to know how to do this new thing” than on effectively using any data that comes from it. While new teachers have to be trained from year to year, sticking with a tool means that a new teacher has experienced colleagues to turn to for support as they learn it. Those experienced teachers can also support a new teacher more specifically in using the data collected: helping them analyze results and make instructional decisions, such as grouping students with similar needs for the small group instruction component of the day. This allows for richer discussions in grade level teams, which, in turn, leads to improved instruction because teachers know exactly what their students need as readers.
Impacting instruction
While no assessment is perfect, the consistency of using mCLASS to measure student growth has allowed educators to use the tool to fine-tune and inform instruction. For example, before mCLASS, teaching students to write about their reading was done in a hit-or-miss way, despite ample research evidence that writing about reading significantly boosts reading comprehension.
In the first year of mCLASS implementation, the Text Reading Comprehension (TRC) portion of the assessment highlighted that even though some of our students were doing amazing work in their writing, those skills did not always transfer readily to writing about reading. Even some of our strongest readers were struggling to write adequate responses to questions about texts they read. As a result, we began having students write about their reading on a regular basis, but at first the approach was more “assign lots of writing about reading and hope they improve” than a systematic approach to teaching the skills students needed to learn.
Over the years, as we have had the opportunity to analyze student responses, we have developed our repertoire of skills for teaching children how to write about their reading and moved away from trying to teach through simply giving assignments. This instructional shift was supported by the power of using mCLASS and analyzing work over time. Had we shifted to a new assessment tool, we likely would not have deepened our repertoire of teaching strategies or seen such significant positive impacts on our students’ ability to write about their reading.
Moving the bar for expectations
We have seen the impacts of mCLASS statewide as well. In 2015, the most recent year for which data is available, North Carolina was one of only 13 states nationwide to see significant gains on its fourth grade NAEP (National Assessment of Educational Progress) reading scores. What makes that improvement even more stunning is that from 2002 to 2013, NC had been stagnant, with a score of 222 as the high-water mark. In 2015, the score rose to 226, moving us from 28th among states in 2013 to 15th in 2015.
One common initiative rolled out statewide during that period was the implementation of mCLASS, which introduced an emphasis on being able not only to talk about one’s reading but also to respond in writing to questions. In most other states, statewide assessments had for years included questions requiring students to write responses about passages they read. Prior to the addition of mCLASS, however, NC elementary students weren’t being asked to do this work on any of their state-required reading assessments.
This leads us to conclude that mCLASS has been an important factor in raising the bar for teachers and students, and in helping our long-stagnant NAEP scores finally shoot up.
Consistency is key
While we are addressing the use of one assessment tool over time, we are really addressing something much broader: consistency. Education is a field plagued with constant initiatives that appear to promise more than they can deliver. Often, new initiatives are not given enough time to show whether they can deliver the results we so desperately seek. Teachers’ self-confidence in their ability to teach, and their sense that they are making an actual difference in their students’ success, comes in part from being given the time to use, reflect on, and plan with stable curricula and assessments.
No one tool or initiative is perfect in and of itself. However, rather than reinventing the wheel with a new assessment or initiative, it is often more beneficial to gather teacher feedback to tweak and improve current practices. We hope that legislators and education leaders begin to value consistency and the positive impacts that staying the course has for teachers and students alike.