Hope Street Group Perspective

mCLASS – Helping or hurting our youngest readers?

About the authors

Courtney Sears is a second-grade teacher and Hope Street Group Fellow.

Kim Mellor works with teachers, students, and administrators as a literacy coach (grades 3-5), encouraging collaboration and reflective practice to enhance teaching and learning.

Join the conversation

  • mClass is aligned to the Common Core, and as a result the TRC is often age- and developmentally inappropriate.
    I can attest to this: my own third grader had fluency and Lexile scores akin to a fifth grader's, yet the TRC component score almost forced them into Read to Achieve. I reviewed the materials, and they were well above the level of a third grader, with dense passages and questions requiring kids to infer information from the text in a manner that was 'tricky' instead of straightforward.

    While mClass is a good formative tool, it’s still far from being an accurate one.

    • Evelyn K

      I have to agree with part of your statement. In our school district, our biggest problem is growing our most proficient students. When they take the TRC, or even the F&P assessments we used to give, there is only so far they can go before the questions become too complex and beyond their maturity to answer. I always remember the F&P book about the kid and his dad living in a van, and trying to get my little 2nd/3rd graders to answer questions dealing with the problems of the world. Though I want them to be thoughtful citizens and think globally, I also want them to enjoy the ability to be naive and imaginative. Bringing these larger-than-life problems to them doesn't give us a realistic view of their reading skills, only of their experiences and prior knowledge. I would recommend multiple text-set questions that align with grade-level CCSS so that we can get a true picture of the student.

      • Thank you for sharing that. I am not sure what district you are in – we're in Wake. I've found the reading materials for these exams, as well as in-class materials, to often be way over the heads of the students.
        On top of that, the materials are often things that don't grab the students' interest. I understand there needs to be a sort of vanilla flavor to them, but reading about the intricacies of how shoe leather is made puts an adult to sleep; imagine a 7-year-old's reaction!

  • Victoria Creamer

    The recommendations the authors share are reasonable and would be easy to implement. The idea of allowing teachers to interpret data to make instructional decisions is key.

  • GG

    mClass is a diagnostic tool that provides a snapshot of a student's reading behaviors. This one-day, one-book, one-moment picture of a child's reading must be placed side by side with the rest of our student data before an informed instructional decision can be made. To use mClass as if it were a summative tool is an incorrect use of the data it provides. The North Carolina Department of Public Instruction never intended mClass to be used to make high-stakes decisions regarding a child's report card, summer school, or retention. To do so would be a complete and utter misunderstanding of mClass's diagnostic nature. Let's hope this article spurs the conversation in such a way that districts will get back to capitalizing on the formative nature of mClass and allow teachers to reflectively interpret data in order to make sound instructional decisions.

    • Kristie Compton

      Almost all of your comments are right on and valid. BUT, DPI is now using this to make summer school decisions for 1st and 2nd grade Summer Reading Camp. All children in 1st grade with red/yellow DIBELS scores and all children in 2nd grade at TRC level L or below (red and yellow) are required to receive an invitation to summer school.

      • GG

        It is a conundrum! If mClass indicates a child's performance is red/yellow on DIBELS and/or, in grade 2, the TRC is at level L, AND all the other data the teacher has agrees, then of course it would be in the best interest of the child to receive summer support. However, using mClass alone as a high-stakes decision maker flies in the face of what DPI stated when mClass was first piloted in our schools.

        • Kristie Compton

          Yes!

  • Natalie Sayag

    I agree with and have experienced much of what the authors write about in this article. To add on…research proves that writing does not measure reading comprehension or a student’s ability to comprehend text. Writing is a completely separate process. It is quite limiting to be told to instruct a student on a lower reading level due to their needs in writing. The processes should be taught as intended – separately. I am not advocating that writing should not be used in the comprehension process, but it should be used more in the process FOR learning as opposed to solely in the assessment OF learning. I agree with the article and other commentary that states that mClass should be used formatively, as opposed to in a summative way. Thank you to the authors for starting this discussion! I really hope discussions like these help move teachers towards less limiting practices (i.e. groupings solely based on mClass designated levels) and more towards practices that look more wholly at student strengths, abilities, and needs.

  • Jo Kattz

    It will not make a good summative assessment. Many children can, as noted in this article, read and verbally answer questions about a text, thus demonstrating comprehension. In the primary grades, especially K & 1, students who are well above grade level in reading comprehension will fail the test because they must read the question without assistance, think about the text and develop a very specific answer (not just "Sally had a brown dog" but more like "Sally, who is Johnny's little sister, has a dark brown, curly poodle that she plays with every afternoon"), and, finally, write the answer using key details. That requires using many different skills simultaneously, most of which are newly acquired. Students can answer a question verbally, or even write an answer, IF the question is read to them. They can read the question and answer verbally. They often cannot do all three at once! In fact, this is difficult for some adults!

    I find mClass useful for looking at trends across the class. Do many students know word A but miss word B? I will adjust my approach to teaching phonics, sight words, and decoding skills to help them better read words. Are they rushing through a text and not thinking about it? Let's work on slowing down and thinking! Are there specific types of questions that many struggle to answer? Teach skills that will help them understand. I can also use mClass on an individual level, over time, to sort out the same data. It is not, however, a good tool to measure proficiency or decide whether a student should advance. Keep in mind many students fall under the "late bloomer" category: somewhat behind at first, then blossoming later in the primary grades. mClass may say they need remedial assistance and the district may hold them back but, in reality, the student is making progress and will be fine in the next grade level. A machine can never take the place of teachers, or their judgment. Machines are tools, not masters. Allow teachers to use this tool but don't use the tool to judge the teacher OR the student.

  • teacher teacher

    You summed up everything I have been saying for years beautifully!

  • RZ

    Why has no one, neither the authors nor the commenters, mentioned the absurd amount of time teachers spend out of the classroom administering these assessments? Each child must read at least two books during each assessment period. If each book, including the writing, takes 15 minutes, then you need a minimum of 30 minutes per child. So even if all 24 of your students took only 30 minutes, the classroom teacher is pulled away from valuable instruction for 12 hours! Of course, all teachers who have experience with TRC assessments know that some kids take much longer than 30 minutes and will go through more than just two books. Some students will actually go through four or sometimes even five different titles before finally settling on what this tool deems appropriate. So we all know it takes closer to 15-20 hours to complete a class of 24 students. Think about that… up to 20 hours out of the classroom to administer a test that gives you data that experienced teachers have quickly realized is useless and inaccurate. Now multiply that by 3, since we have to administer this test three times each year; that adds up to 60 hours that you are pulled from your classroom to administer a test! TRC is the worst thing happening in K-3 classrooms across North Carolina at this time. Even if you could convince me that there are a few valuable insights gained from our TRC assessments, nothing you say will convince me that it is worth losing 60 hours of instructional time with my students. Parents, teachers, and administrators alike need to stand together to put an end to TRC and over-testing in general!

    • Courtney Sears

      You make a great point! mCLASS is a time consuming assessment. We weren’t able to address time in this article but hope that it is a part of the conversations we are all having about how NC’s reading assessments could be better for kids and teachers.
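
RZ's back-of-the-envelope time math can be sketched as a quick calculation. This is an editorial illustration only, using the commenter's own figures (15 minutes per book, two books per child, 24 students, three benchmark windows per year), not official numbers:

```python
# Estimate of instructional time consumed by TRC benchmarking,
# using the figures from RZ's comment (best-case scenario).

MINUTES_PER_BOOK = 15    # reading plus written response for one leveled book
BOOKS_PER_STUDENT = 2    # minimum number of books per assessment window
STUDENTS = 24            # class size used in the comment
WINDOWS_PER_YEAR = 3     # benchmark periods per school year

minutes_per_window = MINUTES_PER_BOOK * BOOKS_PER_STUDENT * STUDENTS
hours_per_window = minutes_per_window / 60
hours_per_year = hours_per_window * WINDOWS_PER_YEAR

print(f"Best case: {hours_per_window:.0f} hours per window, "
      f"{hours_per_year:.0f} hours per year")
```

Even this best case comes to 12 hours per benchmark window (36 per year); RZ's observed 15-20 hours per window, reflecting slower readers and extra titles, is what yields the 60-hour figure.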

  • Miss Tina

    I teach third grade and I strongly feel we are spending WAY TOO much time "assessing" these kids and not doing enough teaching. I am responsible for testing 50 students 3 times a year with mClass. It often takes the entire testing window (and let's not forget portfolio tests, STAR tests, MAP tests, the BOG and EOG tests, and the wonderful Read to Achieve EOG). As a result, these kids are losing about 3 months' worth of valuable teaching and growing. How is that productive at all??? There are many holes in mClass, including flaws from the very people who trained us. I think it would be much more effective to use it for progress monitoring in RtI and maybe as a benchmark 1-2 times per year, but not 3. I also feel that these kids (still very young) should be able to test with their homeroom or core teacher. Since these scores are not used for our Standard 6, the teachers who teach them need to be the ones assessing them. WE need to hear and see what they can do, not some stranger. These young kids often freeze up and shut down because they are not used to the teacher who is testing them. So how does that prove their reading capabilities? I also disagree with kids being pushed by a clock to read fast enough or retell enough in 60 seconds. Some kids need to process things at a different rate. It is not fair for all children to be tested alike when they are all extremely different. We are also being criticized for having EOG scores that are significantly lower than our mClass scores, so we are treated like criminals who are cheating on the tests. We work our butts off to be treated horribly?!?! We have learned enough about mClass to become great "teachers of the mClass tests," but how do we provide 3-4 hour prep experiences for the "most important test"?
    Yes, we teach the CC standards. Yes, we prepare them for all the standards being tested, but when do we shut things down and practice stamina and endurance for that long and difficult test? WE DON'T! So how can we compare mClass, aka apples (made up of a short 3-minute DAZE, 6-minute DORF, and 15-minute TRC test), to the 3-4 hour EOGs, aka oranges?!?!?
    I fell in love with teaching because I LOVE making a difference in children’s lives. I enjoy seeing their lightbulbs glow! I enjoy challenging them and helping them become stronger and better. I NEVER IMAGINED THAT I WOULD BE WORKING AS A TEST GIVER AND TEST PREPARER. Where is the joy in that??

    • Peggy Grantham

      I totally agree with you! What I don't understand is why the mClass data isn't used as part of 3rd grade teachers' effectiveness measures. Only the Reading EOG is used for 3rd grade. I spent too much time last year progress monitoring and not enough time teaching.

  • Kim C.

    Couldn't agree more with this article and all the feedback/comments provided here. Not only is mClass a frustratingly inaccurate tool that lacks one key component, TEACHER INPUT, it is also a gateway to a status of proficient vs. non-proficient. In fact, in the district I teach in, it is the sole determining factor in whether or not you are invited to summer school. The impact of this is far-reaching. Not only does it send a mixed message to parents, it forces students who truly need the extra support during the summer months to share resources (manpower, technology, books, time, etc.) with students whom a qualified educator would view as proficient.

  • Dee Jackson

    I think it is funny that they chose to speak about only this one assessment. I wonder how many parents would be floored to know exactly how many different assessments their child takes in just one year. It is astronomical. The saddest thing about this practice is that it takes away from the time teachers have to actually teach their students. It also has a psychological impact on more and more students every year. It is beyond sad to see a third grader stressed to the point of physical sickness because of all the assessments they are expected to take. It breaks my heart that we are putting our students through this. This is truly not a representation of the real world and in no way prepares them for it. I have been in my job for 18 years and have not had to take or pass a single assessment to prove my knowledge or ability. I completely understand having benchmarks to show growth and areas of need. However, what we are doing to these students is beyond cruel and unusual punishment. The question is: what can be done about it?

  • GG

    Thank you for clearly and concisely mapping out a few of the pitfalls with mClass. When this tool is considered a summative assessment decisions for students and teachers are made that are based on a snapshot of information. MClass was never designed to be a teacher evaluation tool nor was it ever designed to make high stakes decisions about student report cards, summer school, or retention. It is to be used as a diagnostic tool that, with effective and reflective teacher input, leads to a focused instructional plan for students that will lead to stronger literacy development.

  • Courtney Sears

    Thank you for all of your comments! We believe in the power of teacher voices and are so excited by how many people have shared this article. Hopefully, this will serve as a springboard for many conversations across the state about high quality reading instruction and assessment for our youngest learners.

  • Kathie Guild

    Great article! There is too much testing in schools. Tests provide one snapshot, but not the entire picture of a student’s knowledge and growth.

  • BUSYMOM4

    School district leaders and administrators, please get the screens out of our kids' faces all day long. Students (of all ages) have exceeded the 2-hour screen limit set by the American Academy of Pediatrics by lunchtime! Our children do not learn more, or learn better, via screens, and yes, the format of information does matter. The research shows less abstract thinking, critical thinking, and inference/meaning-making when learning via screens. Students report more distractions, less focus, and less recall when learning via screens. Notes taken by hand are recalled better and over a longer period of time. The Bring Your Own Tech (BYOT) program is a joke at most schools, with students rushing through work to play games, check social media, or text each other. Of course a teacher cannot, and should not have to, be the "Screen Cop" in the classroom.

    • Courtney Sears

      Thanks for commenting! I want to clarify that students are asked to read traditional paper books for this assessment. Teachers use a device to record a student’s reading behaviors and responses.

      • BUSYMOM4

        Hi Courtney, yes, I know I veered off course a bit from the message of this article, but it all works together. Like I said, the mode and format of information does matter. "Now, it is ultimately an algorithm on a device that makes the final analysis." Schools are relying too much on tech software and devices to be the "magic bullet": to make time more efficient, to make learning more efficient, to give us data, and to reach our kids on some other level, and in the end it just doesn't work. The whole system is relying on unproven methods instead of what we all know works but takes more time and resources for our students. I may be preaching to the choir, but it makes me sad that "we" have allowed this to happen to our education system right in front of our faces. We've been sold a bill of goods by high-tech companies and school administrators. Ultimately there is a cost to relying on the screen to teach and entertain our children on a daily basis!

  • TTM

    One of the most destructive implementations in the history of NC education! Leave it to NC to jump on this bandwagon when so many other states abandoned this so-called assessment nightmare long ago! It is robbing teachers of instructional time and destroying children's self-esteem. mClass, RTA, and RtI are destroying literacy instruction in this state. Asinine politicians and ill-informed DPI decision makers should be held accountable for setting literacy instruction back thirty years while also destroying children's love of reading and teachers' love of teaching. Teachers are leaving this state for many reasons, but high on the list is the fact that they are no longer teachers but test administrators! NC DPI, the State Superintendent, school board members, and politicians should all be ashamed, embarrassed, and apologetic for the destruction… but that would first require them to get their heads out of the sand!!!

  • Debbie

    Excellent article. I teach second grade in North Carolina. My major problem with the written questions is that they do not use level/grade-appropriate language. The test measures whether a child can read the question, not comprehension. Thank you for writing this!