
It’s not so simple

The release of the biennial “Nation’s Report Card” comes in an outpouring of digital charts, graphs, and words posted on national and state government websites. If it came in a box, it would bear the stamp:

Handle with care.

Since 1990, states have gotten a report card from the National Assessment of Educational Progress — NAEP, pronounced “nape” — based on scores from math and reading tests taken by 4th graders and 8th graders. For North Carolina, NAEP has provided an incentive to strengthen standards on its state-developed tests, while also pointing to the unfinished business of raising achievement and narrowing educational gaps across lines of race, ethnicity, income class, and region.

The 2015 scores gave North Carolina little to cheer about, and did nothing to advance any side’s agenda in the great debate over the future of its preK-12 schools. In math, North Carolina 4th graders scored 244, above the national average of 240; 8th graders matched the national average of 281. In reading, 4th graders had an average score of 226, five points above the national average; however, 8th graders, at an average score of 261, fell below the national average of 264. (For a full story on the NAEP scores in North Carolina, read Alex Granados’ report here.)

A recent report by the Urban Institute adds an instructive layer of analysis to the NAEP scores. “A leading concern is that NAEP punditry usually ignores differences across states in demographics and other student characteristics in test-score performance,” writes report author Matthew Chingos.

Chingos, using NAEP data from 2003 to 2013, adjusted states’ scores to take into account such factors as gender, race and ethnicity, special education, books in the home, family structure, and eligibility for free or reduced-price lunch — and measured how each state’s students performed compared with similar students in other states. He calculated the average of the four tests and reported the results in “months of learning” gained or lost.
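The adjustment Chingos describes can be illustrated with a simplified sketch. This is not his actual model — the Urban Institute analysis uses student-level data and many covariates — but a toy version with hypothetical states, a single covariate, and the rough rule of thumb that about 10 NAEP points equal one school year (roughly nine months) of learning:

```python
import numpy as np

# Illustrative only: adjust state average scores for one demographic
# covariate and report each state's residual in "months of learning."
# Assumption (not from the report): ~10 NAEP points ≈ one ~9-month
# school year, i.e. roughly 1.1 points per month.
POINTS_PER_MONTH = 10 / 9

# Hypothetical states: average 8th-grade score and the share of
# students eligible for free or reduced-price lunch.
states = ["A", "B", "C", "D"]
scores = np.array([284.0, 278.0, 281.0, 275.0])
pct_low_income = np.array([0.30, 0.55, 0.45, 0.60])

# Fit score = b0 + b1 * pct_low_income by ordinary least squares.
X = np.column_stack([np.ones_like(pct_low_income), pct_low_income])
coef, *_ = np.linalg.lstsq(X, scores, rcond=None)

# A state's residual measures how far it scores above or below
# demographically similar states; convert points to months.
residual_points = scores - X @ coef
residual_months = residual_points / POINTS_PER_MONTH

for state, months in sorted(zip(states, residual_months),
                            key=lambda pair: -pair[1]):
    print(f"State {state}: {months:+.2f} months vs. demographic expectation")
```

In this toy example, a state with a modest raw score can still “break the curve” — post a positive residual — once its student demographics are taken into account, which is the pattern Chingos reports for North Carolina.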

Including students in both public and private schools, Chingos found that “the states where students ‘break the curve’ (i.e., perform better than their demographic peers) are often not the states with high scores overall.” In his calculation, North Carolina broke the curve.

In 2013, the unadjusted NAEP scores of North Carolina students showed a learning gain of 0.6 months, ranking the state 26th among the states. The adjusted scores showed a gain of 2.7 months, lifting North Carolina to seventh among the 50 states: below Florida and Texas but above Virginia and all other Southern states. Chingos cited North Carolina and Oregon as the states with “notable shifts” up in the rankings. (He published a quick recalculation using the 2015 reports, but not the full database; North Carolina ranks eighth.)

And yet, North Carolina ranked lower — in fact, 36th — in the calculation of “change in state NAEP performance” over the 10 years from 2003 to 2013. Chingos cited North Carolina and New York as states with “above-average performance but below-average growth” in test results.

While heeding the many cautions against over-interpreting NAEP data, one can still draw a few perhaps-useful observations:

  • Amid the concern over too much testing, NAEP remains among the test regimens the country needs. It has limitations: not every student takes the tests, and the results give teachers no information to improve instruction of specific students. Still, as a uniform nationwide assessment, NAEP provides reliable comparative data on states and urban districts. NAEP serves as a check on states that would adopt weak standards and easy-to-pass tests to give the appearance of educational success.
  • Data are critical to credible policymaking; data-informed reform is usually superior to anecdote-driven decisions. And yet, data without analysis amount, as they say, to junk. Even NAEP data require discussion and context, conclusions tested in light of additional research and the real-life experiences of teachers and principals.
  • In explaining why he undertook to recalculate states’ NAEP scores, Chingos argues convincingly that knowing that Massachusetts and New Jersey rank at the top and that Mississippi and Alabama rank near the bottom does not tell everything about the effectiveness of their schools. Student demographics vary from state to state, and those demographics have a bearing on test scores. In much the same way, demographics vary within North Carolina, among its cities and towns, regions, and counties. North Carolina has its Massachusetts-like communities and its Mississippi-like places. It’s important for state policymakers to comprehend differences in in-state conditions and challenges.
  • It’s heartening that North Carolina “broke the curve” in the Urban Institute analysis. Such educational gains arise out of policy persistence, public investment, and leadership commitment over years. And yet, the NAEP results also indicate how far North Carolina still has to go. NAEP defines three achievement levels: basic, proficient, and advanced. About 70 percent of 8th graders reached only the basic level in math and reading.

Breaking the curve contradicts sweeping statements that North Carolina public schools are broken.

Still, the Nation’s Report Card reinforces the urgency of lifting more young North Carolinians to attain the education that prepares them to thrive in modern economic and civic life.

Here are more links to NAEP information and analysis:

The DPI press release on NAEP results

The National Center for Education Statistics official NAEP site

A commentary on the Urban Institute report in The New York Times

Ferrel Guillory

Ferrel Guillory is the Director of the Program in Public Life and Professor of the Practice at the UNC Hussman School of Journalism and Media, and the Vice Chairman of EducationNC.