
A conversation on measuring teacher quality

Last May, UNC General Administration teamed up with SAS Institute and released a website to present decades of research on the quality of teachers and teacher programs in the UNC system.

The UNC Educator Quality Dashboard is open to the public for policymakers, deans, teachers — anyone — to navigate data on teacher preparation and effectiveness from different schools of education.

The dashboard is the result of one of seven recommendations from the UNC Board of Governors on how to improve teacher and school leader quality.

Education deans and experts from around the country who form the dashboard’s advisory committee met last Thursday to have a conversation on the dashboard — how to make information more accessible, what information it should include, and how to use the tool to make tangible change in education programs.

The dashboard is an effort to use research-based evidence to make informed changes.

“Now we’re going to try to institutionalize innovation,” said Gary Henry, who chairs the advisory committee and is a Rodes Hart Professor at the Peabody College of Education at Vanderbilt University.

Ideas floated around the room on how to expand and improve the dashboard, how to interpret the data, and how to apply the information to reality.

Context

Several committee members said a wider picture is needed to understand the data.

Ellen McIntyre, dean of the College of Education at UNC-Charlotte, said many teachers who appear to be successful upon graduation from education programs go on to teach at wealthy, already-successful schools. That fact, McIntyre said, isn’t shown in the data and is important.

To better compare schools of education, Cassandra Herring — dean of the School of Education and Human Development at Hampton University — said the dashboard needs to include a state aggregate.

Jim Wyckoff, EdPolicyWorks director and professor, suggested including bits of analysis and interpretation to help dashboard viewers.

“There is the real risk of either over-interpreting or people getting lost in this information,” Wyckoff said.

David Steiner, director of the Johns Hopkins Institute for Education Policy, added that being able to better filter data and understand what factors are driving data trends is important.

“I want to know as a dean or administrator, what’s random here? What is white noise?” Steiner said.

Wyckoff later pointed out a gap in context: teacher demographics. He said there is evidence that race matters in recruitment, placement, and several other areas.

Consistency

The data on the dashboard is presented using different metrics and graphics. Committee members wanted more consistency across the website.

“People have to relearn everything on each page,” Henry said. He said everything should be labeled, measured, and presented in a cohesive way, focusing on the audience.

McIntyre said a consistent approach to including data in the dashboard is needed. She said data on school culture, working conditions, and pass rates on teacher licensure exams are examples of lenses that could be added. The more information on education programs, McIntyre said, the better.

With more data points on a program, she said, a change would be better supported.

“If this and this and this are all low, we’ve got a problem. Then we start talking about dramatic change or closing a program or something,” she said.

Real change

The most heavily discussed question of the conversation was a basic one: What’s the point?

After talk about metrics, demographics, and data analysis, Wyckoff shifted the focus to the end goal.

“Should this be about motivation or about program improvement?” Wyckoff asked.

He said focusing on hitting some specific number, rather than fixing real problems, may miss the mark.

Steiner expressed frustration with the time scale of seeing real changes from data.

“There are real students getting lousy teachers,” he said.

Steiner said “the shame approach” — clearly showing education programs and teachers where they need improvement — often works.

“If it doesn’t have teeth, students are going to get the same lousy teachers,” he said.

Bill McDiarmid, distinguished professor of education at UNC-Chapel Hill, favored a different strategy.

“Having a hammer to hit people over the head with gets their attention, but it doesn’t necessarily help them figure out what to do next,” he said.

Henry added that, after presenting the information and motivating people to change, “the second layer is providing information about the nature of that change.”

“We have research-based problems that need to be solved, but we don’t necessarily have research-based solutions that are ready for us to use,” Henry later added.

Liz Bell

Liz Bell is the early childhood reporter for EducationNC.