Grading the Teachers: Teachers in Richer Schools Score Higher on Value-Added Measure
Meant to gauge whether students learn as much as expected in a given year, value-added will become a key part of rating individual teachers from rich and poor districts alike next school year.
But a Plain Dealer/StateImpact Ohio analysis raises questions about how much of an equalizer it truly is, even as the state ramps up its use.
The 2011-12 value-added results show that districts, schools and teachers with large numbers of poor students tend to have lower value-added results than those that serve more-affluent ones.[module align="right" width="half" type="aside"]
This series about valued-added, a new way that Ohio is using to measure whether teachers provide a year’s worth of learning to their students, is the result of a partnership between The Cleveland Plain Dealer and StateImpact Ohio. StateImpact reporters Molly Bloom and Ida Lieszkovszky worked with Plain Dealer reporter Patrick O’Donnell and Plain Dealer data analysis editor Rich Exner to produce these stories.
- Overview: Using Data To Evaluate Teachers
- Pay vs. Value-Added Performance
- Secrets Of Two "Most Effective" Teachers
- Value-Added's Poverty Factor
- How is Value-Added Calculated?
- Audio: Measuring Performance Through Growth
- Audio: Push for Performance Pay
- Video: Guide to Ohio’s New Way of Evaluating Teachers
- Search Ratings For 4,200 Ohio Teachers
- Map: See Teacher Ratings at Your School
- Why We Published Teachers' Ratings
- About Our Analysis
[/module]As with most issues surrounding value-added, those results raise questions about whether the system is a flawed and unfair measure or one that points out a problem with real consequences for kids who fall further behind.
“If all we know is that value-added is lower in high-poverty schools, we can’t determine what explanation is correct,” said Tim Sass, co-author of a multistate study of value-added scores and school poverty levels. “Either your value-added is flawed, or teacher quality is related to the kinds of kids they get assigned.”
Sass, now a professor at Georgia State University, predicted that the differences in value-added calculation methods “are going to be a very big issue very soon.”
“As they get rolled out, along with the consequences attached to them, they’re going to get a lot of scrutiny,” he said.
Academic researchers have long agreed that more-affluent students outscore poor ones on tests of overall knowledge, in part because the more affluent generally have the advantage of starting school better prepared. Ohio’s state test results have followed that pattern for years.
Value-added is designed to disregard that head start on the learning curve. It looks only at how fast students are moving along that curve now, using reading and math test scores to measure progress for fourth- through eighth-graders.
Students who make the expected year’s worth of academic progress in a school year will score well for their schools or teachers. The Ohio Department of Education has pointed in recent years to charts showing that poor districts perform as well as more-affluent ones on value-added as a sign that the measure is working.
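The growth-versus-level distinction at the heart of value-added can be illustrated with a toy calculation. Everything here is hypothetical — the scores, the scale, and the flat "expected growth" constant. Ohio's actual formula, the SAS EVAAS model, is far more complex and not fully public.

```python
# Toy illustration of the idea behind value-added: measure growth, not level.
# Scores and the expected-growth constant are invented; this is not Ohio's formula.

EXPECTED_GROWTH = 20  # hypothetical scale-score points for "a year's progress"

def value_added(prior_score: float, current_score: float) -> float:
    """Positive = more than a year's growth; negative = less."""
    return (current_score - prior_score) - EXPECTED_GROWTH

# A low-scoring student who grows a lot beats a high scorer who coasts:
print(value_added(prior_score=400, current_score=430))  # -> 10
print(value_added(prior_score=520, current_score=530))  # -> -10
```

In this sketch, the student who started far behind still produces a positive result, which is why the state has argued the measure should be blind to a student's head start.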
“Value-added is not influenced by socioeconomic status,” said Matt Cohen, the chief research officer at the Ohio Department of Education. “That much is pretty clear.”
But a Plain Dealer/StateImpact Ohio analysis of value-added and economic data for districts, teachers and schools shows a clear trend:
While value-added does not rise with income or fall with poverty rates as much as it does with other academic measures, students — and consequently teachers — in richer schools still score better on value-added than those in poorer schools.
Consider that for the 2011-12 school year:
- Value-added scores were 2½ times higher on average for districts where the median family income is above $35,000 than for districts with income below that amount.
- For low-poverty school districts, two-thirds had positive value-added scores — scores indicating students made more than a year’s worth of progress.
- For high-poverty school districts, two-thirds had negative value-added scores — scores indicating that students made less than a year’s progress.
- Almost 40 percent of low-poverty schools scored “Above” the state’s value-added target, compared with 20 percent of high-poverty schools.
- At the same time, 25 percent of high-poverty schools scored “Below” state value-added targets while low-poverty schools were half as likely to score “Below.”
For both districts and schools, high poverty was defined as 75 percent or more of students qualifying for free or reduced-price lunches. Low-poverty districts and schools had 25 percent or fewer of their students qualifying.[module align="right" width="half" type="aside"]Value-Added Ratings
(from highest to lowest)
- Most Effective
- Above Average
- Average
- Approaching Average
- Least Effective
[/module]Students in high-poverty schools are more likely to have teachers rated “Least Effective” — the lowest state rating — than “Most Effective” — the highest of five ratings. The three ratings in the middle are treated by the state as essentially average performance.
In the highest-poverty districts, with 90 percent or more of students qualifying for free or reduced-price lunches, students were twice as likely to have a Least Effective teacher as a Most Effective one.
But the lowest-poverty schools — with less than 10 percent qualifying for free or reduced-price lunch — had seven times as many Most Effective teachers as Least Effective ones.
Whether these results mean that Ohio’s value-added measure is flawed, or that Ohio has a problem with less-effective teachers clustered in poor schools, is a matter of debate. It’s also one that researchers nationwide have tackled for the last several years as more and more states use different value-added calculations and see similar results.
Critics cite possible bias, other variables
Differences between richer schools and poorer ones have teachers concerned that their ratings could be biased. With districts starting to use value-added and teacher ratings to help determine pay or layoffs, those concerns are amplified.
“There are those inconsistencies in there that make it difficult to use for high-stakes decisions,” said Ohio Federation of Teachers President Melissa Cropper. “There are factors that go into a school day other than what happens on a test score.
“I can say that in these areas that have poor socioeconomic conditions, perhaps the students have achieved a year’s worth of growth but perhaps the students haven’t eaten breakfast for a week and are coming to school hungry, so that affects their ability to take the test that day. It doesn’t capture all these other variables.”
[module align="right" width="half" type="pull-quote"]“There are factors that go into a school day other than what happens on a test score."
--Melissa Cropper, Ohio Federation of Teachers president[/module]
Researchers and the creators of the multiple value-added models in use across the country disagree on how best to account for different variables, the most controversial of which are race and income.
All of the formulas use students’ previous test performance and compare it with their most recent tests. Ohio’s formula, developed by former University of Tennessee professor William Sanders, and another much-used system called the Colorado Growth Model are based on test scores only.
A second group of approaches, like those used in Florida and New York City, build factors such as poverty rate, parental education and attendance rates into the formula.
Ohio excludes from its value-added calculations some students who are not required to take the Ohio Achievement Assessments, such as those with severe learning disabilities or more than 60 unexcused absences. But other factors are not accounted for in the formula. SAS Institute Inc., the firm Sanders now works for, calculates value-added for Ohio using only students’ previous test results.
“By including all of a student’s testing history, each student serves as his or her own control,” SAS’ website reads, responding to criticism of the approach. “To the extent that SES/DEM [socioeconomic/demographic] influences persist over time, these influences are already represented in the student’s data. This negates the need for SES/DEM adjustment.”
The same brief argues that adjusting value-added for a student’s race or poverty level builds in the dangerous assumption that those students will not perform well. It would also create separate standards for measuring students who have the same history of scores on previous tests. And the brief says such adjustments can hide whether students in poor areas are receiving worse instruction.
“We recommend that these adjustments not be made,” the brief reads. “Not only are they largely unnecessary, but they may be harmful.”
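The "student as own control" idea in SAS's brief can be sketched as a prediction built only from each student's prior test history, with no demographic terms anywhere in the model. This is a simplified stand-in, not the EVAAS formula: the weights, the data, and the annual-growth constant are all invented for illustration.

```python
# Sketch of "each student serves as his or her own control": the expected score
# comes only from the student's own prior tests, never from race or income.
# Weights and the growth constant are hypothetical; EVAAS estimates its
# parameters from statewide data with a far more elaborate model.

def expected_score(history, weights=(0.6, 0.4)):
    """Predict this year's score from the two most recent prior scores."""
    recent = history[-2:]
    # Most recent score gets the larger weight; +20 is an assumed typical annual gain.
    return sum(w * s for w, s in zip(weights, reversed(recent))) + 20

def residual(history, actual):
    """Positive residual = the student beat his or her own-history expectation."""
    return actual - expected_score(history)

print(expected_score([480, 500]))   # -> 512.0
print(residual([480, 500], 515))    # -> 3.0
```

The point of the design is that any persistent effect of a student's background is assumed to be baked into the prior scores themselves, so no separate adjustment is made — which is exactly the assumption critics dispute.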
Washington, D.C., schools take different approach
Eric Isenberg, who helped develop the Washington, D.C., school district’s value-added model with Mathematica Policy Research, takes the opposite approach.
Isenberg made sure D.C.’s calculations had adjustments for English-language learners, special-education status and the need for free or reduced-price lunch. He even included each student’s attendance and absence percentage, to account for how motivated they are to succeed in school.
[module align="right" width="half" type="pull-quote"]“We’d see teachers that have all the wealthy kids look pretty good and teachers who are teaching all the poor and disadvantaged kids are going to look weak... That might not be a reflection of how well a teacher did.”
--Eric Isenberg, helped develop Washington, D.C. value-added model[/module]While he agreed with Sanders that a student’s past performance on tests is the best indicator of future performance, he said not including these other factors would penalize teachers for students’ backgrounds.
“We’d see teachers that have all the wealthy kids look pretty good and teachers who are teaching all the poor and disadvantaged kids are going to look weak,” he said. “That might not be a reflection of how well a teacher did.”
Dan Goldhaber, a University of Washington professor who researches performance measures, last year calculated how teachers would fare under multiple value-added formulas, using extensive data for teachers in North Carolina.
The exact differences varied by method, but the trend was clear: Teachers with affluent students had higher value-added scores than those with poor students, in both reading and math, in every method. The differences were much less — usually cut in half — when the socioeconomic backgrounds of students were included.
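Goldhaber's comparison can be sketched with two toy formulas: one that scores classrooms on gains alone, and one that also credits back an assumed average gain gap tied to classroom poverty. The classrooms, gains, and poverty penalty below are invented; neither formula is any state's actual model.

```python
# Toy version of the comparison: score-only value-added vs. an SES-adjusted one.
# All numbers are invented for illustration.

classrooms = [
    # (teacher, mean_gain, fraction_of_students_in_poverty)
    ("A", 24.0, 0.10),
    ("B", 18.0, 0.90),
    ("C", 26.0, 0.15),
    ("D", 17.0, 0.85),
]

EXPECTED = 20.0        # assumed statewide "one year" of growth
POVERTY_PENALTY = 4.0  # assumed average gain gap linked to poverty

def score_only(gain):
    return gain - EXPECTED

def ses_adjusted(gain, frac_poor):
    # Credit back the portion of the gap attributed to classroom poverty.
    return gain - EXPECTED + POVERTY_PENALTY * frac_poor

for teacher, gain, poor in classrooms:
    print(teacher, score_only(gain), round(ses_adjusted(gain, poor), 1))
```

In this sketch the gap between the low-poverty teacher A and the high-poverty teacher B shrinks from 6.0 points to 2.8 under the adjusted formula — a reduction of roughly half, echoing the pattern Goldhaber found, though his actual methods were regression-based and far more rigorous.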
His analysis did not include the Sanders model that Ohio uses because Sanders and SAS do not share their entire formula and Goldhaber could not re-create it.
He said he’d like to see SAS’ results compared side-by-side with a model that includes socioeconomic factors.
“If it’s true that the SAS system accounts for them, the results should be very similar,” Goldhaber said.
He was quick to note, however, that he and other researchers have not yet figured out how to tell whether lower value-added scores in high-poverty districts are due to the model, or to issues in the classroom.
“You can’t tell whether that’s because the model doesn’t account for things it ought to, or it’s because school districts that are educating poor students tend not to have quality educators with these classes,” Goldhaber said.
[module align="right" width="half" type="pull-quote"]“If you cancel out poverty in the model, how would you know if you’re ineffective with high-poverty kids? It may very well be that as a state we are making more progress with kids that are low-poverty than high-poverty.”
--Matt Cohen, Ohio Department of Education[/module]
Cohen, from the Ohio Department of Education, said he believes in Ohio’s model and that statewide data show only a tiny relationship between poverty and value-added results.
He doesn’t view disparities like those found in the Plain Dealer/StateImpact Ohio analysis as evidence of flaws in the calculations, but rather of flaws in how students are educated. He said adjusting for poverty would hide any shortcomings for those students that the measure may now be highlighting.
“If you cancel out poverty in the model, how would you know if you’re ineffective with high-poverty kids?” Cohen asked. “It may very well be that as a state we are making more progress with kids that are low-poverty than high-poverty.”
Cohen said he believes that any disparities are likely because of issues in the large urban districts or with how schools handle the very lowest-performing students — a population that is more pronounced in large urban districts. He said the state is still trying to determine exactly what those issues are.
Ohio’s largest districts have fared poorly
The members of the Ohio 8 coalition of the state’s largest urban districts, including Cleveland, have fared poorly under value-added in recent years. In the two years since the state last adjusted how it grades districts in value-added, Ohio 8 districts were scored “Below” standards eight times and “Above” them only twice.
[module align="right" width="half" type="pull-quote"]"We need to do a better job of supporting our kids. If we could do that, maybe the value-added scores of our kids would be higher.”
--David James, Akron school district superintendent[/module]
Cohen said the state’s new district and school report cards will start highlighting how the poorest students fare, making the performance and growth of the lowest-performing 20 percent of students a measured item. (Gifted and special-education students will also be highlighted.)
Cohen said those results may give some insight into why some schools perform so poorly.
Akron Superintendent David James, co-chairman of the Ohio 8, said his group has not focused on whether Ohio’s value-added formula should include poverty measures. James said the Ohio 8 pushed the state to include value-added in the first place and wants to keep it because it does give schools credit for helping students advance.
He said that the state should consider socioeconomic issues that urban students face and that he considers the Ohio 8’s low value-added scores “odd.”
But James quickly added, “I don’t want to use that as an excuse. We need to do a better job of supporting our kids. If we could do that, maybe the value-added scores of our kids would be higher.”