
Department of Unintended Consequences Redux: Teacher Postings

February 27, 2012

An article from a day earlier could have been titled “Big Oops”… it seems that the methodology for calculating “value added” was “too sensitive” for teachers whose students did either very well or very poorly. As a result, there were “73 cases in which teachers whose students produced consistently outstanding test scores — at or above the 84th percentile citywide — were nonetheless tagged as below average”… as the Naked Capitalism blogger would say, “Quelle Surprise!” Despite the well-known fact among education statisticians that value-added analyses distort scores at either end of the bell curve, parents of high-performing students were surprised:

For parents, seeing the rankings of the teachers they know well can be shocking. Vicki Kahn, a parent at Public School 333, the Manhattan School for Children on the Upper West Side, was surprised to see that some of the teachers whom she considers outstanding had poor ratings, including one who routinely sends many of her students to a highly selective middle school.

“It seems completely wrong,” Ms. Kahn said, adding that one of the co-teachers in the sixth grade did not even get a rating, in an apparent mistake.

Anna Rachmansky, whose son is a fifth grader at P.S. 89 in TriBeCa, was visibly stunned upon discovering that a teacher she held in high regard scored in the 10th percentile in math.

“I’m very surprised she would get poor in anything,” Ms. Rachmansky said. “She’s a very strong educator.”

At least one parent was not surprised:

Sandra Blackwood, the co-president of the parent teacher association at Public School 41 in Greenwich Village, said she had little confidence that the data would be meaningful, though she felt it was important to look.

“If it is anything like the school grading system,” she said, “it will probably be highly arbitrary.”

Though parents can get a peek inside school buildings for the first time to see differences among teachers, it does not help if the underlying information is incorrect, Elizabeth Phillips, the principal of P.S. 321, said.

“What people don’t understand is that they are just not accurate,” she said. “We are talking about minute differences in test scores that cause a teacher to score in the lowest percentiles,” like a teacher whom she finds great and who scored in the sixth percentile because her students’ average test score went from 3.97 to 3.92, out of a possible 4.

And that is not to mention one teacher who had test scores listed for a year she was out on child care leave, Ms. Phillips said.

“The only way this will have any kind of a positive impact,” she said, “would be if people see how ridiculous this is and it gives New York State pause about how they are going about teacher evaluation.”
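Ms. Phillips’ 3.97-to-3.92 example is worth pausing on, because it shows how the arithmetic works against teachers whose classes are already near the top. Here is a minimal sketch in Python, using invented numbers (a hypothetical, tightly clustered distribution of year-to-year changes in class averages, not the city’s actual value-added model), of how a 0.05-point dip can land a teacher near the bottom of the percentile rankings:

```python
# Hypothetical illustration: when class-average score changes are tightly
# clustered, a tiny dip can translate into a very low percentile rank.
# These numbers are invented for illustration; they are NOT the DOE's data
# or its actual value-added model.
import numpy as np

rng = np.random.default_rng(0)

# Suppose most classes' average scores move by only a few hundredths of a
# point from one year to the next (out of a possible 4).
score_changes = rng.normal(loc=0.0, scale=0.03, size=1000)

# One teacher's class average slips from 3.97 to 3.92, a change of -0.05.
teacher_change = 3.92 - 3.97

# Percentile rank of that change among all the simulated classes.
percentile = (score_changes < teacher_change).mean() * 100
print(f"A change of {teacher_change:+.2f} lands at roughly the "
      f"{percentile:.0f}th percentile")
```

Under those made-up assumptions the dip lands in the bottom few percentiles, not because the teacher did anything dramatically different, but because everyone else’s averages are packed so tightly around zero that any visible slip looks extreme… which is exactly the sensitivity Ms. Phillips is describing.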

I would be VERY surprised if the release of these ratings had any such effect, since so many political players, from Arne Duncan on down, have so much invested in this method working. Alas, no matter which party takes the White House, we are stuck with four years of value-added blather. As one who once believed in the promise of this kind of methodology, I am especially disappointed in how this is all playing out. Data-driven decision making CAN make a difference if the data driving the decisions is valid. But when bad data is used to make decisions, it can lead to bad results… just ask the people who believed the data coming from Arthur Andersen on Enron’s financial stability!
