"That's How We Do It In Government"

Posted on March 29, 2012


At a recent meeting of the curriculum committee, a Hillsboro School District official presented a statistic that looked something like this:

  • Mean test score: 73%
  • Margin of error: 9%
  • Adjusted score: 82%

Now, for any of us in engineering or other professions that use margins of error, this looked distinctly odd. A “margin of error” represents the imprecision in a measurement, and inherently can show uncertainty in either direction. So a 73% with a 9% margin of error represents a range of 64%-82%. Why does it make sense to add the margin of error to the mean score? The answer, when I raised the question: “That’s how we do it in government.”
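The arithmetic here is simple enough to sketch out, using the numbers from the slide above (a hypothetical tighter margin is included to show the perverse incentive):

```python
# Numbers from the district's slide.
mean_score = 73.0       # mean test score, in percent
margin_of_error = 9.0   # margin of error, in percent

# A margin of error describes uncertainty in BOTH directions,
# so the true score plausibly lies anywhere in this range:
low = mean_score - margin_of_error
high = mean_score + margin_of_error
print(f"Plausible range: {low}%-{high}%")   # 64.0%-82.0%

# The "adjusted score" simply takes the top of that range:
adjusted = mean_score + margin_of_error
print(f"Adjusted score: {adjusted}%")       # 82.0%

# Perversely, a MORE precise measurement (hypothetical tighter margin)
# would LOWER the adjusted score:
tighter_margin = 4.0
print(f"Adjusted with tighter margin: {mean_score + tighter_margin}%")  # 77.0%
```

Note that the only way to raise the "adjusted score" without raising actual test performance is to make the measurement worse.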

This is a nice trick: it enables every statistic to be presented in the best possible light for promoting the success of public officials. It is also, in my opinion, inherently insane: it grants bonus points for the imprecision of the measurement. Normally, measurements with lower margins of error are seen as more valuable, because they give a clearer and more precise picture. But look at the scores above: if the district improved its sampling and lowered the margin of error, the “adjusted score” would actually be penalized! And if officials knew the true scores were going down, they could game the system by aiming to increase the margin of error rather than improving student knowledge. Is this the right way measurements should be done in our education system?

I can see how this would become the custom in government: once one official does it, everyone else has to follow suit, or else their statistics would appear inferior. Imagine if the district suddenly stopped “adjusting” these scores. “Look, in Hillsboro the scores went down 9% this year!” Any elected officials involved would see their opponents demagogue the issue, and the employees who stopped the adjustments would suffer for it.

Don’t take this post as a criticism of the particular official (Travis) who made this presentation, though: in fact, I am commending him. In a regime where this silly “adjusted score” must be produced, the most intellectually honest policy is to do what he did: present the actual source numbers alongside the final adjusted score, and let viewers see the full story. I’m happy to see our district doing this.

The big lesson: any time a government body reports an “adjusted” statistic, look very closely at the adjustment.

 

[Reposted from http://hillsboroerik.com .  See that blog for more comments on local education.]

 
