
ESSA Was Supposed to “Unleash a Wave of Innovation” in Metrics… It’s Fallen Short of the Mark

October 2, 2017

Education Week writer Darrel Burnette II posed this question as the title of a blog post over the weekend: “Why Did So Many States Choose to Use These Two Indicators?” The two indicators: chronic absenteeism and “college and career readiness.” The answer to the question Mr. Burnette poses should be obvious to anyone who’s read this blog: they are cheap and easy to measure. Data on absenteeism have been collected for eons, and determining what constitutes “chronic absenteeism” requires minimal debate. Data on “college and career readiness” are also easy to collect. Virtually all high schools track how many students applied to college and the percentage of students who graduated. Moreover, state departments could make a straight-faced argument that their graduation standards define “college and career readiness,” and thus graduation rates, which are also collected, could be used to determine that metric. Mr. Burnette, like me, was skeptical that ESSA would yield any imaginative metrics, as he noted in his post:

It’s an issue I wrote about a year ago as state departments started rejecting outright some pretty unusual and innovative ideas from parents and teachers about how best to measure their schools. This caused consternation and confusion amongst advocates who wanted to break away from heavy reliance on testing.

From my story:

One big issue: whether states and districts are able to retrofit their data-collection systems to answer new and increasingly difficult questions, a potentially arduous and expensive task.

For many measures, state officials say they lack the infrastructure to collect enough reliable information to attach high stakes. Many districts’ data-collection systems are scattershot and outdated. Scores of technicians responsible for processing data have been laid off in recent years amid budget cuts. And local superintendents have complained that they’re already required by states to collect an inordinate amount of data.

In addition, states must navigate a myriad of data privacy laws passed in recent years.

So what will become of ESSA’s promise to provide new and creative metrics? Mr. Burnette does not forecast any substantive changes, but he does note that ESSA requires the collection of new data points:

In the meantime, ESSA requires the collection and public reporting of several new data points, including student arrest rates, teacher experience and average pay, and school-by-school spending. While these data points will be collected and reported, schools will not be held accountable for disparities.

What do these data points have in common? They are easy to measure and cheap to collect… And what is the biggest problem with these data points? “…Schools will not be held accountable for disparities.” When it comes to arrest rates, teacher pay, and per-pupil spending, school districts shouldn’t be held accountable, because those factors are largely beyond their control. But, as I’m sure someone will find, there will be a correlation between these data and the affluence of the school districts.

