The “common” part of the Common Core is essentially dead.
No, not the standards themselves. With 42 states and D.C. still sharing the same standards, those are still pretty common. I mean the other promise of the “Common” Core, that parents, taxpayers, and policymakers would be able to compare schools across state lines.
That promise was always tenuous, and we were never that close to having full commonality anyway. From the beginning, the federal government funded two consortia, not one, to design assessments aligned to the Common Core (PARCC* and Smarter Balanced). Not all states joined the assessment consortia in the first place, and states slowly trickled away from them over time.
Still, this year was supposed to be the first where at least some states would share common assessments and common cut scores. But even these smaller groups can’t agree on common cut scores or common ways to share the results. EdWeek’s Catherine Gewertz has all the messy details of the latest delays and breakdowns. PARCC originally wanted to keep its cut scores private (!) and, although it has sample parent reports, states aren’t required to use them. The same is true for Smarter Balanced: Gewertz notes that “the 18 states that used the Smarter Balanced exam last spring are each reporting results their own way.”
So the political process to negotiate comparability is effectively dead. But where politics and compromise failed, leave it to American ingenuity to come up with a clever workaround.
It turns out we do have comparable test score results. They just didn't come the easy way. States aren't using the same tests, and they aren't using the same cut scores. But that didn't stop researchers Jacob Vigdor and Josh McGee* from creating SchoolGrades.org, the first site to offer a user-friendly, searchable database of nationally (and internationally) comparable school results.
SchoolGrades takes each state's own test results, on whichever particular test it uses, and makes three adjustments to create comparability. First, because states vary in how tough or easy they grade their schools, SchoolGrades adjusts state test scores based on how rigorous each state's test is compared to a national standard (NAEP). Second, because higher-poverty schools tend to have lower test scores, SchoolGrades adjusts school-level results based on the percentage of students who qualify for free or reduced-price lunch. Third, to put American schools in an internationally competitive context, SchoolGrades adjusts each school's results based on how well American students perform compared to international peers (using the PISA exam).
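To make the three-step idea concrete, here is a minimal sketch of what that kind of adjustment pipeline could look like. Everything in it is an assumption for illustration: the function names, the simple additive adjustments, the cut points, and all the example numbers are hypothetical and are not SchoolGrades.org's actual methodology or data.

```python
# Illustrative sketch only. The adjustment formulas, parameters, and numbers below are
# hypothetical assumptions, not SchoolGrades.org's actual methodology.

def adjust_for_state_rigor(school_score, state_proficiency_rate, state_naep_rate):
    """Shift a school's score by the gap between the state's self-reported
    proficiency rate and its NAEP proficiency rate (a rough rigor adjustment)."""
    rigor_gap = state_naep_rate - state_proficiency_rate
    return school_score + rigor_gap

def adjust_for_poverty(score, pct_free_reduced_lunch, slope=0.3):
    """Credit higher-poverty schools by removing the expected score deficit
    associated with the share of students on free or reduced-price lunch."""
    return score + slope * pct_free_reduced_lunch

def adjust_for_international_context(score, us_pisa_offset=-5.0):
    """Re-center the score against international peers using a PISA-based offset."""
    return score + us_pisa_offset

def letter_grade(score):
    """Map an adjusted score to an A-F grade using hypothetical cut points."""
    for cut, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cut:
            return grade
    return "F"

if __name__ == "__main__":
    # Hypothetical school: 75 on its state test, 40% of students on free/reduced lunch,
    # in a state whose test is easier than NAEP (85% state-proficient vs. 70% on NAEP).
    raw = 75.0
    step1 = adjust_for_state_rigor(raw, state_proficiency_rate=85.0, state_naep_rate=70.0)
    step2 = adjust_for_poverty(step1, pct_free_reduced_lunch=40.0)
    step3 = adjust_for_international_context(step2)
    print(f"raw={raw:.1f} -> rigor={step1:.1f} -> poverty={step2:.1f} "
          f"-> international={step3:.1f} -> grade={letter_grade(step3)}")
```

The point of the sketch is simply that each adjustment re-anchors a state-specific number to a common external benchmark (NAEP, poverty rates, PISA), which is what lets the resulting grades be compared across state lines.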
The result is a comparable, A-F grading system for all public elementary and middle schools in the U.S. The grades are equivalent across state lines, allowing me to objectively compare the school I went to as a kid in West Des Moines, Iowa, with the school my daughter will go to in a few years in Fairfax, Virginia. Parents can also look at how all the schools in their area compare.
To be clear, this is not the ideal situation. Statistical equivalencies are imperfect substitutes for having all students nationwide tested on the same content using the same test. Nor does it make sense for all states to create and administer their own tests, particularly in a world where 43 states have voluntarily chosen the same set of math and reading standards. But this is the USA we live in, and this is the best we have in our decentralized, federalist society.
So if you haven’t yet, go check out SchoolGrades.org and look up your local schools. While you’re at it, marvel at both our political obstinacy and the ingenuity of clever workarounds.
– Chad Aldeman
*Disclosures: PARCC is a Bellwether client, but I’m not part of the project team. Josh McGee works at the Laura and John Arnold Foundation, one of the funders of our work on teacher pensions. This first appeared on Ahead of the Heard.