Big Data Wins the War on Christmas



By Matthew M. Chingos, 12/19/2013


The holiday seasons of 2012 and 2013 in the education world have been dominated by the release of new international test-score data, and by the accompanying hand-wringing about the performance of the U.S., with advocates of every stripe finding a high-performing country whose existing policies match what they always thought the U.S. ought to do. Here at the Chalkboard we often take on the dangers of analyses that draw causal conclusions from correlational data, particularly when the analyst is free to keep mining the data until the desired pattern is revealed. There are certainly many real examples that illustrate this important point, but today I’d like to illustrate it with a frivolous example about the Yuletide, based on real data and analyses.

With one week to go before Christmas, most Americans are too busy going to holiday parties and shopping for last-minute gifts to worry about college ratings, teacher evaluation systems, and the other education policy issues of the day. Could the festive holiday spirit get in the way of putting students first? Or is it possible that a little holiday magic might increase student achievement?

I can’t prove that Santa is real, but with enough persistence can I come up with a rigorous empirical analysis that measures the causal effect of Christmas on student achievement? I began my search with the recently released PISA data for 2012. Do countries where Christmas is a public holiday, as listed in Wikipedia, score higher than countries where it is not? At first glance, the data do not look good for Kris Kringle. Education systems where Christmas is a public holiday score 3 points lower on PISA than other systems, a statistically insignificant difference.
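
(For the statistically curious: this first pass is nothing fancier than a two-group comparison of means. Here is a minimal sketch in Python; the file name and the score and xmas_holiday columns are hypothetical stand-ins, not the actual data.)

```python
import pandas as pd
from scipy import stats

# Hypothetical input: one row per education system, with its mean PISA
# math score and a flag (coded from the Wikipedia list) for whether
# Christmas is a public holiday there.
pisa = pd.read_csv("pisa_2012.csv")  # assumed columns: system, score, xmas_holiday
pisa["xmas_holiday"] = pisa["xmas_holiday"].astype(bool)

holiday = pisa.loc[pisa["xmas_holiday"], "score"]
no_holiday = pisa.loc[~pisa["xmas_holiday"], "score"]

# Welch's two-sample t-test on the difference in mean scores
t_stat, p_value = stats.ttest_ind(holiday, no_holiday, equal_var=False)
print(f"difference in means: {holiday.mean() - no_holiday.mean():+.1f}")
print(f"p-value: {p_value:.3f}")
```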

But then I remembered my colleague Tom Loveless’s warning about how PISA scores for China are likely misleading, with a selected group of kids from a few wealthy provinces included as separate systems and the rest of the country left out of the data. What if I only examine education systems that are entire countries? The situation starts to improve, with Christmas now estimated to raise PISA scores by 22 points! Unfortunately, this difference is still not statistically significant, which may not mean much to a layperson but means a lot to people who once took a statistics course in college.
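
(The same sketch, one filter later; the is_country flag is another hypothetical stand-in.)

```python
# Same test, restricted to whole countries (dropping subnational systems
# such as the Chinese provinces); "is_country" is an assumed flag.
countries = pisa[pisa["is_country"].astype(bool)]
holiday = countries.loc[countries["xmas_holiday"], "score"]
no_holiday = countries.loc[~countries["xmas_holiday"], "score"]
t_stat, p_value = stats.ttest_ind(holiday, no_holiday, equal_var=False)
print(f"difference in means: {holiday.mean() - no_holiday.mean():+.1f}")
```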

There must be a better, more accurate method to measure the effect of Christmas. Perhaps the analysis should be weighted by the inverse of the standard error of the test score for each country, to give more weight to countries with more precise estimates? Bingo! Adding this bit of rigor to the method produces an estimated “Christmas effect” of 48 points on the PISA test, nearly half of a standard deviation—and it is statistically significant at the 5 percent level!
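
(Concretely, this amounts to a weighted least-squares regression of scores on a holiday indicator. A sketch, again on the hypothetical data, with an assumed se column holding each country’s standard error; note that the weights follow the text, 1/se rather than the more conventional 1/se².)

```python
import statsmodels.api as sm

# Precision-weighted regression of score on the holiday indicator,
# with weights proportional to 1 / se, as described in the text.
# "se" is an assumed column: the standard error of each country's score.
X = sm.add_constant(countries["xmas_holiday"].astype(float))
wls = sm.WLS(countries["score"], X, weights=1.0 / countries["se"]).fit()
print(wls.params["xmas_holiday"])   # the estimated "Christmas effect"
print(wls.pvalues["xmas_holiday"])  # significant at the 5 percent level?
```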

But I knew that there are some grinches out there who will gripe and groan about omitted variable bias, so I needed to confirm that the effect was real by finding it in another data source. A time-series analysis just might do the trick! I gathered NAEP scores on fourth-grade math performance in the U.S. from 1990 to 2013 and linked them to data on how good a holiday season was had the previous year (operationalized as consumer spending in November and December, as measured by the ICSC-Goldman Sachs index).

Do students learn more in years when there’s more holiday cheer? The figure below shows that they do: student learning rises more or less in lockstep with the amount of holiday spending, a causal effect that is statistically significant at the 1 percent level. Standardizing the NAEP scores and putting the spending index on a logarithmic scale implies that if we could just have about 30 percent more holiday spirit, our students would do as well as those in Finland!

[Figure: NAEP fourth-grade math scores rise with prior-year holiday consumer spending, 1990–2013]
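
(The regression behind the figure is a garden-variety time-series OLS. A sketch under the same caveats, assuming two aligned hypothetical pandas Series: naep, the fourth-grade math scores by assessment year, and spending, the prior-year ICSC-Goldman Sachs index.)

```python
import numpy as np
import statsmodels.api as sm

# naep, spending: assumed aligned pandas Series (see note above).
# Standardize the NAEP scores and put the spending index on a log scale,
# then regress: z(NAEP) ~ log(prior-year holiday spending).
z_naep = (naep - naep.mean()) / naep.std()
log_spend = np.log(spending)
ols = sm.OLS(z_naep, sm.add_constant(log_spend)).fit()
print(ols.summary())  # slope: SDs of NAEP per one-unit rise in log spending
```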

Now I know that there will still be some Scrooges among you who will worry that all I’ve documented here are two correlated secular trends, especially because the consumer spending metric is not population-adjusted. Others will complain that I’ve degraded the true meaning of Christmas by measuring it as consumer spending, and that test scores do not fully capture student performance.

But I’d like you to consider the possibility that maybe, just maybe, there’s some holiday magic at work here.

—Matthew M. Chingos

This post originally appeared on the Brown Center Chalkboard blog.




Comment on this article
  • Ze'ev Wurman says:

    Beautiful!

    But not as beautiful as what Bill Schmidt has done in his 2012 Educational Researcher paper, where he attempts to prove that the Common Core standards are likely to cause higher achievement. First, he graded state standards in terms of alignment to (congruence with) the Common Core. Oops … not good. Many more laggards than leaders in the “most aligned with the Common Core” category. Then Schmidt tried to look at some notion of “focus.” Oops … that doesn’t look too promising either, particularly because states have improved since 1995, when he coined his “mile wide, inch deep” slogan. Perhaps “rigor” will save the day for the Common Core? Nope, rigor (aka “cognitive demand”) looks unpromising, about the same as Andy Porter found. What to do? What to do?

    But Schmidt is a statistician for a good reason (remember “lies, damn lies, and statistics”?). So, first, he separates out the states that have standards aligned with the Common Core and low NAEP achievement – they must be outliers, must they not? Hey, the results start looking good now! What if we added a pinch of cut-scores impact? That should work well, because higher-scoring states tend to have higher cut scores, so it will make the correlation stronger! (If you detect a whiff of circular thinking here, shame on you for having such a low level of trust.) Wow, the results look quite good now. Perhaps a teaspoon of socioeconomic status will help even more? After all, we can’t rely on data from poor states that need a lot of federal funds to prove Common Core excellence, right? (Again, if you happen to notice that this pushes down the Southern states, whose standards are very much like the Common Core but which lag on NAEP, perish the thought!)

    Victory! We have now scientifically proven that the Common Core standards are predictive of future success in mathematics! Even better, we can now publish our rigorous research in the flagship publication of the American Educational Research Association! Ain’t statistics wonderful? Life in the Academe is great!
