Revelations from the TIMSS

Half or more of student achievement gains on NAEP are an illusion



By Paul E. Peterson

Spring 2013 / Vol. 13, No. 2

Over the past two decades, 4th- and 8th-grade students have gained 1.6 percent of a standard deviation annually on the math, science, and reading tests administered by the National Assessment of Educational Progress (NAEP), known as the nation’s report card. An upward trajectory of 1.6 percent of a standard deviation per year cumulates over 20 years to 32 percent of a standard deviation, well over a year’s worth of learning. That striking result appeared in a recent report in this journal by Eric Hanushek, Ludger Woessmann, and me (see “Is the U.S. Catching Up?” features, Fall 2012).
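
The arithmetic behind that cumulation is simple to verify; the sketch below (in Python, purely illustrative, using only the figures quoted above) treats the annual gain as additive, as our report does:

    # Back-of-envelope check of the cumulative NAEP gain.
    # Figures come from the paragraph above; the cumulation is linear.
    annual_gain = 0.016   # 1.6 percent of a standard deviation per year
    years = 20            # two decades of NAEP results

    total_gain = annual_gain * years
    print(f"Cumulative gain: {total_gain:.2f} of a standard deviation")
    # Prints: Cumulative gain: 0.32 of a standard deviation (32 percent)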

Half those gains are probably an illusion, however. The latest results from the math and science tests administered by the Trends in International Mathematics and Science Study (TIMSS), the respected international testing agency, show gains of only 0.8 percent of a standard deviation yearly between 1995 and 2011. Further, another respected international assessment of student performance, the Program for International Student Assessment (PISA), found gains of only 0.5 percent of a standard deviation annually for U.S. students over roughly the same time period. (For specifics, see page 19 of our full report, Achievement Growth: International and U.S. State Trends in Student Performance [PEPG, 2012].)
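
Setting the three annual rates side by side makes the comparison in the next paragraph concrete. Again, this is a purely illustrative Python sketch using only the rates quoted above:

    # Annual gains, in percent of a standard deviation per year,
    # as reported for NAEP, TIMSS, and PISA in the text above.
    naep, timss, pisa = 1.6, 0.8, 0.5

    print(f"NAEP gain vs. TIMSS gain: {naep / timss:.1f}x")  # prints 2.0x
    print(f"NAEP gain vs. PISA gain:  {naep / pisa:.1f}x")   # prints 3.2x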

In other words, NAEP has been identifying gains that are somewhere between two and three times as large as those recorded by two respected international testing agencies that do not have a political stake in showing rising levels of student achievement in any particular country.

For some time, analysts have been wondering whether NAEP tests have become easier. Those who construct the main tests that NAEP administers frankly admit that they have adapted questions over time to meet the changing curricula offered by contemporary schools. NAEP has also introduced special accommodations for those who say they are in some way disabled and need additional time or other modifications of the standard testing protocol. Have testing changes and administrative innovations softened tests so that they now indicate higher levels of student achievement than would be the case if older practices had been retained?

It is well known that when measuring economic change it is critical to adjust for inflation so that real growth is not confused with nominal growth driven merely by rising prices. An entire bureau within the U.S. Department of Labor is devoted to measuring the extent to which prices for the same commodities are rising or falling. With that information at hand, economists can ascertain whether the economy is actually moving forward or whether nominal growth in GDP is simply the result of inflation.

Nothing similar exists in education. The U.S. Department of Education does not have an agency that inspects NAEP tests or state tests to ascertain whether questions on the tests have been eased with the passage of time.

It is remotely possible that TIMSS and PISA have revised their tests so that they have become more difficult over time, thereby underestimating U.S. student gains. But few believe that any testing organization in the late 20th and early 21st centuries has actually made its tests more challenging over time. All the social and political pressures operate in the opposite direction.

We do know one thing for certain: U.S. students are not closing the international achievement gap. Our study shows that even when measured by NAEP criteria, the United States stands at the 25th rank among 49 countries in achievement growth. Similarly, the recent TIMSS data show the United States to be the middle-ranked country among the 11 for which the organization could fully track student performance since 1995. U.S. students are making middling gains that are keeping them on par with students in other countries. In comparative terms, the United States is not making any progress at all.

— Paul E. Peterson




Comments
  • Morgan Polikoff says:

    Isn’t this the exact purpose of the long-term trend NAEP, which shows roughly similar gains to main NAEP? http://nces.ed.gov/nationsreportcard/ltt/interpreting_results.asp

  • Stephen says:

    I wonder if the bigger issue is the age at which children are being tested. The TIMSS evaluates 4th- and 8th-grade students, which roughly matches up with the NAEP evaluations of 9- and 13-year-olds. Between 1995 and 2007, 4th- and 8th-grade student performance improved on the TIMSS (by roughly 10-15% of a standard deviation), matching a several-decade-long trend of improvement in mathematics among 9- and 13-year-olds on NAEP. However, on the PISA (which tests 15-year-olds), the United States’ 2000 and 2003 scores in mathematics (483 in 2003) are not statistically significantly different from its 2009 performance (487). This seems to align with 17-year-old performance on the long-term trend NAEP, which shows that 17-year-olds have made NO GAINS over the past 30-40 years.

    It seems that both domestic and international testing regimes have documented improved performance in the lower grades in the US that completely washes out by the end of high school. I can’t find any research documenting why this might be the case; it seems like an important question to answer if we have in fact seen real performance gains that later wash out by the end of high school.
