Revelations from the TIMSS

Author/s: 
Paul Peterson
Year of publication: 
Spring 2013
Publication: 
Education Next
Volume/Issue: 
13(2)

Over the past two decades, 4th- and 8th-grade students have gained 1.6 percent of a standard deviation annually on the math, science, and reading tests administered by the National Assessment of Educational Progress (NAEP), known as the nation’s report card. An upward trajectory of 1.6 percent of a standard deviation per year cumulates over 20 years to 32 percent of a standard deviation, well over a year’s worth of learning. That striking result appears in a recent report in this journal by Eric Hanushek, Ludger Woessmann, and me (see “Is the U.S. Catching Up?,” features, Fall 2012).
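The cumulation is simple linear arithmetic over the reported annual gain:

\[
20 \ \text{years} \times 0.016 \ \text{standard deviations per year} = 0.32 \ \text{standard deviations},
\]

that is, 32 percent of a standard deviation over the two decades.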

Half those gains are probably an illusion, however. The latest results from the math and science tests administered by the Trends in International Mathematics and Science Study (TIMSS), the respected international testing agency, show gains of only 0.8 percent of a standard deviation yearly between 1995 and 2011. Further, another respected international assessment of student performance, the Program for International Student Assessment (PISA), found gains of only 0.5 percent of a standard deviation annually for U.S. students over roughly the same time period. (For specifics, see page 19 of our full report, Achievement Growth: International and U.S. State Trends in Student Performance [PEPG, 2012].)

In other words, NAEP has been identifying gains two to three times as large as those recorded by two respected international testing agencies (1.6 percent of a standard deviation annually, against TIMSS’s 0.8 and PISA’s 0.5), agencies that have no political stake in showing rising levels of student achievement in any particular country.

For some time, analysts have wondered whether NAEP tests have become easier. Those who construct the main tests NAEP administers frankly admit that they have adapted questions over time to match the changing curricula of contemporary schools. NAEP has also introduced special accommodations for students who report a disability and need additional time or other modifications of the standard testing protocol. Have these testing changes and administrative innovations softened the tests so that they now indicate higher levels of student achievement than older practices would have shown?

It is well known that when measuring economic change it is critical to adjust for inflation so that real growth is not confused with nominal growth in prices. An entire bureau within the U.S. Department of Labor is devoted to measuring the extent to which prices for the same commodities are rising or falling. With that information ready at hand, economists can ascertain whether the economy is actually moving forward or whether nominal growth in the GDP is simply the result of inflation.

Nothing similar exists in education. No agency within the U.S. Department of Education inspects NAEP tests or state tests to ascertain whether their questions have been eased with the passage of time.

It is remotely possible that TIMSS and PISA have revised their tests so that they have become more difficult over time, thereby underestimating U.S. student gains. But few believe that any testing organization in the late 20th and early 21st centuries has actually made its tests more challenging over time. All the social and political pressures operate in the opposite direction.

We do know one thing for certain: U.S. students are not closing the international achievement gap. Our study shows that even when measured by NAEP criteria, the United States ranks 25th among 49 countries in achievement growth. Similarly, the recent TIMSS data place the United States in the middle of the 11 countries for which the organization could fully track student performance since 1995. U.S. students are making middling gains that merely keep them on par with students in other countries. In comparative terms, the United States is making no progress at all.
