Common Core State Standards: Do You Need To Be Proficient In Order To Be Proficient?



By Paul E. Peterson | 07/12/2013


In response to the article on the disparity in state proficiency standards that Peter Kaplan and I published earlier this week, one reader, Scott McLeod, referred (in a comment) to an article arguing that “proficiency” as defined by the National Assessment of Educational Progress (NAEP) does not really mean proficiency. That article, by James Harvey, originally appeared on Valerie Strauss’s Answer Sheet blog.

Not to worry when states set much lower and widely divergent proficiency standards, McLeod implies. After all, as James Harvey argues in his article, NAEP didn’t mean “proficiency” when it used that word. NAEP was simply being “aspirational,” much like the mercenary preacher at the revival meeting who calls for a transformation of the heart and soul when he expects only a contribution to the collection plate.

Citing a study by Gary Phillips, Harvey says that most students around the world cannot reach the NAEP proficiency standard. But for the most part, Phillips compares the United States with countries in the developing world, not with peers at similar levels of economic and social development.

It is true that students in the United States show quite well when they are compared to the students of Brazil, Albania, Jordan, Peru, Colombia, Panama, Tunisia, Indonesia, and Kyrgyzstan. (Those were the lowest-performing of the countries that participated in the international testing administered by the Program for International Student Assessment (PISA), the official assessment system of the Organization for Economic Cooperation and Development.)

But comparisons with U.S. peers reveal a quite different story. As Eric Hanushek, Ludger Woessmann, and I show in our book Endangering Prosperity: A Global View of the American School (publication date: September 3, 2013), the United States, with 32 percent of its students proficient in mathematics, ranks 32nd among the political jurisdictions that take the PISA test. Here are just some of the countries with much higher math proficiency rates than the meager 32 percent garnered by the United States: Singapore (63%), Korea (62%), Finland (56%), Switzerland (53%), Canada (50%), New Zealand (47%), Germany (45%), and Australia (44%).

Too many people ignore these international comparisons and set low expectations for U.S. students and their schools. Unfortunately, even committees of the National Academy of Education (of which I am a member) and the National Research Council, a bureaucratic arm of the National Academy of Sciences, downplay the problematic state of American education.

But whatever one thinks of NAEP’s definition of proficiency, nothing in Harvey’s article begins to touch on the central point of our essay: U.S. states, by committing themselves to implement the Common Core State Standards, have promised to set standards benchmarked at international levels, while in fact they have put into place actual standards at diverse—and embarrassingly low—proficiency levels. If Apple were as inaccurate in its description of the iPad, it would be hauled before a federal judge for false advertising.

-Paul E. Peterson




Comment on this article
  • jean sanders says:

    There is support for Scott in the literature. I fear that the students you are working with at Harvard are being denied access to sound educational research. In building a separate “silo” of public policy, you are not introducing the students to the school effectiveness research. I would suggest that a graduate seminar be instituted in your program on the school effectiveness and school improvement research that has been reported in the literature since at least the 1970s. Sam Sava at the Kettering Foundation was one of our admired individuals. In my comments I look to other universities across the river, such as the work of Marilyn Cochran-Smith, of which your students do not seem to have any knowledge, or the work of Madaus at BC on the important economic issues with school effectiveness and what can be determined as direct instructional effects. I know that when I taught for 10 years in public schools before moving into administration, curriculum, and teacher preparation, there were students coming out of Harvard who wanted to go directly to Washington to work on policy without the experiences that I obtained. This is the separate “silo” program, and I don’t think you can layer on these professionals, computers on their laps, who don’t have a sound grounding in the research literature of the institution they are examining.

  • jeansanders says:

    PISA has been criticized in Italy (Cornoldi) because results were artifacts of the testing; southern Italian students were being penalized for speed/accuracy tradeoffs. If this is happening, you cannot compare their students with Finland’s, for example. The major problem is that you are making major generalizations from the NAEP or PISA data that would be inappropriate for policy decisions. In order to measure the academic achievement of students in Massachusetts, we have gone through iterative stages to align the curriculum with MCAS so that we have more sensitive outcome measures: outcomes more directly linked to the courses and curriculum the students receive (and actually attend). It is unfair to take that and then slap it on top of another state; neither is it just to have Massachusetts dismantle this process in order to conform with Arkansas. These are very important issues, and your information is being reduced to UPI headlines such as “teachers are pricey,” and that is why I have been objecting in my comments so loudly for the past year.

  • jean sanders says:

    quote: “If they looked at the literature they would find that student achievement is “unrelated” to the height of the skill bar set by the various states. To spend this time on highlighting standards by grading the states camouflages the issues that determine the quality of education in American schools, and that is poverty.”
    And it demeans teachers. Harvard is cheating its own students if they are unaware of the literature of educational KP&U and the decades of school reform….

  • jean sanders says:

    Don’t use Massachusetts to “bash” other states. Hanover Research reports that Massachusetts utilized NAEP-based standards and benchmarks for approximately ten years before adopting standards based on the Common Core State Standards. This is largely the reason for the ranking you prepared; notice that educational reform on these complex issues took ten more years and cannot be done overnight. I participated in the very first iterations of MCAS, which were reworked and revised. I do not believe that all states need to conform to the NAEP standards, and they should be encouraged to provide examples of standards (and correlative test measures) that are even better for the educational achievement of future students. Test data from NAEP 2011 don’t tell us what is happening today if we get into a “mixed-up” set of standards that are lowered to conform to one size fits all. NAEP testing data, just as MCAS testing data, do not provide a picture of the student’s trajectory through a developmental process or curriculum, because each cohort tested is a different group/population/sample. The third-grade class of 2011 is not the same third-grade class of 2013 because they are now older and, with such a large mobility rate, a large percentage have moved to other districts. Your grading system does not account for any of these factors.
