When the Best is Mediocre


Developed countries far outperform our most affluent suburbs



By Jay P. Greene and Josh B. McGee


Winter 2012 / Vol. 12, No. 1

View the Global Report Card
View the Methodological Appendix
Video: Jay Greene discusses the study
Podcast: Marty West interviews Jay Greene about the Global Report Card


American education has problems, almost everyone is willing to concede, but many think those problems are mostly concentrated in our large urban school districts. In the elite suburbs, where wealthy and politically influential people tend to live, the schools are assumed to be world-class.

Unfortunately, what everyone knows is wrong. Even the most elite suburban school districts often produce results that are mediocre when compared with those of our international peers. Our best school districts may look excellent alongside large urban districts, which is the comparison that state accountability systems encourage, but that measure provides false comfort. America’s elite suburban students are increasingly competing with students outside the United States for economic opportunities, and a meaningful assessment of student achievement requires a global, not a local, comparison.

We developed the Global Report Card (GRC) to facilitate such a comparison. The GRC enables users to compare academic achievement in math and reading between 2004 and 2007 for virtually every public school district in the United States with the average achievement in a set of 25 other countries with developed economies that might be considered our economic peers and sometime competitors. The main results are reported as percentiles of a distribution, which indicates how the average student in a district performs relative to students throughout the advanced industrialized world. A percentile of 60 means that the average student in a district is achieving better than 59.9 percent of the students in our global comparison group. (Readers can find all of the results of the Global Report Card at http://globalreportcard.org. The web site contains a full description of the method by which we calculated the results. For a summary, see the methodology sidebar.)
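
To make the percentile figures concrete, here is a minimal sketch in Python of how a district average might be mapped onto a global percentile. It is only an illustration of the general logic, not the authors’ actual procedure (see the methodology sidebar and the Methodological Appendix for that): it assumes achievement is normally distributed, that gaps can simply be added in standard-deviation units across the district-to-state, state-to-nation, and nation-to-global links, and the numbers in the example are hypothetical.

```python
# A minimal, hypothetical sketch of placing a district's average student on a
# global percentile scale. Not the authors' code; all inputs are illustrative.
from statistics import NormalDist

def global_percentile(district_vs_state_sd: float,
                      state_vs_nation_sd: float,
                      nation_vs_global_sd: float) -> float:
    """Each argument is a gap expressed in standard-deviation units.

    district_vs_state_sd : how far the district's average sits above (+) or
                           below (-) its state's average on the state test
    state_vs_nation_sd   : the state's gap relative to the national average
                           (e.g., via a common national test such as NAEP)
    nation_vs_global_sd  : the U.S. gap relative to the developed-country
                           comparison group (e.g., via an international test
                           such as PISA)
    Returns the percentile of the district's average student in the global
    distribution, under the normality assumption.
    """
    total_gap = district_vs_state_sd + state_vs_nation_sd + nation_vs_global_sd
    return 100 * NormalDist().cdf(total_gap)

# Example: a district 0.7 SD above its state average, in a state 0.1 SD above
# the national average, with the nation 0.25 SD below the comparison group,
# lands near the 71st percentile globally.
print(round(global_percentile(0.7, 0.1, -0.25)))  # -> 71
```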

For the purposes of this article, we focus on the 2007 math results, although the GRC contains information for both math and reading between 2004 and 2007. We focus on 2007 because it is the most recent data set, and we focus on math because it is the subject that provides the best comparison across countries and is most closely correlated with economic growth. Readers should feel free to consult the GRC web site to find reading results as well as results for other years.

The Example of Beverly Hills

It is critically important to compare exclusive suburban districts against the performance of students in other developed countries, as these districts are generally thought to be high-performing. The wealthiest and most politically powerful families have often sought refuge from the ills of our education system by moving to suburban school districts. Problems exist in large urban districts and in low-income rural areas, elites often concede, but they have convinced themselves that at least their own children are receiving an excellent education in their affluent suburban districts.

Unfortunately, student achievement in many affluent suburban districts is worse than parents may think, especially when compared with student achievement in other developed countries. Take, for example, Beverly Hills, California. The city had a median family income of $102,611 as of 2000, which places it among the 100 wealthiest places in the United States with at least 1,000 households. The Beverly Hills population is 85.1 percent white, 7.1 percent Asian, and only 1.8 percent black and 4.6 percent Hispanic. The city is virtually synonymous with luxury. A long-running television show featured the wealth and advantages of Beverly Hills high-school students (as well as their overly dramatic personal lives). If Beverly Hills is not the refuge from the ills of the education system that elite families are seeking, it’s not clear what would be.

But when we look at the Global Report Card results for the Beverly Hills Unified School District, we don’t see top-notch performance. The math achievement of the average student in Beverly Hills is at the 53rd percentile relative to our international comparison group. That is, one of our most elite districts produces students with math achievement that is no better than that of the typical student in the average developed country. If Beverly Hills were relocated to Canada, it would be at the 46th percentile in math achievement, a below-average district. If the city were in Singapore, the average student in Beverly Hills would only be at the 34th percentile in math performance.

Of course, people don’t think of Beverly Hills as a school district with mediocre student achievement. This is partly because people assume that affluent suburbs must be high achieving and partly because state accountability results inflate achievement by comparing affluent suburban school districts with large urban ones. According to California’s state accountability results, the average student in Beverly Hills is at the 76th percentile in math achievement relative to other students in the state. But outperforming students in Los Angeles, which is only at the 20th percentile in math relative to a global comparison group, should provide little comfort to Beverly Hills parents.

Los Angeles Unified is not the main source of competitors for Beverly Hills students, so the state accountability system encourages the wrong comparison. If Beverly Hills graduates are to have the kinds of jobs and lifestyles that their parents hope for them, they will have to compete with students from Canada, Singapore, and everywhere else. Beverly Hills students have to be toward the top of achievement globally if they expect to get top jobs and earn top incomes.

Results from Affluent Suburbs Nationwide

We can repeat the story of Beverly Hills all across the country. Affluent suburban districts may be outperforming their large urban neighbors, but they fail to achieve near the top of international comparisons (see Figure 1). White Plains, New York, in suburban Westchester County, is only at the 39th percentile in math relative to our global comparison group. Grosse Pointe, Michigan, outside of Detroit, is at the 56th percentile. Evanston, Illinois, the home of Northwestern University outside of Chicago, is at the 48th percentile in math. The average student in Montgomery County, Maryland, where many national government leaders send their children to school, is at the 50th percentile in math relative to students in other developed countries. The average student in Fairfax, Virginia, another suburban refuge for government leaders, is at the 49th percentile. Shaker Heights, Ohio, outside of Cleveland, is at the 50th percentile in math. The average student in Lower Merion, Pennsylvania, near Philadelphia, is at the 66th percentile. Ladue, Missouri, a wealthy suburb of St. Louis, is at the 62nd percentile. And the average student in Plano, Texas, near Dallas, is at the 64th percentile in math relative to our global comparison group.

All of these communities are among the wealthiest in the United States. All are overwhelmingly white in their population. All of them are thought of as refuges from the dysfunction of our public school system. But the sad reality is that in none of them is the average student in the upper third of math achievement relative to students in other developed countries. Most of them are barely keeping pace with the average student in other developed countries, despite the fact that the comparison is to all students in the other countries, some of which have a per-capita gross domestic product that is almost half that of the United States. In short, many of what we imagine as our best school districts are mediocre compared with the education systems serving students in other developed countries.

Pockets of Excellence

While many affluent suburban districts have lower achievement than we might expect, some districts are producing very high achievement even when compared with that of students in other developed countries. For example, the average student in the Pelham school district in Massachusetts is at the 95th percentile in math. That means that if we were to relocate Pelham to another developed country in our comparison group, the average student in Pelham would outperform 95 percent of the students in math. That’s very impressive.

Of course, Pelham is a small district that is home to Amherst College, among other institutions of higher learning, and serves a rather select group of students. But not all college-town school districts are equally high achieving. As we have already seen, Evanston, Illinois, is at the 48th percentile in math in a global comparison. Palo Alto, California, the home of Stanford University, is at the 64th percentile. And the average student in Ann Arbor, Michigan, home to the University of Michigan, is at the 58th percentile in math relative to students in other developed countries. So, the 95th percentile math achievement in Pelham is outstanding, even for college towns.

Spring Lake, New Jersey, has a similarly impressive record of having the average student at the 91st percentile in math. It is a very small and affluent community on the New Jersey shore that has somehow escaped the influence of Snooki and The Situation. Waconda, Kansas, a small rural community, also is at the 91st percentile. Highland Park, Texas, an affluent community near Dallas, is at the 88th percentile.

Interestingly, of the top 20 U.S. public-school districts in math achievement, 7 are charter schools (some states treat charter schools as separate public-school districts). And most of the 13 traditional districts remaining are in rural communities rather than in a large suburban “refuge” from urban education ills.

Pools of Failure

In total, only 820 of the 13,636 public-school districts for which we have 2007 math results had average student achievement that would be among the top third of student performance in other developed countries. That is, 94 percent of all U.S. school districts have average math achievement below the 67th percentile. There aren’t that many truly excellent districts out there.

Of the 13,636 districts, 9,339, or 68 percent, have average student math achievement that is below the 50th percentile compared with that of the average student in other developed countries. Most of our large school districts are well below the 50th percentile. This is especially alarming, because these lower-performing large districts comprise a much greater share of the total student population than do the relatively small higher-performing districts.
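
The shares quoted in the two paragraphs above follow directly from the reported district counts; here is a quick arithmetic check in Python, using only the counts cited in the text:

```python
# Quick arithmetic check of the shares cited above, using the district counts
# reported in the text for the 2007 math results.
total_districts = 13_636
top_third_districts = 820       # average student above the 67th global percentile
below_median_districts = 9_339  # average student below the 50th global percentile

print(f"Below the 67th percentile: {1 - top_third_districts / total_districts:.0%}")  # -> 94%
print(f"Below the 50th percentile: {below_median_districts / total_districts:.0%}")   # -> 68%
```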

The average student in the Washington, D.C., school district is at the 11th percentile in math relative to students in other developed countries. In Detroit, the average student is at the 12th percentile. In Milwaukee, the average student is at the 16th percentile. Cleveland is at the 18th percentile. The average student in Baltimore is at the 19th percentile in math relative to students in other developed countries. In Los Angeles, the average student is at the 20th percentile. The average student in Chicago is at the 21st percentile in math. Atlanta is at the 23rd percentile. The average student in New York City is at the 32nd percentile in math. And in Miami-Dade County, the average student is at the 33rd percentile in math.

Not one of the 20 largest school districts is above the 50th percentile in math relative to other developed countries. Those districts contain almost 5.2 million students, or more than 10 percent of the country’s schoolchildren. The rare and small pockets of excellence in charter schools and rural communities are overwhelmed by large pools of failure.

Previous Research

The Global Report Card is not the first analysis to compare the performance of U.S. students to international peers. Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann (see “Teaching Math to the Talented,” features, Winter 2011) used a very similar method to compare the performance of students in each state to students in other countries and arrived at similarly gloomy conclusions. Using state NAEP results for 8th-grade students and PISA results for 15-year-olds internationally, the researchers focused on the percentage of students performing at an advanced level in math. In almost every state, they found that we had far fewer advanced students than most of the countries taking PISA. They also narrowed the comparison to white students in the U.S. and to students whose parents had a college education to show that even advantaged students in the U.S. failed to achieve at an advanced level in math relative to their international peers. More recently, Hanushek et al. updated their analysis to examine the percentage of students in each state and across countries performing at the proficient level in math and reading.  The results were similarly disappointing.

The main difference between the GRC and the Hanushek et al. analyses is that in our study we push the comparison down to the district level. By focusing on white students and children of college-educated parents, Hanushek et al. clearly mean to convey that even students in elite suburban districts have mediocre achievement. Our contribution with the GRC is to name the districts so that people do not indulge the fantasy that their suburb’s record is somehow different from the disappointing performance of others with advantaged students in their state.

There are other important differences between the GRC and the Hanushek et al. analyses. We incorporate test results for U.S. students in all available grades (typically grades 3 through 8 and grade 10) rather than focusing on the grade closest to the 15-year-olds in the PISA sample. We could have focused only on 8th-grade results, as Hanushek et al. did, but in doing so we would have greatly reduced the number of test results on which we were doing the calculations for school districts. We preferred to gain precision in estimating the achievement in each district by increasing our sample size rather than restricting the sample to 8th graders in order to gain comparability in the age of the students under review.

The GRC analysis also differs from those of Hanushek et al. in that the latter focus on students performing at the advanced or proficient level, while we focus on the performance of the average student in both math and reading. Hanushek et al. concentrated on advanced or proficient performance because they were trying to compare our best students with the best abroad to show that even our best are mediocre. We did the same by highlighting the results for elite suburban school districts. Focusing on the average also avoids any dispute about how “advanced” or “proficient” is defined across different tests.

Gary Phillips at the American Institutes for Research has also conducted a series of analyses comparing state achievement on NAEP to international performance on a different international test, the Trends in International Mathematics and Science Study (TIMSS). Phillips arrives at somewhat less gloomy conclusions about U.S. performance, but that is because the countries included in TIMSS differ from those covered by PISA. Hanushek et al. rightly note that PISA provides a much more appropriate comparison for the U.S.: “Put starkly, if one drops from a survey countries such as Canada, Denmark, Finland, France, Germany, and New Zealand, and includes instead such countries as Botswana, Ghana, Iran, and Lebanon, the average international performance will drop, and the United States will look better relative to the countries with which it is being compared.”

This has sparked a debate among researchers about whether TIMSS or PISA provides a better set of countries against which to compare the U.S. The Global Report Card circumvents this dispute by developing its own set of countries against which we compare U.S. students. The comparisons provided by TIMSS and PISA depend on which countries decide to take each test each time it is administered. And PISA scales its scores against the results for members of the OECD, which excludes countries like Singapore while including countries like Mexico. Our comparison group depends on PISA results, but it is also based on objective criteria, like per-capita GDP, to identify a set of developed economies that can reasonably be compared with the U.S. economy. Our comparison group is a significant improvement on the self-selection of countries that choose to take a test as well as an improvement upon arbitrary membership in an organization like the OECD.

No Refuge

The elites, the wealthy families that have a disproportionate influence on politics, clearly recognize the dysfunction of large urban school districts and have sought refuge in affluent suburban districts for their own children. But the reality is that there are relatively few pockets of excellence to which these families can flee.

In four states, there is not a single traditional district with average student achievement above the 50th percentile in math. In 17 states, there is not a single traditional district with average achievement in the upper third relative to our global comparison group. And apart from charter school districts, more than half of the states have no more than three traditional districts in which the average achievement would be in the upper third.

The elites in those states have almost nowhere to find an excellent public education for their children. But state accountability systems and the desire to rationalize the lack of quality options have encouraged the elites to compare their affluent suburban districts to the large urban ones in their state. These inappropriate comparisons have falsely reassured them that their own school districts are doing well.

This false reassurance has also perhaps undermined the desire among the elites to engage in dramatic education reform. As long as the elites hold onto the belief that their own school districts are excellent, they have little desire to push for the kind of significant systemic reforms that might improve their districts as well as the large urban districts. They may wish the urban districts well and hope matters improve, but their taste for bold reform is limited by a false contentment with their own situation.

But the elites should not take comfort from the stronger performance of affluent suburban districts relative to large urban districts. As the Global Report Card reveals, even our best public-school districts are mediocre when compared with the achievement of students in a set of countries with developed economies.

Of course, the Global Report Card does not isolate the extent to which schools add or detract from student performance. Factors from student backgrounds, including their parents, communities, and individual characteristics, have a strong influence on achievement. But the GRC does tell us about the end result for student achievement of all of these factors, schools included. And that end result, even in our best districts, is generally disappointing.

Jay P. Greene is professor of education reform at the University of Arkansas and a fellow at the George W. Bush Institute. Josh B. McGee is vice president for public accountability initiatives at the Laura and John Arnold Foundation.




Comments
  • Anne Clark says:

    I only had to check the NJ Report card for the first 3 NJ towns on your “Best in Math” list to see that there are some serious flaws in your work. Is it because you assume normal distributions? Is it because you assume comparability in standard deviations? Is it because you don’t differentiate between proficient and advanced proficient? I’m sure you can find a reputable numbers person to help you answer these questions.

    I pulled 2009-10 NJ Report Card data for grades 3-8 for Spring Lake (#2 in the nation???, 91%), Cranbury (#9, 87%), and Princeton Charter (#15, 86%). This raw data shows they should be ranked in the exact opposite order you place them in.

    For grade 8, for instance, the percentages for Partially Proficient/Proficient/Advanced Proficient for the 3 schools are: Spring Lake = 22/29/49; Cranbury = 0/25/75; Princeton Charter = 4.4/11.1/84.4.

    I’m not going to waste my time digging up more numbers to prove that going down to the district level with this analysis obviously produces bizarre results. There are similar numbers for all 3 schools for all the other grades.

    Governor Christie’s local public school doesn’t even get a mention with 8th grade results of 2.5/24/73.6 – yet Bernards does with scores of 5.9/39.8/54.2

    Please, go back to your spreadsheets and reexamine the logic – or lack thereof – behind your methodology. You should have spent less money on fancy graphics, and more on PhD ed researchers.

  • Will Fitzhugh says:

    The Concord Review has shown, since 1987, that some of our public high school students, many doing independent studies, are as capable of writing exemplary history research papers as many of their peers from the other 38 countries from which our submissions have come. But their teachers and their curriculum usually do not afford time (the AP has no term paper) to work on such papers, leaving too many of our HS graduates unready for college reading lists and papers.

    Will Fitzhugh; fitzhugh@tcr.org; http://www.tcr.org

  • Jay P. Greene says:

    Anne — The results you list are from 2010, while the results we feature are from 2007.

    In addition, we use the NJ test information for our calculations, so the rank ordering of those districts in our results will mirror exactly those on the NJ test.

    Lastly, the key issue here is whether we are identifying high-achieving districts, not the exact ordering of the schools. The ordering may shift from year to year, but we are correct in identifying Spring Lake as a high-achieving district. According to the numbers you describe, 78% of Spring Lake students are proficient or better in math on the NJ test.

  • Anne Clark says:

    So when assigning a % to a district, you give a higher rating to a district with 10% Partially Proficient and 90% Proficient than to a district with 11% Partially Proficient, 29% Proficient, and 60% Advanced Proficient.

    And you want “elite” parents to take this study seriously?

    You don’t identify Spring Lake “as a high-achieving district”.

    You rank it #2 in the US.

  • Anne Clark says:

    From the homepage of the study:

    Ever wonder how your public school district stacks up when compared to the rest of the world? What about how your district compares to your state or even the nation?

    Your claim, not mine.

  • Anne Clark says:

    And how do you explain the weird numbers on this chart?

    http://www.globalreportcard.org/docs/Wealthiest-School-Districts-in-the-United-States.pdf

    It just so happened that the towns whose rankings differ by 25 just happened to have the same median household income?

  • Jay P. Greene says:

    Anne –

    Spring Lake is #2 as of 2007. You attempt to discredit that by showing that their ranking would be different in 2010.

    Obviously, the rankings among over 13,000 school districts will not remain identical over time. My point is that Spring Lake was near the top in 2007 and is still performing well in 2010, even if it is still not #2.

    So, if you want to see how your district stacks up, the GRC helps you do that. You just have to understand that results can change from year to year.

    And thank you for pointing out an error in the median household incomes pdf. In very large projects, there are occasional typos.

  • Lois Stoner says:

    The statement that all of the communities cited are overwhelmingly white is incorrect. Montgomery County, MD’s population is majority-minority as of 2010, and the school system enrollment has been majority-minority since at least 2000.

  • Anne Clark says:

    Alright – I’ll take your challenge and see how my district “stacks up”. Will I be “surprised”?

    Chester Township gets a 73% in math.

    So our JUST AVERAGE kid is better than 72.9% of the kids in the top performing nations of the world. Pretty good! Better than I expected. We have shifted the curve 23 points to the right. From what I have seen with ed research, a 23% shift to high achievement is out-of-this-world!

    But wait – how did you include the High School data? We’re part of a regional high school district with 2 high schools that serve 5 towns. Hmm…

    If I look at your “Best” list, it includes districts without High Schools.

    If I look at your “Wealthiest” list, it assigns countywide measurements to smaller census tracts. For instance, to you, Bethesda = Montgomery County.

    And – yeah – I know – it’s 5-year-old data.

    But how does my high school really stack up, and are you comparing apples with oranges?

    In your best in math list who else do you rank in NJ? Spring Lake is #2 in the country. Does it have a HS? Nope. Only about 250 kids in Pre-K-8.

    Cranbury Township – #9 in the US. HS? Nope. Pre K-8 with about 600 kids. Yep – that’s a district here in NJ. Really unbelievable.

    Who’s next on your list of best math in the US from NJ? Princeton Charter School at #15. HS? Nope – just 344 kids in K-8.

    Robert Treat? 500 kids in K-8.

    (Yep – charter schools are each their own district by NJ standards.)

    Watchung? About 750 kids in K-8. Its own district.

    Stillwater? 400 kids in Pre-K-6. Its own district.

    Bernards. Finally a K-12. And they are a great school district. But how did they get ranked above Millburn, which is sitting atop the latest NJ Monthly rankings?

    You give Millburn an 80%. You give Bernards an 82%.

    Millburn’s 11th grade math scores (I know 2010 vs. 2007) are Partially Proficient 4.7%/Proficient 26.4%/Advanced Proficient 69%. Wow! Bernards are 5.2%/38.8%/56%.

    So should I really be “stacking up” and paying any attention to your best of list?

    And even though these are incredibly wealthy places, they don’t make your list of “wealthiest” because you restrict it to towns over 50,000 – thereby excluding most of NJ. If you had made the list with 10,000 or more people, 9 of them would be in NJ – and that probably would have messed up your conclusion that “these 50 US school districts, on average, ranked behind nearly half of their international competitors in mathematics”. Lots of qualifiers in there.

    I’ve said more than enough. I don’t think US math performance is ANYWHERE near where it should be, even in our high performing districts. But such questionable work doesn’t help advance our cause when it is presented in this manner.

    You aim to excite the natural competitiveness in the audience this is aimed at – and the quality of the work doesn’t support such a comparison.

    Comparing Spring Lake to Montgomery County? Really?

  • [...] areas and on closing achievement gaps. Well, almost as if on cue, Jay Greene and Josh McGee write in Education Next about their new study on how suburban U.S. school districts compare internationally in math (based [...]

  • Nathaniel says:

    To Anne Clarke: Great comments (and work)…your analysis and conclusions are right on…

  • Imagineer says:

    The definition of “elites” is meaningless. I can’t imagine children of politicians doing very well. I think you need to map the density of scientists and engineers in an area. You will find a significant correlation to student performance. Also, I hate to say it, but the study needs to normalize for racial composition. It is my understanding that white children do pretty well in world comparisons. Not a great thing but interesting.

  • Ann F. says:

    As a resident of Evanston (IL), I can tell you (proudly) that neither the community nor the public school system here is “overwhelmingly white in their population,” as the authors state. Our K-8 district is 43% White, 33% Black, 15% Hispanic, 5% Asian, and the rest other races. The high school demographics are similar. There is great wealth here, to be sure, but 40% of students qualify for free/reduced lunch. Cities to the north, like Wilmette, Winnetka, and Glencoe, are far more racially and economically homogeneous.

  • everyonesfacts says:

    And this statement is bogus too:

    “The elites, the wealthy families that have a disproportionate influence on politics, clearly recognize the dysfunction of large urban school districts and have sought refuge in affluent suburban districts for their own children. But the reality is that there are relatively few pockets of excellence to which these families can flee.”

    Although one would guess that all these areas would have wonderful public schools, it is common at least in MA that the wealthiest towns lose a greater % of their school population than less wealthy towns. For instance Concord (where the Concord Review gets its name) loses something close to 25% of its students to private schools whereas Chelmsford, a middle class town loses about 2-3%.

    My guess is most of those “flee”ing from these schools are often among the top students and wealthiest families.
    Any word on this?

  • Anne Clark says:

    everyonesfacts – I have a question.

    In Massachusetts, how many of the “wealthiest” communities have over 50,000 people?

  • [...] well does that perception stand up to reality? Not very well, according to a new report, “When the Best Is Mediocre,” published by Education Next. Instead of subjecting a city or county’s educational [...]

  • [...] When the best is mediocre: Of course, the Global Report Card does not isolate the extent to which schools add or detract from student performance. Factors from student backgrounds, including their parents, communities, and individual characteristics, have a strong influence on achievement. But the GRC does tell us about the end result for student achievement of all of these factors, schools included. And that end result, even in our best districts, is generally disappointing. [...]

  • [...] Jay Greene’s Global Report Card and his chilling article in Education Next entitled “When the Best is Mediocre” (more on both of those in future posts, I’m [...]

  • [...] not exactly.  Jay P. Greene and Josh B. McGee have developed a first-of-its-kind “Global Report Card” that looks at how each U.S. school district performs in relation to its international [...]

  • Bev says:

    Although the article makes a compelling case that America’s schools need to improve, the authors did not base their conclusion on the actual school demographics of each area. As others have mentioned, these so-called ‘overwhelmingly white’ school districts are not. I live in White Plains, NY, and the high school demographics are as follows: 45% Hispanic, 32% White, 21% Black. Consequently, I have a hard time accepting the final analysis when the raw data used to produce the conclusion is fundamentally flawed.

  • Jay P. Greene says:

    Bev,

    White Plains is the 53rd wealthiest place in the US with a population over 50,000 (which excludes tiny places that may not be realistic suburban options). And the demographics of the city are “64.93% White, 15.91% African American, 4.50% Asian, 0.34% Native American, 0.07% Pacific Islander, 10.37% from other races, and 3.88% from two or more races. Hispanic or Latino of any race were 23.51% of the population.” I understand that the school demographics may be different, but there is no denying that White Plains is not a central city where we imagine all of our educational ills are concentrated.

    Leaving aside White Plains, among the school districts serving the 50 wealthiest places with populations over 50,000 the average math percentile is only 52.

  • Kate says:

    FYI, in Beverly Hills most of those “white” students are in fact Persian, and often the children of immigrants.

  • [...] it isn’t just students in the inner cities who are affected by underperforming schools. A study released just two weeks ago by Jay P. Greene and Josh McGee reveals that many schools in even the [...]

  • Peter Pappas says:

    The Organisation for Economic Co-operation and Development’s PISA test results have generated much discussion – is it a “Sputnik moment” or are international comparisons invalid? Rather than wade into that debate, I’d rather look more closely at the questions in the PISA test and what student responses tell us about American education. You can put international comparisons aside for that analysis.

    Are American students able to analyze, reason and communicate their ideas effectively? Do they have the capacity to continue learning throughout life? Have schools been forced to sacrifice creative problem solving for “adequate yearly progress” on state tests?

    Your readers might enjoy answering one of the PISA questions. It offers insights into the demands of higher-order thinking. Do American students learn how to sequence (higher-order thinking) or simply memorize sequences provided by the teacher?

    See my post for the question, answers, and PISA data – “Stop Worrying About Shanghai, What PISA Test Really Tells Us About American Students” http://bit.ly/tPE1YE

  • Joe Dantone says:

    I’m not an educator, but it’s clear that the main points of this study were that more money spent on schools buys the country nothing, that large districts are a good way to dumb down education based on results, and that we should take a look at charter schools or other alternative schools if we really want to educate the children of this country.
    I also find it incredible that someone would go to a great deal of effort to criticize the report using data from a different year and no one will speak up about it.
    If that is the attitude, it’s no wonder kids today are so ignorant.

  • Karen says:

    I’d like to see a comparison using private school data combined with public school data in these “wealthy suburbs.” How can you compare our international ranking (PISA) w/out using these numbers?

  • Steve says:

    The bottom line in all of this testing is this – in this country there are NO consequences for the individuals taking the test. In most cases there score on these tests means absolutely nothing as far as they are concerned.

    If we want to improve the education in this country, we need to find a way to make society value the education that they receive. I would be a rich man if I had $1 for every time a parent has said to me in a parent-teacher conference ‘This math you are teaching is way beyond what I had in school. I haven’t been able to help my son or daughter for years.’ If parents are conveying these feelings to their kids, how do you suppose that makes them feel about what they are doing in school?

    There are certainly things that we can improve in our educational process in this country. But we as a society are foolish to think that the problem lies only with the system. If we want to base our assessment of our educational system on high-stakes tests, then those need to be high-stakes tests for everyone involved, not just the schools that are administering them. Until there is something to be gained or lost by the students that take the tests, the results will not improve. Furthermore, is that what we really want in this country – kids who can do well on a high-stakes assessment of what they have supposedly learned in a handful of subjects?

  • Steve says:

    Please excuse my incorrect use of ‘there’ in my comment. I meant it to be ‘their’. I suppose that is what happens when commenting on an article with your 9 month old son on your lap!

  • Alex Griffin says:

    I see two serious flaws in the methodology of this study. The first, which others have pointed out, is the old data. Keep in mind No Child Left Behind was just being implemented when the data used in this report were captured. Until you’re able to use fresher data, I don’t think you can make the types of assumptions you’re attempting to make on what’s going on in our schools.

    The second flaw is that you are not comparing apples to apples when you compare American scores to scores from other countries. American schools educate EVERYBODY, and test EVERYBODY – the excellent, mediocre, and failing students. Other countries weed out the mediocre students, sometimes as early as age 8, so the students taking the math and language tests would be the cream of the crop. Therefore, you’re measuring all American students against the best of the best from other countries.

    What would be helpful is a comparison of how our best students stack up against other countries’ best students – and how our mediocre students compare to others’ mediocre students. To my knowledge, the data sets to do this type of comparison don’t exist. Until we better understand how other countries educate their kids, I don’t think we can have an informed conversation about how to educate our own.

  • PhilipMarlowe says:

    “What would be helpful is a comparison of how our best students stack up against other countries’ best students – and how our mediocre students compare to others’ mediocre students.”
    Excellent comment.
    We can wait for Dr. Jay P. Greene (who, despite a previous comment I left on his blog in March, does care about black students. Sorry, Jay) to mine that data for us.

  • [...] nationwide figures upon broader comparisons that have long demonstrated our mediocrity. Its authors give Beverly Hills as one example. It represents most affluent suburban districts, which Americans typically think contain great [...]
