Comparing PDK and Education Next Polls



By Paul E. Peterson | 08/21/2014


Just released this week are two major education polls, one by Education Next (EdNext), a journal of opinion and research, and the other by Phi Delta Kappan (PDK), a journal that serves the alumni of schools of education. Both survey nationally representative samples of the U.S. adult population. EdNext polls about 5,000 respondents, including a nationally representative sample of teachers, by means of an online survey administered by Knowledge Networks. PDK poses questions to about 1,000 respondents in a poll administered by Gallup.

(Caveat: Michael Henderson, Martin West, and I are responsible for the design, management and interpretation of the EdNext poll.)

Even before the polls were officially released, reporters asked me to explain differences in some of the findings. My answer, in a nutshell: except for small sampling errors, both surveys provide accurate information on the state of public opinion, given the questions that have been asked. When identical or very similar questions are posed, and the response categories are the same, both polls find much the same thing. For example, both surveys find that about 50% of Americans give their local schools a grade of A or B, but only about 20% give the nation’s schools as a whole one of those two grades.

Differences between the two polls derive from the questions that are asked and the way in which they are posed. Numerous topics are covered by one poll but not the other. EdNext asks the public (and teachers) to evaluate teachers at local schools. It identifies differences in support for teacher salary increases, depending on whether respondents know current salary levels in their state. It probes public support for teacher tenure and merit pay. It also asks a wider range of questions concerning school choice, gauging support for tax credits and vouchers for students attending failing schools, for instance. It looks at support for preschool programs and student readiness for college. Instead of these topics, PDK asks the public to identify the biggest problems schools face, their estimate of President Obama’s performance on education, and the appropriate division of responsibilities between levels of government.

Some topics are covered by both polls, however. It is with respect to these topics–the Common Core, charters, vouchers, international standards, and student accountability–that reporters have raised questions about seemingly conflicting results. Let’s take a look at the specifics.

Knowledge about Common Core State Standards: Essential Agreement

When it comes to the Common Core State Standards, both surveys suggest that about half the public has little, if any, knowledge of the initiative. In the EdNext poll, respondents are asked bluntly whether they have previously heard of the standards, without being given any specific information. 57% admit they have not heard of the Common Core. PDK provides more context when it asks whether the respondent had “heard about the new national standards for teaching reading, writing, and math in grades K through 12, known as the Common Core State Standards?” Even with this help, 53% of respondents say they have heard “only a little” (34%) or “nothing at all” (19%).

Support for National Standards: Different Questions Get Different Responses

On support for Common Core State Standards, the two surveys generate seemingly conflicting results. EdNext reports that 53% (or even as much as 68%) back national standards, but PDK seems to find that only 33% do. The reader may be excused for shaking his or her head, wondering why two polls asking about the same policy should get such different responses. The explanation is fairly straightforward: the two polls asked different questions.

EdNext split its sample in half, asking the first group to evaluate the Common Core and the second to evaluate the idea without identifying it as the Common Core. The two versions of the question were worded as follows, with the wording in brackets included only in the question posed to the first half:

As you may know, in the last few years states have been deciding whether or not to use [the Common Core, which are] standards for reading and math that are the same across the states. In the states that have these standards, they will be used to hold public schools accountable for their performance. Do you support or oppose the use of these [the Common Core] standards in your state?

When the phrase Common Core was used, 53% said they favored the standards; when the phrase was not used, support jumped to 68%. Opposition to the Common Core is limited to 26% of the public when the standards are so identified and to just 16% when national standards are mentioned without regard to the Common Core.

Clearly, the words “Common Core” are toxic. Even so, support exceeds opposition by a 2:1 ratio, and when the phrase is eliminated, the ratio leaps to more than 4:1.

So why does PDK find that only 33% favor the Common Core? Answer: It did not ask directly about support for Common Core, but rather, about the use of Common Core to guide teaching at the local level, namely:

“Do you favor or oppose having the teachers in your community use the Common Core State Standards to guide what they teach?”

Proponents of Common Core insist that nothing in the standards will dictate how and what curriculum is to be used locally, so they might object that PDK’s question is misleading. Still, the PDK question is of interest, because it tells us that opposition to Common Core is likely to rise if it is perceived as interfering with local curricular decisions.

(Methodological note: The EdNext survey asked a representative sample of the public whether or not they supported the Common Core, but the PDK poll asked its question concerning teacher use of the Common Core only of those who said they had some knowledge of the Common Core. The PDK report does not state whether those excluded from the question are just the 19% who said they had heard “nothing at all” or also the 34% who said they had heard “only a little.”)

Charter Schools: Essential Agreement

Responses to the charter school question create the illusion that PDK and EdNext obtain conflicting results, even though both surveys asked about charter schools in roughly the same way. PDK found 70% of the public in support of charters, with just 29% in opposition, while EdNext identified a lower level of support (54% in favor to 28% opposed). The difference is due to the fact that EdNext allows respondents to say they “neither support nor oppose” charters, an option selected by 18% of the total. PDK does not allow that option, though it does allow respondents to say they don’t know, an option selected by just 1%. If we ignore those who don’t take a side in the EdNext poll, support for charter schools stands at 66%, roughly the same as the 70% identified by PDK. Properly understood, the two polls show much the same thing—though it is of interest to learn that about a fifth of the public doesn’t appear to have a definite view.
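The adjustment here is a simple renormalization: drop the undecided respondents and recompute the shares among those who took a side. A minimal sketch, using the percentages reported above (the helper name is illustrative):

```python
# Renormalize EdNext's charter-school results after dropping the
# "neither support nor oppose" respondents, making them comparable
# with PDK's forced-choice format. Figures are from the article:
# EdNext reported 54% in favor, 28% opposed, 18% neither.
def renormalize(favor: float, oppose: float) -> tuple[int, int]:
    """Return (favor, oppose) shares among respondents who took a side."""
    decided = favor + oppose
    return round(100 * favor / decided), round(100 * oppose / decided)

ednext_favor, ednext_oppose = renormalize(54, 28)
print(ednext_favor, ednext_oppose)  # 66 34 -- close to PDK's 70/29 split
```

The same arithmetic explains the smaller gap on the diploma-test question later in the article, where 8% of EdNext respondents chose the neutral option.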

School vouchers: PDK Loads the Deck

PDK finds much less support for vouchers than does EdNext, but the wording of the PDK question is strongly biased against vouchers. Here are the two questions:

EdNext: A proposal has been made that would give families with children in public schools a wider choice, by allowing them to enroll their children in private schools instead, with government helping to pay the tuition. Would you favor or oppose this proposal?

50% favor, 39% oppose.

PDK: Do you favor or oppose allowing students and parents to choose a private school to attend at public expense?

37% favor, 63% oppose.

Notice PDK’s use of the phrase “at public expense.” People don’t like letting others do something “at public expense.” Support shifts upward when the public is asked about vouchers in a less pejorative manner.

International Tests: Public Pretty Much Knows the Truth

Does the public think the United States does well on international tests as compared to other countries? On this matter the questions are very different, but both polls show that the public pretty much understands how American students perform relative to those in other industrialized nations. EdNext asked the following:

“Please tell us your best guess for this question: A 2012 government survey ranked the math skills of 15-year-olds in 34 industrialized countries. With 1 meaning the best and 34 meaning the worst, what is your best guess of where American 15-year-olds ranked on this test?”

On average, the public guessed that the U.S. ranked 19th out of 34. The guess is on the high side, as the Programme for International Student Assessment (PISA) actually found that the United States was tied for 23rd place. Still, all things considered, that is a fairly accurate guesstimate.

PDK asked a more general question, as follows:

“How do you think students in the United States performed on this international test (PISA) as compared to students in other countries? Do you think that U. S. students scored higher than students in most of the nations, U. S. student scores were in the middle of the rankings of student scores, or U. S. students scored lower than students in most of the nations?”

To which 50% of the public responded “lower,” 46% responded “middle,” and only 3% said “higher.”

The PDK question does not make clear whether it refers to the math, science, or literacy PISA test, so it is hard to know what the “correct” answer is, as U.S. performance was higher on the literacy test than on the math and science tests. A further difficulty is that PISA was administered in 65 nations, including many in the developing world, and we don’t know whether the public, when answering the PDK question, was thinking of the developing world when asked about “most of the nations.” If the comparison is with the industrialized world (that is, the 34 members of the Organization for Economic Co-operation and Development, with which the United States ordinarily compares itself), then the public is pretty much correct to say that the United States is somewhere in the middle or below.

Student Accountability: Surveys Agree

When it comes to asking students to pass a test to get a high school diploma, both surveys find overwhelming support for the practice.

EdNext asked:

“In some states, students must pass a state exam before they are eligible to receive a high school diploma. Do you support or oppose this requirement?”

71% think this is a good idea, with only 20% opposed.

PDK asked:

“How supportive are you of the following tests… Tests used to determine whether a student can be awarded a high school diploma.”

About 80% are favorable.

The small difference between the two polls is probably due mainly to the 8% of EdNext respondents who said they “neither support nor oppose” the requirement. All PDK respondents took a position, as none are identified as saying they don’t know.

Conclusion

The EdNext and PDK polls are complementary, not conflicting, surveys of public opinion. When questions are worded in the same way, similar results are obtained. When questions are worded differently, the way in which the public responds to alternative phrasing provides instructive information about the nuanced way in which the public looks at issues. For example, EdNext identifies majority (but declining) support for Common Core, while PDK finds a strong majority opposed to using those standards to determine curricular and instructional decisions at the local level.

The two polls do differ significantly in the topics they explore. PDK provides information on how the public views the proper roles of each level of government and the challenges local schools face. EdNext covers public evaluations of teachers, knowledge of and support for school expenditures, and levels of support for teacher tenure, merit pay, preschool programs, readiness for college, tax credits, and vouchers for families with children attending failing schools. EdNext also identifies trends over time when shifts in public opinion are detected.

Readers can take their pick—or, better yet, check out both—in full confidence that it is not the survey methodology but the questions that shape the results.

—Paul E. Peterson



