PARCC and Massachusetts state exams predict college success equally well



May 17, 2016


Summer 2016 / Vol. 16, No. 3

Contact:
Ira Nichols-Barrer: (617) 674-8364, inichols-barrer@mathematica-mpr.com, Mathematica Policy Research
Amanda Olberg: (617) 496-2064, amanda_olberg@hks.harvard.edu, Education Next Communications


In math, PARCC’s college-ready cutoff score is set at a higher level than the MCAS proficiency cutoff

May 12, 2016—In a first-of-its-kind study, researchers from Mathematica Policy Research evaluate how accurately a “next generation” high school assessment designed for the Common Core, as compared to the state assessment in Massachusetts, predicts college success. This research, commissioned by the Massachusetts Executive Office of Education to inform the state’s decision this past fall as to which test to use going forward, provides important evidence for any state considering whether and how to upgrade its assessment systems.

In a new article for Education Next, Ira Nichols-Barrer, Erin Dillon, Kate Place, and Brian Gill report that scores on the Partnership for Assessment of Readiness for College and Careers (PARCC) exam and the Massachusetts Comprehensive Assessment System (MCAS) exam do equally well at predicting students’ success in college, as measured by first-year grades and by the probability that a student needs remediation upon entering college. But in mathematics, PARCC set a higher standard for college-ready performance than MCAS’ “proficient” standard—and meeting the PARCC standard provided a better indication of whether a student was prepared to earn a “C” grade in a college math course.

For the purpose of the analysis, more than 800 college freshmen were randomly assigned to complete one component of either the MCAS or PARCC exam, and college transcript data were collected for all students in the sample. Correlations between MCAS and PARCC scores and college grades are not statistically distinguishable. In both subjects, scores on both exams are at least as strongly correlated with college grades as the SAT, a widely used indicator of college readiness. MCAS and PARCC scores also do equally well at predicting which students will need remedial coursework in college.

The study also evaluates the utility of the cutoff scores that define performance levels on each exam. In math, PARCC’s cutoff score for college- and career-readiness is set at a higher level than the MCAS proficiency cutoff and is better aligned with what it takes to earn “C” grades in college math. Students at PARCC’s college-ready cutoff score in math have an 85 percent probability of earning a math “C” average or better, whereas students at the MCAS cutoff score for proficiency in math have only a 62 percent probability of doing so. This finding indicates that meeting the PARCC college-ready standard in math provides a better signal that a student is indeed prepared for college-level work than does achieving proficiency on the math MCAS. Differences between the two tests in the utility of the cutoff scores in English language arts are not statistically significant.

Because the underlying scores on the MCAS and PARCC assessments are equally predictive, the authors emphasize, the MCAS math exam could be better aligned with college readiness simply by setting a higher score threshold for college readiness. Since states using PARCC have discretion in setting their performance levels, it is important for policymakers in these states to note that PARCC chose appropriate thresholds for deeming a student “college-ready,” giving students good information about whether they are prepared to succeed in college courses.

To receive an embargoed copy of “Testing College Readiness: Massachusetts compares the validity of two standardized tests” or to speak with the authors, please contact Amanda Olberg at amanda_olberg@hks.harvard.edu. The article will be available Tuesday, May 17 on educationnext.org and will appear in the Summer 2016 issue of Education Next, available in print on May 23, 2016.

About the Authors: Ira Nichols-Barrer is a researcher at Mathematica Policy Research, where Erin Dillon and Kate Place are analysts and Brian Gill is a senior fellow.

About Education Next: Education Next is a scholarly journal committed to careful examination of evidence relating to school reform, published by the Hoover Institution at Stanford University and the Harvard Program on Education Policy and Governance at the Harvard Kennedy School. For more information, please visit educationnext.org.



