Fulfilling the Promise of Community College: The ASAP Demonstrations

By Susan Dynarski and Meghan Oster | 12/05/2016



Community colleges, which enroll nearly 40 percent of undergraduates, have very low graduation rates. Only 20 percent of full-time community college students who seek a degree manage to graduate within three years. That rate rises to 35 percent after five years, but by then another 45 percent of degree-seekers have given up and dropped out of college.

How can we improve these troubling statistics? Early last year, MDRC, a respected research firm that specializes in evaluations of social policies, released the results of a randomized trial at the City University of New York (CUNY). In partnership with MDRC, CUNY tested an innovative program of wraparound support services known as Accelerated Study in Associate Programs (ASAP), which provides intensive academic supports and incentives for its students. The program requires that students enroll full time and attend advising and tutoring sessions. All of their financial need is covered, and they receive free textbooks.

The results of the CUNY trial were impressive: ASAP doubled the three-year graduation rate of students seeking an associate's degree (from 22 percent to 40 percent), while also increasing the share of students who transferred to four-year colleges to seek a BA (from 17 percent to 25 percent). And while the program increased annual costs per enrolled student (by about $5,400), it actually reduced the total cost of producing each additional graduate.
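The cost result follows from simple arithmetic: spending more per student can still lower the cost per graduate if the graduation rate rises enough, because cost per graduate is total spending divided by the number of graduates. A minimal sketch using the trial's reported graduation rates; the baseline per-student cost and the three-year program length are illustrative assumptions, not figures from the study:

```python
# Illustrative baseline three-year cost per enrolled student (assumed, hypothetical):
baseline_cost = 20_000
# ASAP added roughly $5,400 per enrolled student per year; assume three years:
asap_extra = 5_400 * 3

grad_rate_control = 0.22   # three-year graduation rate, control group
grad_rate_asap = 0.40      # three-year graduation rate, ASAP group

# Cost per graduate = cost per student / graduation rate
cost_per_grad_control = baseline_cost / grad_rate_control
cost_per_grad_asap = (baseline_cost + asap_extra) / grad_rate_asap

print(round(cost_per_grad_control))  # 90909
print(round(cost_per_grad_asap))     # 90500
```

Under these assumed numbers, the cost per graduate falls slightly even though spending per student rises substantially; the actual savings depend on the true baseline cost, which the study reports and this sketch does not.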

For both policy and science, replication of any finding is critical. Education research is littered with innovative programs that performed beautifully when tested by their inventor in small settings, but failed when expanded to a broader set of schools. Perhaps the ASAP trial results were a statistical fluke, never to be repeated. Or, more optimistically, the ASAP approach may work consistently at CUNY, but can’t be replicated at other colleges. Perhaps CUNY’s staff is especially adept at running this sort of intensive program, and its students particularly responsive to its supports.

The only way to determine whether ASAP can work in more community colleges is to test it. MDRC did exactly this, launching randomized trials of the ASAP approach at several community colleges in Ohio. Results from the first year of the pilots are positive. The Ohio programs increased enrollment intensity, with more students attending full time. They also boosted credit accumulation and persistence into a second semester. ASAP-style programs work not only in Manhattan but also in the Midwest.

In order to test the effectiveness of ASAP in other settings, MDRC and CUNY worked with partners in Ohio to develop, implement, and evaluate new wraparound programs similar to ASAP. Three Ohio community colleges participated in the trial: Cincinnati State Technical and Community College, Cuyahoga Community College, and Lorain County Community College.

Ohio also has a much more decentralized community-college system than does New York City. A central office at CUNY ran ASAP at the system’s many community colleges. In Ohio, by contrast, the programs are managed at the individual colleges (with support from a network of participating administrators and the Ohio Department of Higher Education). Could the ASAP approach work in a setting where individual colleges have almost complete autonomy over their academic and counseling services?

The ASAP program was intended to help students get over any barriers that lie between them and graduation. The barriers depend on the context, which is why CUNY and MDRC did not attempt to simply duplicate the ASAP system in Ohio. Instead, they tailored the programs to the setting.

In New York, students who attended all assigned mentoring and tutoring sessions received a MetroCard (worth about $100 a month), which gave them free access to the city’s extensive subway and bus network. Students at the Ohio colleges had access to a much more limited public-transportation network. The schools therefore decided to offer students a monthly $50 gift card that could be used for groceries or gas.

CUNY is an enormous system, and was able to reserve entire course sections for ASAP participants, thereby giving them priority for registration as well as a shared academic experience. In Ohio, the colleges reserved seats but not entire sections for treated students. The Ohio schools also enrolled students in existing, first-year seminar courses instead of creating a new course specifically for program participants. These changes show that ASAP programs can be worked into the existing community college curricula.

Students at the Ohio colleges differ from those in New York: they are older, more likely to have children, and more likely to be working. Again, it's an open question whether an ASAP-like program can work for such students, who tend to have even higher dropout rates than traditional college students. Study participants were recruited from Pell-eligible students who had indicated interest in enrolling full time at the participating community colleges. Participants were randomized into treatment and control groups before enrollment.

As in New York, all financial need of students in the treatment group was covered. The additional aid for the treated students was actually quite small, since in both settings the vast majority of students already had their full need met. Students received textbook waivers for the campus bookstore as well as the gift cards described above. Program participants also received early access to registration, valuable when courses are over-subscribed.

To receive these benefits, students had to enroll full time and participate in a set of advising activities. Students were required to enroll in a seminar on successful college strategies during their first semester. They were required to visit their academic advisor at least twice a month in the first semester, and then as needed in subsequent semesters. Students had to attend tutoring sessions if they were either enrolled in developmental education, identified by a faculty member as having difficulty, or on academic probation. Students had to meet with career-services staff (or attend a career event) each semester.

What were the results? Treated students were far more likely to enroll full time (85 percent vs. 67 percent). Note that all participants had indicated an intent to enroll full time. The results show that community colleges can make that intention a reality with intensive supports and incentives. Treated students both attempted and earned 1.5 more credits (about half a college course) than those in the control group during their first semester.

Effects persisted into the second semester, which is as far as the study has followed students (MDRC continues to track results and will release additional reports). Treated students were 12 percentage points more likely to enroll (82 percent vs. 70 percent) and 25 percentage points more likely to enroll full time (73 percent vs. 48 percent). Treated students attempted two more credits than the control students during the second semester.

MDRC, which has considerable experience in running trials in community colleges, says the programs in Ohio and New York are producing some of the largest effects they have seen in postsecondary education. This is particularly impressive given that the Ohio program cost an average of $3,000 per student, well below the $5,400 per student of the New York program.

Stakeholders in New York are now scaling up the program to include more students. This early evidence from Ohio indicates that the ASAP approach can be replicated in a different setting, with a different population. The Ohio colleges, in close consultation with CUNY and MDRC, successfully adapted the ASAP model of incentives, supports, and requirements to fit the needs and resources of individual colleges. It's especially promising to see the model succeed in a system with decentralized governance and with a population of non-traditional students.

The Ohio and New York initiatives did not hinge on new legislation or funding from states or the federal government. Foundations are paying for the Ohio pilot, for example, with the expectation that the colleges will assume the costs in later years. These partnerships among schools, and with foundations, provide a model for community colleges to launch programs that improve outcomes for their students, even in the absence of any new state or federal policies.

— Susan Dynarski and Meghan Oster

Dynarski, a nonresident senior fellow at Brookings, is a professor at the University of Michigan and a member of the board of MDRC. Oster is a graduate student in Higher Education, Research, Evaluation, and Assessment at the School of Education at the University of Michigan.

This post originally appeared as part of Evidence Speaks, a weekly series of reports and notes by a standing panel of researchers under the editorship of Russ Whitehurst.

The author(s) were not paid by any entity outside of Brookings to write this particular article and did not receive financial support from or serve in a leadership position with any entity whose political or financial interests could be affected by this article.
