Considering Outcomes, Cost, and ROI in Developmental Education Reforms

By Dan Cullinan


CAPR research on developmental education reforms has provided the evidence that many colleges need to determine what kinds of changes can benefit their students and their institutions.

In developmental education and other areas ripe for reform, college administrators are better positioned than ever before to make decisions about adopting programs based on the effectiveness and cost of interventions. There is, however, a third piece of information critical to decision-making: the revenue an intervention could generate through increased student retention, coursetaking, and degree attainment.

It is rare for studies to estimate the amount of revenue generated by an intervention. Furthermore, colleges differ in their tuition rates, and states differ in their higher education funding formulas, making revenue estimates from one institution only partially applicable to other contexts. As a result, college administrators are often left without the necessary data to make fully informed decisions.

MDRC’s Intervention Return on Investment (ROI) tool is a free interactive web application that allows community college administrators to estimate the costs and revenues associated with implementing an intervention at their college. The tool uses customized inputs such as regional prices, college expenditures, tuition prices, and state funding models to generate tailored estimates. State policymakers can also use these forecasts to make the case for an intervention’s financial benefits to students and the state, and to identify funding gaps they can fill so that colleges are able to implement and sustain interventions.

How to Use the ROI Tool

  • Go to www.mdrc.org/intervention-roi-tool.
  • Click the Start button to begin.
  • Choose from 21 interventions, each evaluated with a randomized controlled trial, which come pre-loaded with their costs and estimated effects on student outcomes. You may also input information about an intervention of your choosing.
  • Click the Next button that appears on the left.
  • Select a state and college. The tool estimates the costs of the intervention based on regional prices and college expenditures, as well as the revenue generated by the intervention based on tuition prices and state funding models.

A video demo of the tool is also available.

The tool comes pre-loaded with 21 community college interventions—including comprehensive student support programs, developmental education reforms, financial aid reforms, learning communities, communication campaigns, and mentoring programs—that MDRC has studied in randomized controlled trials. Recently, two interventions studied by CAPR were added to the tool: Dana Center Math Pathways (DCMP) and multiple measures assessment (MMA).

DCMP focuses on developing and implementing accelerated developmental and college-level math pathways that are better aligned with social science, health, and liberal arts professions. These pathways allow community college students who place into developmental mathematics to complete a credit-bearing, transferable mathematics course within one academic year. Long-term findings show a sustained positive impact on students’ completion of their first college-level math course for five years after random assignment.

MMA assesses students for developmental education using not only standardized placement test scores but also other indicators of student preparation, such as high school records, noncognitive assessments of behaviors and attitudes, background questions, or additional test results. A combined analysis at 12 community colleges across three states (New York, Wisconsin, and Minnesota) shows that MMA improved academic performance when it was designed to allow students to bypass a developmental course they otherwise would have been required to take.
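To make the placement logic concrete, here is a minimal sketch of how a multiple measures rule might combine a test score with other indicators. The measures and cutoffs below are hypothetical illustrations, not the rules used at the CAPR colleges.

```python
# Hypothetical multiple measures placement rule, for illustration only.
# The measures and cutoffs are invented; they are NOT those evaluated
# in the CAPR study.
def place_into_college_level(test_score, hs_gpa=None, noncognitive_score=None):
    """Place a student directly into the college-level course if any
    single measure clears its (hypothetical) threshold."""
    if test_score >= 80:                                    # placement test alone
        return True
    if hs_gpa is not None and hs_gpa >= 3.0:                # high school record
        return True
    if noncognitive_score is not None and noncognitive_score >= 70:
        return True
    return False                                            # otherwise: developmental course

# A student below the test cutoff but with a strong high school GPA
# would bypass the developmental course under this rule.
print(place_into_college_level(test_score=65, hs_gpa=3.4))  # True
```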

DCMP and MMA helped students progress in college and saved them money. But what was the impact on college finances? Both interventions reduced developmental coursetaking, which lowered costs but also decreased revenue for colleges. In both cases, the revenue lost from fewer course attempts was almost as large as the reduction in costs to the college. The projected size of the gap between costs and revenues depends on the state in which the program is run.

As an example, let’s project the ROI of both programs at the average college in Kansas, a state outside the CAPR studies. We look at both direct costs and indirect costs. Direct costs are the expenses of implementing and sustaining an intervention, such as staff, office space, computers, and financial support to students. Indirect costs arise from changes in student behavior that result from an intervention; attempting more credits, for instance, might require additional faculty, staff, and classroom space, while attempting fewer credits produces savings. The net cost in this example is the sum of the direct and indirect costs.
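To make the accounting concrete, here is a minimal Python sketch of the arithmetic described above. It simply mirrors the definitions in this post; it is not the ROI tool’s internal model, which also draws on regional prices, college expenditures, and state funding formulas.

```python
def net_cost_to_college(direct_cost, indirect_cost, revenue_change):
    """Illustrative per-student accounting, mirroring the definitions above.

    direct_cost    -- spending to implement and sustain the intervention
    indirect_cost  -- cost of changed student behavior (negative = savings)
    revenue_change -- change in revenue to the college (negative = loss)
    Returns (net cost, net cost from the college's perspective).
    """
    net_cost = direct_cost + indirect_cost           # sum of direct and indirect costs
    from_college_view = net_cost - revenue_change    # a revenue loss widens the gap
    return net_cost, from_college_view
```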

Using the ROI tool, we find that DCMP’s direct cost to each college in Kansas would be $153 per student. Because DCMP students take fewer developmental courses, there is an indirect savings of $13 per student over five years, for a net cost of $140 per student. But the decreased developmental coursetaking also reduces tuition revenue by $12 per student in Kansas. The college would have to make up that $12 per-student loss, putting the net cost back at $152 per student from the college’s perspective.

MMA’s direct cost to each college would be $53 per student. Because MMA students take fewer developmental courses, there is an indirect savings of $107 per student over three years, so the net amount is actually a savings of $54 per student. Decreased developmental coursetaking also reduces tuition revenue by $102 per student. The college would have to make up that $102 per-student loss, putting the net cost at $48 per student for the college.
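Plugging the rounded per-student figures above into the illustrative helper sketched earlier reproduces the Kansas arithmetic:

```python
# DCMP in Kansas: $153 direct cost, $13 indirect savings, $12 tuition loss
print(net_cost_to_college(153, -13, -12))    # (140, 152)

# MMA in Kansas: $53 direct cost, $107 indirect savings, $102 tuition loss
print(net_cost_to_college(53, -107, -102))   # (-54, 48)
```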

There is an important caveat to these findings: Interventions may generate benefits to students and society that are not captured in a college’s costs and savings, so an intervention that costs a college money may still be worth implementing. This is certainly the case for DCMP and MMA, where students save money by taking fewer courses and also benefit from positive impacts on academic outcomes: a win-win!

The results of the ROI tool depend on a number of assumptions. For example, the default assumption is that the marginal cost of offering more (or fewer) credits is only one-third of the average cost. If that assumption were instead a marginal cost of 50 percent, the total cost of DCMP would be lower ($139 instead of $152) and the cost of MMA would be higher ($59 instead of $48). Furthermore, in states with performance funding, the positive impacts on student outcomes could trigger an increase in revenue that offsets some of the tuition revenue lost from reduced developmental coursetaking.
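For intuition about the marginal-cost assumption, the indirect effect of changed coursetaking can be thought of as the change in credits attempted valued at an assumed marginal cost per credit. The sketch below uses hypothetical numbers and a simplified formula; it will not reproduce the tool’s exact figures, which depend on college-specific expenditures and other components.

```python
def indirect_effect(credit_change, avg_cost_per_credit, marginal_fraction=1/3):
    """Illustrative only: value a change in credits attempted at an assumed
    fraction of the average cost per credit. The figures are hypothetical,
    and the ROI tool's full calculation includes additional components."""
    return credit_change * avg_cost_per_credit * marginal_fraction

# A reduction of two developmental credits at a $150 average cost per credit:
print(indirect_effect(-2, 150, 1/3))  # -100.0 (savings under the default assumption)
print(indirect_effect(-2, 150, 1/2))  # -150.0 (larger savings at a 50% marginal cost)
```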

MDRC’s ROI tool equips colleges and policymakers with data-driven insights to evaluate the financial and societal impacts of interventions. By assessing costs and revenues in the context of institutional goals and regional funding models, stakeholders can prioritize reforms that improve student outcomes. Policymakers can also use the tool to pinpoint funding gaps and advocate for resources that support impactful reforms. We encourage policymakers and college leaders to explore MDRC’s ROI tool and take actionable steps toward advancing equitable and effective improvements in student outcomes.