Evaluation of Alternative Placement Systems and Student Outcomes

July 2014–March 2024

Research suggests that the standardized tests most institutions use to place students into developmental or college-level courses do not reliably identify which students need developmental education and which are better off starting in college coursework. CAPR's assessment study was designed to help colleges improve their placement practices so that students are assigned to the appropriate level of courses. It addresses two questions:

  1. How do alternative assessment and placement strategies affect students’ overall academic performance, persistence, and progress toward college degrees?
  2. What does it take to implement alternative assessment and placement strategies? What does it cost?

This study, initially funded for five years but extended to eight, is being conducted in collaboration with seven community colleges in the State University of New York (SUNY) system.

The Study Design

This random assignment study evaluates a placement method in which colleges use a multiple measures algorithm to predict students' performance in college-level math and English courses. In addition to placement test scores, the predictive measures may include high school GPA, time out of high school, and other relevant data points. Data on these measures were used to develop a predictive model, and each college then set a cut score to assign students either to college-level math and English courses or to developmental courses in math, reading, and writing.
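To make the mechanics concrete, here is a minimal sketch of such a placement algorithm. The feature set, the logistic regression, the invented historical data, and the 0.5 cut score are all illustrative assumptions; they do not reflect the actual models or cut scores the SUNY colleges used.

```python
# Hypothetical sketch of a multiple measures placement algorithm.
# The features, model choice, sample data, and 0.5 cut score are
# illustrative assumptions, not the model the SUNY colleges used.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical records: [placement test score, high school GPA,
# years since high school]; labels: passed the first college-level
# course (1) or did not (0).
X_hist = np.array([
    [55, 3.4, 0], [40, 2.1, 5], [72, 3.8, 1],
    [48, 2.9, 2], [65, 3.1, 0], [35, 2.4, 8],
])
y_hist = np.array([1, 0, 1, 1, 1, 0])

# Fit a predictive model relating the measures to course success.
model = LogisticRegression().fit(X_hist, y_hist)

def place_student(test_score, hs_gpa, years_out, cut_score=0.5):
    """Place a student using the predicted probability of passing
    the college-level course; the cut score is set by the college."""
    prob = model.predict_proba([[test_score, hs_gpa, years_out]])[0, 1]
    return "college-level" if prob >= cut_score else "developmental"

print(place_student(test_score=50, hs_gpa=3.2, years_out=1))
```

In practice, each college would fit a model of this kind to its own historical records and choose a cut score that reflects its own tolerance for placement errors.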

A total of 13,000 students entering one of the SUNY colleges between the fall of 2016 and the fall of 2017 were randomly assigned to be placed via either the multiple measures algorithm or existing placement practices. Students' college performance was initially tracked for three to five semesters following placement, and a follow-up study tracked them for five additional semesters, through spring 2021. The outcomes of primary interest in the first phase of the study were completion of the first college-level courses in math or English and total college-level credits earned. The extension allowed researchers to examine students' degree attainment at any SUNY institution, collect additional data on outcomes previously considered, and calculate more precisely the costs and cost-effectiveness of multiple measures assessment (MMA) and placement.
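As a minimal illustration of the assignment step, the sketch below randomizes a roster of student IDs between the two study arms. The even split, the fixed seed, and the absence of blocking by college are assumptions for demonstration, not the study's actual procedure.

```python
# Minimal sketch of random assignment to the two study conditions.
# The 50/50 split, fixed seed, and lack of blocking by college are
# illustrative assumptions, not the study's actual procedure.
import random

def assign_conditions(student_ids, seed=2016):
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    return {sid: rng.choice(["multiple_measures", "business_as_usual"])
            for sid in student_ids}

roster = [f"S{i:05d}" for i in range(13_000)]
groups = assign_conditions(roster)
print(sum(v == "multiple_measures" for v in groups.values()),
      "students assigned to the multiple measures group")
```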

What We’ve Learned

A CAPR working paper released in February 2017 discusses the history of college placement testing, the limitations of placement tests, the consequences of placement errors, and the movement toward multiple measures placement. It also describes several approaches to multiple measures assessment and placement.

In September 2018, CAPR researchers published a report on the SUNY random assignment study with promising early findings. Notably, students placed using multiple measures were more likely to be assigned to and complete college-level courses in their first term. Though multiple measures placement affected both math and English placements, researchers found that the effects were especially large in English. All seven colleges in the study successfully implemented the alternative placement system, even though doing so was more difficult than initially expected.

A report released in 2020 followed students in the study for up to five semesters. It found that multiple measures placement allowed many more students to be assigned to college-level courses. In math, gains in college-level enrollment and completion were small and short-lived. But in English, the effects were much larger and lasted through at least three semesters. Regardless of whether they were predicted to succeed, students did better when they were allowed to start in college-level courses.

A brief and a working paper released in October 2023 detailed the results of the long-term follow-up. After nine terms, researchers found that students who were bumped up into college-level math and English courses through MMA were much more likely to have enrolled in and completed a college-level course than similar students in the business-as-usual group. The benefits of MMA were likely driven by increased access to college-level courses rather than by improved placement accuracy.

MDRC’s Intervention Return on Investment (ROI) Tool for Community Colleges can be used to estimate the costs and revenues associated with multiple measures at a specific college or the average college in a state.

This research is supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305C140007 to Teachers College, Columbia University. The opinions expressed are those of the authors and do not represent views of the Institute or the U.S. Department of Education.