SUNY Colleges Are Using Homegrown Multiple Measures Systems to Place Students During the Pandemic

By Elisabeth Barnett


It’s becoming increasingly apparent that traditional placement tests do a poor job of determining which students should be allowed to enroll in college-level courses and that multiple measures assessment and placement systems do much better. In particular, high school GPA has been found to predict success in college courses better than placement test scores do, possibly because it captures not only content knowledge but also the ability to be a competent student. Now that colleges have moved most of their operations online in response to the pandemic, there is extra motivation to find alternatives to in-person placement tests.

New research findings

Our new CAPR report, entitled Who Should Take College-Level Courses? Impact Findings From an Evaluation of a Multiple Measures Assessment Strategy, shares findings from a multi-year experimental study of a data-analytics-based multiple measures assessment and placement system. Though they’re still not perfect, we found that multiple measures systems do a better job of placing students than standardized tests alone.

Beginning in 2015, CAPR obtained historical student data—such as high school GPAs, ACCUPLACER scores, and time since high school graduation—from each college in the study and used it to develop an algorithm that would predict success in college-level math and English courses. Students entering in fall 2016, spring 2017, and fall 2017 were randomly assigned to be placed using either traditional methods (usually scores on placement tests) or the new algorithm. After three semesters, it was clear that the multiple measures system gave many more students a chance to take college-level courses, especially in English. Program group students completed a college-level English course at a rate 2.9 percentage points higher than that of their control group peers after three semesters. Improvements in college math outcomes in early semesters did not persist.
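To make the design concrete, here is a minimal sketch of how a placement algorithm of this kind could work: a model is fit to historical records (high school GPA, ACCUPLACER score, time since high school graduation) and incoming students are placed according to their predicted probability of passing a college-level course. The logistic regression, toy data, and 0.5 cutoff below are illustrative assumptions, not CAPR’s actual algorithm.

```python
# Hedged sketch of a data-analytics placement rule of the kind described above.
# The model form, sample data, and cutoff are illustrative assumptions,
# not the actual CAPR algorithm.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical records: [high school GPA, ACCUPLACER score, years since HS graduation]
X_hist = np.array([
    [3.4, 78, 0], [2.1, 55, 4], [3.8, 92, 0], [2.7, 60, 1],
    [3.0, 70, 2], [1.9, 48, 6], [3.6, 85, 0], [2.4, 52, 3],
])
# 1 = passed the college-level English course, 0 = did not
y_hist = np.array([1, 0, 1, 1, 1, 0, 1, 0])

# Fit a predictive model on the historical data
model = LogisticRegression().fit(X_hist, y_hist)

def place_student(gpa, accuplacer, years_since_hs, cutoff=0.5):
    """Place a student based on predicted probability of passing the course."""
    prob = model.predict_proba([[gpa, accuplacer, years_since_hs]])[0, 1]
    return "college-level" if prob >= cutoff else "developmental"

print(place_student(gpa=3.2, accuplacer=65, years_since_hs=1))
```

In a real system, the cutoff would be chosen by the college (and differs by subject); lowering it is what produces the “bump up” effect discussed next.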

One of the most interesting findings from this research, however, concerned students who were “bumped up” or “bumped down”: that is, students whom the algorithm placed into college-level courses when a placement test would have assigned them to developmental courses, or vice versa. Program group students who were bumped up into college-level courses were 8–10 percentage points more likely to complete a college-level math or English course within three terms. Students who were bumped down into developmental courses were 8–10 percentage points less likely to do so.

Our takeaway? When in doubt, give students the opportunity to enroll in college-level courses.

What colleges are doing now

Our multiple measures research was conducted during the “before times,” and things have changed. While many colleges were already moving toward multiple measures placement systems, COVID-19 greatly accelerated the transition. It is difficult to administer placement tests when students are not on campus and cannot access testing labs, and this rapid shift to multiple measures makes the lessons from our research all the more important for colleges.

Colleges in the SUNY System have responded to COVID-19 by streamlining and redeploying many of their operations, including placement, to support students during this crisis. Beginning in the spring of 2020 and spurred by the shift to remote learning, colleges involved in SUNY’s Strong Start to Finish initiative began to rethink placement systems. In a remarkably short period of time, almost all of them have developed systems that do not rely on single placement tests, and some are planning to forgo placement tests altogether going forward. Here are examples of approaches that were developed for students entering college in the fall of 2020:

  • Westchester Community College created a questionnaire that includes items on writing and math content knowledge, as well as students’ perceptions about how ready they are for college-level courses. Based on their responses, a score is generated to indicate course placement. Students then have a conversation with an advisor to review the placement and select courses.
  • Orange County Community College determines English placements based on students’ high school GPAs and/or their scores on the New York State Regents exam in English (a sketch of how such a rule might be encoded appears after this list). Math placements are based on a mathematics self-assessment that asks students quantitative and qualitative questions about their performance in—and comfort with—math.
  • At Fulton-Montgomery Community College, academic advisors meet with each new student to review high school transcripts, SAT/ACT scores (if available), and New York State Regents exam scores to determine math placement. They then discuss students’ academic comfort level, career goals, and program requirements to arrive at a final placement determination. Similarly, an English faculty advisor reviews multiple measures for English course placement recommendations, including a writing sample, high school transcripts, and New York State Regents exam scores.
  • Schenectady County Community College identified four introductory college math courses that are available to any student with no placement required. For other math courses, placement is based on the student’s high school GPA and a conversation between the student and an advisor, who uses a flow chart as a jumping-off point. If a student does not agree with their placement, which may be in a corequisite course, they can take an advancement test to verify their readiness.
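Rule-based approaches like those above can be expressed as a simple decision function. The sketch below is hypothetical: the GPA and Regents cutoffs, course labels, and fallback to advisor review are invented for illustration and do not reflect any college’s actual policy.

```python
# Hedged sketch of a multiple-measures decision rule like those described above.
# All thresholds and placement labels are hypothetical, not any college's policy.
from typing import Optional

def place_english(hs_gpa: Optional[float], regents_english: Optional[int]) -> str:
    """Return an English placement from high school GPA and/or Regents score."""
    if hs_gpa is not None and hs_gpa >= 3.0:
        return "college-level English"
    if regents_english is not None and regents_english >= 75:
        return "college-level English"
    if hs_gpa is not None or regents_english is not None:
        return "college-level English with corequisite support"
    # No usable measures: refer to an advisor conversation rather than a test
    return "advisor review"

# Example: a student with a 2.6 GPA and a 79 on the English Regents exam
print(place_english(hs_gpa=2.6, regents_english=79))  # -> college-level English
```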

As a next step, colleges will assess how well their new systems work and whether they support—or hinder—student completion of gatekeeper courses and progression through college. Our research suggests these new placement systems may give more students access to college-level math and English. But colleges will have to consider how to shape placement rules and what kinds of supports students might need so that the benefits don’t stop with their first college courses.

Elisabeth Barnett leads CAPR’s assessment study and is a senior research scholar at CCRC.