Video: The Evolution of Multiple Measures Placement at CAPR Study Colleges

By Elizabeth Ganga

In one of CAPR’s major studies, researchers worked with seven State University of New York (SUNY) community colleges to set up multiple measures placement systems that used an algorithm to predict success in college-level courses. The study tested whether the algorithm or traditional test-only systems led to more accurate placement in initial math and English courses and allowed students to make more progress toward a degree.

Though the new systems increased the number of students who took and passed college-level English and math, non-experts struggled to maintain and update the algorithms without the technical help provided by the researchers. After the study ended, some of the colleges turned to simpler multiple measures placement systems based on high school GPA and sometimes other factors; these systems are similar to those being evaluated in a separate CCRC and MDRC study of multiple measures. In interviews during the CAPR conference in November 2019, representatives from three SUNY colleges explained how their placement systems evolved after the study.
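To make the two approaches concrete, here is a minimal sketch in Python of how an algorithm-based placement rule and a simpler GPA-based rule might look. The weights, score scales, and cutoffs are invented for illustration and are not the models the colleges actually used; the GPA and WritePlacer thresholds simply echo figures mentioned in the transcript below.

```python
# Hypothetical illustration of the two placement approaches described in the study.
# All weights, scales, and cutoffs are invented for this example; they are not
# the actual model used by the SUNY colleges or the CAPR researchers.

def algorithm_placement(hs_gpa, test_score, years_since_hs):
    """Combine several measures into a predicted-success score, then place on a cutoff."""
    predicted_success = (
        0.55 * (hs_gpa / 100.0)                    # high school GPA on a 0-100 scale
        + 0.35 * (test_score / 120.0)              # placement test score (e.g., 0-120)
        - 0.10 * (min(years_since_hs, 10) / 10.0)  # penalty for time out of high school
    )
    return "college-level" if predicted_success >= 0.5 else "developmental"


def gpa_rule_placement(hs_gpa, writeplacer_score=None):
    """Simpler decision rule of the kind some colleges adopted after the study ended."""
    if hs_gpa >= 85:                               # GPA alone clears the bar
        return "college-level English"
    if writeplacer_score is not None and writeplacer_score >= 4:
        return "college-level English"             # essay exam as a second chance
    return "college-level English with corequisite support"


print(algorithm_placement(hs_gpa=88, test_score=75, years_since_hs=1))  # college-level
print(gpa_rule_placement(hs_gpa=82, writeplacer_score=3))               # corequisite support
```

The practical difference is visible in the second function: a GPA-threshold rule can be maintained by placement staff without regression analysis, which is why several colleges moved to it after the study's technical support ended.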

Transcript

Title cards: SUNY Colleges on CAPR’s Multiple Measures Study
Seven SUNY community colleges set up multiple measures placement systems.
The CAPR study compared multiple measures placement to traditional test-based placement.

Stephen Burke, dean of the School of Arts and Humanities, Rockland Community College: Before we got involved with the CCRC, MDRC, the CAPR study, we had already had a number of concerns about how we were placing students. We were particularly concerned with how accurate the Accuplacer test was.

In retrospect, looking at it now, I realize that we were just substituting or adding onto it, more high-stakes tests that were probably very predictive of how students were doing on high-stakes tests.

Malkiel Choseed, writing program coordinator, Onondaga Community College: We had been [doing] what we sort of jokingly referred to as multiple measures light, our own sort of in-house attempt to try to correlate placement with GPA. And for English, we were coordinating it with a reading score.

We had a sense that we were casting our net too wide, and this confirmed it, I think.

Michael O’Connor, chair of the Department of English, Integrated Learning Studies, and Communication Studies, Onondaga Community College: We wanted to really prioritize only getting students into developmental courses if there was no way they were going to be successful without that support. And so we felt that the CAPR study was a nice tool for us to do that and get a little more fine-tuned and accurate in our placement models. And we did see a fairly significant drop in placement rates.

Burke: At that time, back in 2012, some students were placing into what we called English 080. Those students would have to take English 080; they’d have to pass that to take English 090. Then they’d have to pass English 090 to take English 095, and if they passed English 095, they would be allowed to take English 101, which was the first transfer-level course.

If students placed into the lowest developmental writing class, they had about a 20 percent chance, at best, of ever being allowed to take a college-level writing class. And that meant that 80 percent of our students were never allowed to take that class, which was really telling them, “You cannot pass go, you’ll never collect your $200, and you will absolutely never be able to obtain an associate’s degree.”

The impact of multiple measures placement

Burke: Within a year, once the multiple measures sorting was happening at full bore, I started seeing some really significant changes to how students were being placed. Many more students were placing at the college writing level.

To my great joy, the students were passing at pretty much the same rate as the students who had been placed using the Accuplacer test alone.

Scott Putorti, coordinator of testing and assessment, Westchester Community College: At the point I came in, it was pretty seamless for a student. They didn’t know that all these implementation processes happened behind the scenes. Students came in, they took a placement exam, we used their assessment, their GPA, and the other measures we were using, and it was pretty seamless for them.

What we’ve seen is that, at the college, the algorithm did absolutely place a substantially larger number of students into college-level English. But the exciting part, and the more important part, is that those students were succeeding at the same level or even a little higher.

For mathematics, it was a slightly different story. We didn’t see those numbers, that big bump. And it’s actually created a lot of conversation, though, at the college, which is a good thing. Because now we’re saying, “What can we do? What’s our next step? Are we connecting this to guided pathways? Are we going to be using these other conversation pieces?” And we’re going to continue to explore that.

How placement changed after the study

Choseed: Well, this is where the story gets a little complicated. After the study officially ended, we had talked to our various administrators and people in our placement office about trying to continue it. And I think the best way to put it is, at the time, the people in those on-the-ground positions who would actually do it didn’t have the support or knowledge to make it work. They had said, “We’ll give you the algorithm. If you can figure out how to do it, then we’ll do it.” And I remember they gave us a piece of paper that had it sort of laid out, and it was meaningless to us.

We felt very confident moving forward that we could take the principles that we had learned and apply them. We decided that an overall high school GPA of 85 or above places you in college English, and below that, you have the opportunity to test using the WritePlacer exam. And then we came to the conclusion that, I think it’s a 4 or higher that puts you directly into the credit-bearing course, and a 3 or below puts you into the coreq. At that point, we didn’t have any standalone sections. And I think—how many years ago did we switch to 75?

O’Connor: Just a year ago; last year it was 75.

Choseed: A year ago we switched to 75, again, based on the work they’ve been doing in California.

Putorti: We actually stopped using the algorithm when we were forced to move over to Next-Generation Accuplacer. And that is a challenge when you have these historical algorithms; you really need someone with that data knowledge to be able to do that kind of regression analysis. So what we’ve done is, as I said, we’re continuing to reevaluate. We’ve really been pushing more of an exemption system at this point, so we’re using high school GPA, and we’re using other measures, SAT, ACT, Regents scores, student classwork, and we’re integrating that into a more tiered system of exemptions and waivers. But that doesn’t mean we’re going to stop there. We’re still very interested in saying, “How else can we make this a more holistic approach?”

We didn’t veer away from multiple measures. It’s still something we’re absolutely using and continue to fine-tune for our institution.

Burke: It’s been a great success, and this whole process has really been transformative. As of this past September, we’ve gotten rid of standalone developmental writing classes. We just don’t think that they’re really helpful to the student.

For students who have gone to a high school in the United States, right now, our major determinant is the transcript: if you can show me your transcript, even if it’s an unofficial transcript, just show me your high school GPA, and I can place you right away into your college-level writing course.

I’m not hearing anecdotally right now any great concerns, or anybody telling me that all these students are underprepared, or that everything’s totally changed from last semester. So I anticipate that by the end of December or January, when we get the results of the fall semester and find out what’s really happening, they’re going to be similar to the results we’ve had in the last year or two.

Elizabeth Ganga is the communications manager at CCRC.