Podcast: The Benefits of Using Multiple Measures Assessment to Bump Students Into College-Level Courses


We knew from earlier CAPR studies that multiple measures assessment (MMA) resulted in more students being assigned to college-level math and English—instead of remedial or developmental courses—and more students passing those required courses.

But now CAPR has released the results of a long-term follow-up that tracked students for nine semesters after they were placed into developmental or college-level courses as part of a randomized study of MMA. The result: It’s clear that students have better short-term and longer-term outcomes when they are moved up to college-level courses and given a chance to show what they can do.

In this podcast discussion with CCRC’s Elizabeth Ganga, CCRC and CAPR researchers Elizabeth Kopko and Hollie Daniels explain what they found when they looked at the data after nine semesters and what educators and policymakers should take from the research.

Transcript


Elizabeth Ganga, CCRC senior communications manager: I’m Elizabeth Ganga, and I’m talking with Beth Kopko and Hollie Daniels. They lead CAPR’s study of multiple measures assessment that tracked students for nine semesters to understand the long-term effects of this important developmental education reform. Can you introduce yourselves and tell me what you do at CAPR? Hollie, do you want to start?

Hollie Daniels, CCRC research associate: Sure, I’m Hollie Daniels. I’m a research associate at the Community College Research Center. Most of the work I do is on CAPR projects related to multiple measures. So, I’m excited to be here today and to talk more about this project.

Elizabeth Kopko, CCRC senior research associate: Hi, I’m Elizabeth Kopko, but I go by Beth. I’m a senior research associate at CCRC and a lead researcher for the Center for the Analysis of Postsecondary Readiness, or CAPR.

Ganga: Great! So tell me about the study. What were you trying to figure out in the beginning and then with the long-term follow-up?

Kopko: So, the study seeks to really understand the impact of MMA on student outcomes. Back in 2014, when the project was first launching, I think it’s important to understand that many institutions around the country were still relying solely on standardized placement tests to determine whether students were ready for college-level courses. But at that same time, there was a lot of emerging research calling into question whether these high-stakes placement tests were actually predictive of students’ ability to perform well in those college-level courses. Many incoming college students who would have succeeded in entry-level, college-level courses were being referred to take developmental education or remedial courses in math and/or English. And that was unnecessarily costing them a lot of money and time. And of course, at the same time, there may have been students who would have benefited from developmental education courses, right, that were instead being placed into college-level courses based on those test scores. So, there were a lot of what we call misplacements, or we thought there might be.

So, in response to that research and that line of thinking, CAPR became interested in understanding whether combining multiple measures, including things like high school GPA as well as those test scores, might allow colleges to more accurately predict students’ performance in those college-level math and English courses, and thus better place them in the course most appropriate to support them and put them on a path toward success. So, to do that, we implemented an experimental study of MMA at seven community colleges in the State University of New York system, or SUNY. And, as you mentioned, there have been many phases of this study. It started in 2014, initially as a five-year study, and then we got more funding to extend it and look at longer term outcomes, which is, you know, the product that we’re talking about today. Up until now, findings from CAPR, as well as research from our colleagues across the country, have really focused on shorter-term outcomes, measured, in our case, only up to three terms past testing. And we really wanted to know: Can those positive impacts be sustained over the long term?
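To make the placement mechanics concrete, here is a minimal sketch of the kind of rule being described: multiple measures such as high school GPA and a placement test score are combined into a single composite that is compared against a subject-specific threshold (the cut scores discussed below). The weights, scales, and cut scores in this sketch are illustrative assumptions, not the algorithms the SUNY colleges actually used; each college’s faculty designed its own system.

```python
# Minimal sketch of an MMA placement rule. The weights, scales, and cut scores
# below are illustrative assumptions, not the algorithms the SUNY colleges used.

def mma_composite(hs_gpa: float, test_score: float) -> float:
    """Combine high school GPA (0-4 scale) and a placement test score (0-100 scale)
    into one composite score between 0 and 1 (hypothetical weighting)."""
    return 0.6 * (hs_gpa / 4.0) + 0.4 * (test_score / 100.0)

# Hypothetical subject-specific cut scores; math is set higher than English,
# mirroring the pattern described at most of the study colleges.
CUT_SCORES = {"english": 0.55, "math": 0.65}

def place_student(hs_gpa: float, test_score: float, subject: str) -> str:
    """Return the placement the composite score implies for a given subject."""
    score = mma_composite(hs_gpa, test_score)
    return "college-level" if score >= CUT_SCORES[subject] else "developmental"

if __name__ == "__main__":
    # A student with a solid GPA but a weak test score clears the English cut
    # score while falling short of the higher math cut score.
    print(place_student(hs_gpa=3.2, test_score=40, subject="english"))  # college-level
    print(place_student(hs_gpa=3.2, test_score=40, subject="math"))     # developmental
```

In the study itself, the algorithm score predicted a student’s probability of success rather than applying fixed weights like these, but the basic decision, comparing a score against a faculty-set cut score, has the same shape.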

Ganga: So, Hollie, can you summarize for us what we found in the long-term follow-up?

Daniels: In general, among our whole sample, we found that the MMA method used at the seven colleges improved access to and success in college-level courses. Faculty members at each college were actually involved in designing their specific MMA systems and the cut scores, or thresholds, that students had to meet to enter college-level coursework. Across most of those colleges, those cut scores were higher in math than in English. So, in other words, fewer students were able to enter college-level math than college-level English. And we actually saw some differences in outcomes because of that. Specifically, we saw larger and longer-lasting impacts on the completion of college-level English than of college-level math. And while we did find improved outcomes among subgroups, we did not find evidence that gaps between those groups shrank much. In other words, students who were eligible for Pell Grants and students who were not eligible for Pell Grants both performed better under MMA, but the difference in performance between those two groups did not improve. So, while MMA can lead to better outcomes for students, there’s still work to be done to decrease inequities between underserved and minoritized students and their peers.

We examined a subset of students who were bumped up into college-level coursework by MMA when their test scores alone suggested that they should enroll in developmental education. On the flip side, we also examined students who were bumped down into developmental coursework when their test score alone suggested that they should enroll in college-level coursework. So, we have these two bumped zones of students: those who were given a chance and bumped up into college-level courses, and those who were bumped down and required to take developmental coursework. The students who were bumped up had much better outcomes in both math and English, while bumped-down students had substantially worse outcomes. We saw that bumped-up students in either subject attempted and earned more college-level credits on average, and they transferred to a four-year institution or earned a credential at higher rates than students who weren’t placed using MMA. The results of our analysis of students who were bumped down suggest that developmental education can hinder success, even among those who have a low probability of success in college-level coursework. We do acknowledge, however, that students arrive at college with varying levels of preparedness, and we agree that there are students who would certainly benefit from additional support. But the results from this study suggest that developmental education itself, at least as delivered in this context, may not be necessary or even advisable for some students.

Ganga: So, you mean prerequisite developmental education in that case, right?

Daniels: Right. So, prerequisite developmental education was the model that was being used in this study at SUNY institutions. And so, we kind of have that caveat that, as it was delivered in this context, it may not be necessary or beneficial for students.
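The “bumped” zones described above can be summarized in a short sketch: a student falls into a bump zone when the test-score-only placement and the MMA placement disagree. The cut scores below are hypothetical stand-ins for the study colleges’ actual rules.

```python
# Hypothetical sketch of the "bumped" zones: a student is bumped when the
# test-score-only placement and the MMA placement disagree. Cut scores are
# illustrative stand-ins for the study colleges' actual rules.

TEST_CUT = 60    # assumed test-score-only cut for college-level placement
MMA_CUT = 0.65   # assumed cut for the MMA composite/algorithm score

def test_only_placement(test_score: float) -> str:
    return "college-level" if test_score >= TEST_CUT else "developmental"

def mma_placement(mma_score: float) -> str:
    return "college-level" if mma_score >= MMA_CUT else "developmental"

def bump_zone(test_score: float, mma_score: float) -> str:
    """Classify a student by whether the two placement rules disagree."""
    by_test = test_only_placement(test_score)
    by_mma = mma_placement(mma_score)
    if by_test == "developmental" and by_mma == "college-level":
        return "bumped up"    # MMA grants access the test alone would have denied
    if by_test == "college-level" and by_mma == "developmental":
        return "bumped down"  # MMA requires remediation the test alone would not
    return "unchanged"

print(bump_zone(test_score=45, mma_score=0.72))  # bumped up
print(bump_zone(test_score=70, mma_score=0.50))  # bumped down
```

It is students in these two disagreement zones, rather than the full sample, whose outcomes are compared in the bumped-up and bumped-down analyses Hollie describes.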

Ganga: I know you developed some recommendations in your paper. So, how would you sort of summarize those?

Kopko: I think one of the most important things to be taken from this research is that MMA should be used to expand access to college-level courses, or, using the language that Hollie introduced, MMA should be used to bump students up into college-level courses whenever possible. I think that the findings from this study overwhelmingly suggest that increased access to college-level courses was the driving factor in the observed impacts on student success. This point was best illustrated among students in the bumped-down zone, i.e., those students whose algorithm score predicted low probabilities of success in the college-level course. If the positive impacts of MMA were due to an increased accuracy of placements, then we should have seen improved outcomes among those students who were bumped down into developmental courses by their algorithm scores because, presumably, they’d now be getting the extra support that they need to succeed in college. But we really don’t see that at all. Instead, as Hollie just discussed, students who were predicted to have low probabilities of success actually did better when they were directly placed into the college-level course, as compared to their peers who were required to take the developmental course. And we see this phenomenon playing out in the bump-up zone as well, which further supports the idea that access is a key driver of student success. I don’t think we can underscore that enough, and I’m really grateful to this study for showing that in a very concrete way.

Ganga: And I guess that kind of raises the point, too, that colleges make choices about how they want to set up their MMA system. There’s not just one way to do MMA; you have choices about which system to use, and you have choices about where you’re going to set your cut scores. I mean, you all saw differences in math and English. I think you’ve mentioned that, Hollie.

Daniels: Absolutely. So, after nine terms, the program-group students remained more likely to enroll in and complete college-level English, whereas the impacts on college-level math completion were short-lived. More students gaining access to college-level English is what we think led to these longer-term outcomes, whereas fewer students were able to take college-level math in the first place. And we actually see this in other systems and other states that we’ve worked with: the faculty designing these systems tend to be a little more cautious about letting students enroll in college-level math than they are in English. And so we hope that the results from this study show how crucial it is to be really intentional in setting your cut scores and to give students a chance, to set those cut scores a little lower than you may first want to, because we do see evidence from this study that students who are bumped up and just given the chance to enroll in college-level coursework can really rise to meet that challenge.

Kopko: Another important takeaway from this study, I think, is the fact that while underserved students do benefit from MMA, MMA alone is unlikely to eradicate long-standing disparities in student outcomes. This is because in most cases, including in the SUNY study, MMA is not really designed or implemented explicitly with equity in mind. So, while MMA is likely to improve overall student outcomes, including among specific student subgroups, there’s a risk that the MMA system may be designed and implemented in ways that could further disadvantage certain groups of students. In our study, for example, MMA incorporated high school GPA from official high school transcripts. However, official high-school-transcript GPAs may not be readily available or accessible for some students, for example, immigrant students or older students who moved away from their high school or have been out of high school for quite some time. And while it’s always going to be the case that some students will not have access to all the measures being used in an alternative placement system, I think what this study really highlights is that we need to recognize when and where this might happen, particularly if it might happen systematically for certain groups. Because if it does happen systematically, we need to take steps to ensure that those students aren’t left to be placed by the status quo system that we’ve already identified as being less beneficial for students. If we don’t do that work upfront, we run the risk of not only sustaining existing disparities but possibly even exacerbating them. So, I really urge colleges, states, and policymakers to keep that in mind when they’re at the design phase of this work.

Ganga: I know you all are working in some other states now, expanding CAPR’s work on multiple measures assessment. What are you seeing in the field? What are colleges doing with multiple measures? Is it becoming widespread? How did the pandemic affect it? What’s going on out there in the world?

Daniels: Yeah, so while we saw some colleges adopting MMA prior to COVID-19, the pandemic really ushered in MMA out of necessity. Honestly, it wasn’t safe for large groups of students to come together to take standardized tests like the ACT or SAT, so colleges had no choice but to examine alternative measures like high school GPA and high school coursetaking to place students. In our opinion, this really opened doors for MMA to be adopted permanently because colleges saw that they didn’t actually need test scores to place students accurately. That’s really encouraging. And if we look across the country, states like California and Florida are making really great progress with MMA. In California, Assembly Bill 705 seeks to maximize the probability that a student will enter and complete college-level coursework within the first year. In practice, this legislation requires colleges to use high school coursework, high school grades, or high school GPA in placing students. And in Florida, Senate Bill 366, passed in 2021, allows colleges to develop and implement alternative methods for assessing and placing students in lieu of standardized tests. We know that there’s more work to be done to scale MMA in more systems and more states, but progress is being made.

Ganga: And just tell me a little bit about what CAPR is doing now in Texas and Arkansas.

Kopko: So yeah, in our second big CAPR MMA project, we’re working with two states, Arkansas and Texas, to help colleges design, implement, and scale multiple measures, with the goal of bringing multiple measures assessment to scale in both of those states by the end of the project period, which is the end of 2024. Right now, we’re working on a report summarizing findings from that work, and we expect it to be released by the end of this calendar year. So, definitely some things to be looking out for.

Ganga: I imagine we will be learning some interesting things from those states. What else do you all feel like we need to know about MMA? What do colleges need to know? What do researchers need to know for the next steps?

Kopko: So, in my opinion, it’s really important for colleges, again, to remember that multiple measures assessment is really just one piece in a bigger, or what should be a bigger, effort to support student success. One of the things that I think is interesting is that we’re seeing these pretty significant impacts, and in some cases, they’re pretty long-lasting. And I find that incredibly promising, because this is just a one-time intervention: we’re giving students who wouldn’t otherwise have been given the opportunity access to college-level courses, but we’re not doing anything else, right? The classroom experience isn’t changing, we’re not providing extra supports, and so on. Just imagine how much more of an impact we could be making if we did some of those things. So, we really see this as a potential opportunity to further improve student outcomes beyond the positive gains made by MMA alone. For example, a lot of colleges are adopting and experimenting with corequisite courses, and we think combining MMA with corequisite courses is one of those opportunities to further support students and see even larger gains in student success. To that end, CAPR is actually launching a multi-college randomized controlled trial that seeks to uncover the effect of MMA in those contexts. That project is just kicking off now, but in the coming year or so, we’re hoping to release some results.

Ganga: Excellent! Well, thank you very much for talking to me. I really appreciate it.

Daniels: Thanks, Lisa.