Student Responses to Artificial Intelligence
By Nancy Murray and Jay Trucker
In the spring of 2023, we created, implemented, and evaluated a theme-based unit about understanding and using artificial intelligence in some of our composition course sections at the Community College of Baltimore County. We were intrigued by the potential of this emerging technology and believed we could learn alongside our students about how it may contribute to learning.
We discussed the origins of this project about using artificial intelligence (AI) for writing and about understanding AI more broadly in a previous CAPR blog post, and in a second post we shared results and implications from a survey of our students in spring 2023.
Our intention with the survey was to see if concerns commonly expressed by faculty about students’ use of AI are warranted. Are students who are using AI for their assignments able to learn and connect with their learning? Do they still think critically and use their authentic voices? We continued to refine our lessons and study how our students responded in the following term, fall 2023. In this blog post, we want to share what we’ve learned.
We learned that there were gaps in our understanding of how our students felt about the technology because there were flaws in our original survey. Students’ answers to questions such as “Are you proud of the essay you wrote?” led to more questions: What, specifically, were they proud of? Did they learn about the subject and were proud because of that, or was there another reason they were proud? These were important questions, so we revised our survey for fall 2023 to give the students the opportunity to expand their answers.
We also realized that the assignments designed to help students understand how large language models work, what natural language processing is, and how algorithms are created were not uniformly taught across our English 101 sections. This might have impacted students’ understanding, so we made sure that the fall 2023 unit and inquiry into AI were more uniformly administered. When designing the unit for fall 2023, we reused some of the foundational learning resources from our spring 2023 content and added readings that highlight developments in AI that occurred between January 2023 and August 2023, which strengthened the lesson plan.
Students’ outlook on generative AI evolved quickly in the time between the two semesters. Some of these changes could be tied to new lessons and readings, but we noticed that students also brought a higher level of AI knowledge into the classroom in the fall 2023 semester.
In our analysis, we evaluated survey results completed at the end of each unit—one AI-based unit and three non-AI-based units—in the fall 2023 semester and cross-analyzed them with the initial spring 2023 results. Additionally, we examined qualitative responses gathered in our revised survey. We also added demographic questions to the survey instrument to provide a clearer sense of our student body.
Our results suggest that students in AI units expressed less pride in their AI-assisted essays in fall 2023 than students in AI units had in the spring, and fewer reported that the quality of their writing increased. Conversely, more students reported that they felt they could “express themselves freely” in the fall 2023 AI units, and more students noted that they could “explain” their AI-assisted essays in fall 2023.
When discussing their non-AI-assisted essays, roughly the same percentage of students in spring 2023 (87%) and in fall 2023 (85%) non-AI units agreed with the statement “I would be able to explain the main point of the essay to someone who has not read it.” However, the proportion of students agreeing with this statement for their AI-assisted writing assignment increased by 8 percentage points from spring to fall (68% in spring 2023, 76% in fall 2023). The qualitative findings from the survey indicate that student effort may correlate with understanding. The most common explanation students offered for being able to explain their essay was that they spent a substantial amount of time working on the AI-assisted essay “to make it sound right.”
| Statement (answered on a Likert scale) | Spring 2023 AI unit | Fall 2023 AI unit | Percentage point change |
| --- | --- | --- | --- |
| I am proud of the essay I wrote. | 72 | 61 | -11 |
| I felt like I could express myself freely while writing this essay. | 56 | 63 | +7 |
| I would be able to explain the main point of the essay to someone who has not read it. | 68 | 76 | +8 |
| The quality of my writing improved because of the work I did this unit. | 76 | 59 | -17 |
| The essay captured my authentic voice and ideas. | 76 | 52 | -24 |
The proportion of students agreeing with the statement “I felt like I could express myself freely while writing this essay” was significantly lower for students in AI units than for students in non-AI units in the spring of 2023. Whereas 78% of students agreed with this statement in non-AI units, only 56% of students in AI units did so. However, in fall 2023, more students responded positively to this statement in AI units than in non-AI units (63% versus 61%). This may have less to do with AI than with the fact that a new non-AI unit taught in several sections required media analysis rather than opinion-based writing. In the qualitative responses, students using AI in the fall noted that they appreciated drafting an opinion-based essay.
In the spring of 2023, a higher percentage of students in AI units (76%) versus non-AI units (71%) agreed with the statement “The quality of my writing improved because of the work I did this unit.” However, in the fall, students in non-AI units (66%) responded more positively than students in AI units (59%) to this statement.
Similarly, pride in AI submissions was down in fall 2023. During spring 2023, 67% of students in non-AI units and 72% of students in AI units agreed with the statement “I am proud of the essay I wrote.” However, in the fall, 74% of students in non-AI units took pride in their essays, whereas only 61% of students in AI units did so.
When our findings are considered together, it seems possible that as students became more knowledgeable about generative AI, they became more willing to trust it for thesis building but less willing to trust it to generate full essays.
We suspect that these findings are also tied to the fact that we designed AI lessons so that more students would engage in specific exercises intended to show them the intricate components of natural language processing, algorithms, and large language models. That learning may have affected students’ understandings of their own AI-assisted writing.
In the future, faculty and institutions will have to decide how they will incorporate artificial intelligence into their coursework. We strongly recommend that they also incorporate instruction about the mechanics of the technology as it continues to evolve. This understanding removes the magical aspect of the technology, enables students and faculty to grasp its potential problems, and gives everyone a healthier approach to its use in the classroom.
Nancy Murray and Jay Trucker are professors at the Community College of Baltimore County.