Researching AI for Learning in the Community College Classroom

By Nancy Murray and Jay Trucker

This is the second post in a series on using AI as a learning tool. The first post explains how the authors introduced ChatGPT into their classrooms.

How can artificial intelligence (AI) become a tool for learning instead of a threat to the integrity of college writing instruction? This question abruptly confronted community college instructors last year when ChatGPT came on the scene. As instructors of first-year and corequisite English at the Community College of Baltimore County, we were particularly interested in the implications for our students and decided to do some research.

In the spring 2023 semester, we taught our ChatGPT unit in four classes, reaching a total of 60 students. In each class, we sought to answer the question, “Should ChatGPT be allowed in a composition classroom?” To answer it, we first had to investigate the following lines of inquiry with our students:

  • What is ChatGPT?
  • How does ChatGPT do what it does?
  • What are the pros and cons of using ChatGPT for classwork?
  • Is ChatGPT trustworthy?
  • How can we use ChatGPT in the English classroom?

Introducing ChatGPT was easy enough: we watched a short YouTube video on Day 1 of the unit. But it was more important for students to really understand how the technology works, so we devised lessons on algorithms, natural language processing, and predictive models.

People often conflate a digital native’s comfort with technology with an actual understanding of it, and we saw this with most of our students. While they had heard of algorithms, they knew little about what an algorithm is or how one works. We think that understanding is critically important because it is central to questions of reliability and bias within technology.

A predictive model knows only the information it has been fed, so bias in that data, and in the choices of the people who build the model, becomes bias in the program, and it is essential that our students understand this. It is equally essential for them to understand that the text ChatGPT learns from is drawn largely from the Internet, which is not always trustworthy.
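For readers who want a concrete picture of what we mean by a predictive model, the short Python sketch below is purely our illustration: it bears no resemblance to ChatGPT’s actual architecture, and the tiny corpus and function names in it are invented for the example. It simply counts which word follows each two-word context in its training text, so any skew in that text is the only thing it can reproduce.

```python
from collections import Counter, defaultdict

# A deliberately skewed toy corpus (invented for this example). Real models
# are trained on billions of words scraped largely from the Internet, where
# the skews are subtler but just as real.
corpus = (
    "the doctor said he was busy . "
    "the doctor said he would call . "
    "the nurse said she was busy . "
    "the nurse said she would call . "
).split()

# Count which word follows each two-word context: a tiny predictive model.
model = defaultdict(Counter)
for i in range(len(corpus) - 2):
    context = (corpus[i], corpus[i + 1])
    model[context][corpus[i + 2]] += 1

def predict(first_word, second_word):
    """Return the most frequent continuation of the two-word context."""
    counts = model.get((first_word, second_word))
    return counts.most_common(1)[0][0] if counts else None

# The model has no opinions of its own; it can only replay its training text.
print(predict("doctor", "said"))  # -> 'he'
print(predict("nurse", "said"))   # -> 'she'
```

Scale the counting up by many orders of magnitude and replace it with far more sophisticated statistics, and that is the sense in which a tool like ChatGPT can only reflect what it has been fed.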

In our introduction, students asked ChatGPT to write their personal biographies, which it appeared to invent based on the etymology and cultural associations of their names. The students were delighted to discover that, according to ChatGPT, they had accomplished great things and were making stacks of money.

In another lesson, we devised a mock Internet in which our students, in groups of four, played the role of ChatGPT and were tasked with integrating information from multiple sources into one response to a prompt. They quickly learned how mixing fact with fiction can produce answers that appear credible but are far from truthful. Students were sure their answers were reliable until they saw where the pieces of information they had drawn on actually came from.

As researchers, we wanted to understand how using ChatGPT affects students’ learning and their sense of pride in and connection to their work. We studied the strength of our students’ writing through a comparative analysis of compositions written with and without AI, and we examined students’ connections to their work through a survey of the opinions of students in the AI and non-AI units.

There was a slight difference in our approaches. While Professor Murray assigned an essay that asked students to answer the question “Should ChatGPT be allowed in the English 101 Composition class?”, Dr. Trucker’s essay assignment was specific to each student’s major field of study, allowing students to learn more about the technology than he could possibly have taught them in the English classroom. ChatGPT let them go down rabbit holes a generalist could not follow, exploring the technology’s impact on their own fields.

Professor Murray learned that ChatGPT could help students explore the English 101 course objectives through a completely different, and in many cases more successful, approach than she had used before. Examining the sources ChatGPT made up gave the students hands-on experience with finding resources and verifying their credibility. The study of natural language processing helped develop students’ understanding of previously abstract ideas like syntax, emotional context, and rhetoric. This proved especially true for Professor Murray’s corequisite class supporting students’ reading and writing skills and for students who needed additional support due to divergent learning styles. In a busy classroom, ChatGPT can give students more detailed and immediate responses to the questions they struggle with than a professor or other learning support could offer.

As the survey results indicate, there was a 5-percentage-point higher positive response rate among students in AI units to the question of whether their writing improved, which we thought was borne out in the quality of their essay organization and sentence structure. When it came to the question of pride in their work, there was also a 5-percentage-point higher positive response rate among students in AI units. Conversely, fewer students in AI units (by 7 percentage points) felt the AI-assisted essays captured their authentic voice and ideas, which makes us wonder what they were proud of. Our future surveys will give students an opportunity to explain their responses in detail.

A notable disparity came in a 19-percentage-point difference between students in AI units (68%) and non-AI units (87%) in feeling that they would be able to explain the details of their essay. There was also a 22-percentage-point lower positive response rate among students in AI units in feeling that they could express themselves freely while writing. To address that concern, we are working on assignments that will help students develop a deeper connection to the work they produce with ChatGPT.

We feel that even the lower positive responses among students in AI units are not entirely bad news. For example, though there was a 19-percentage-point difference in students’ self-reported ability to explain the details of their work, a strong majority (68%) of students in AI units still reported that they could.

We are encouraged and excited to teach the AI unit again, with some revisions. While we have learned from our initial study, in the upcoming semester we plan to further align our class lessons, add new AI-related readings for students, revise our survey for better data, and devise means to study the quality of AI-assisted student writing versus student writing without AI. It is clear upon reflection on our first effort that students can benefit from AI, including—and perhaps especially—corequisite students who need additional support in reading and writing. Our job as English faculty is to develop strategies for implementing AI usage effectively and responsibly.

Nancy Murray and Jay Trucker are professors at the Community College of Baltimore County.