
ChatGPT in Classroom Environments - Early Adopter Experiences (Sept 2023)

[Image: Teaching and Learning with Generative AI logo, showing a classroom of human-like figures, one of which is clearly a robot]
Around the world, generative AI is changing how we work, how we teach, how we learn, and how we think. The rapid proliferation of AI technologies and their initial hype suggest that the future of teaching, learning, and writing will require some integration and collaboration between humans and AI content generators. In these early days of AI, instructors and researchers are already encountering these technologies and exploring the ways they are emerging in the context of teaching and learning.

On September 29, 2023, Academic Technology Support Services, the Center for Educational Innovation, and the Writing Across the Curriculum program hosted instructors from several disciplines to share their experiences using (or responding to) ChatGPT in their classrooms in Spring 2023.  

Jay Coggins - Applied Economics

Jay Coggins, Professor of Applied Economics, inaugurated the panel by observing that our students will likely use AI tools throughout their careers, and explained that responsible use would be a necessary component of professional education. In his writing-intensive Applied Economics class, Jay and his co-instructor, Justin Johnson, learned about ChatGPT in collaboration with their class of almost 40 students by requiring students to use ChatGPT on one of their writing assignments. The instructors required that at least 25% (and no more than 75%) of each student's written assignment be generated by ChatGPT. He described three students' responses in detail (see slides shared below). Students' initial reactions ranged from enthusiastic to nervous, and the outcomes suggested that students were intrigued by the capabilities of generative AI but profoundly skeptical of its “confidently wrong” written products. Coggins emphasized that using generative AI in a course on natural resource economics required an emphasis on learning goals related to course content, but that he was not in a position to claim expertise on the pedagogical implications of AI more broadly.

Eric Shook - Geography, Environment and Society

By contrast, Eric Shook explained that using new technologies and AI was central to the work of his computational geography courses and research, and that he was an enthusiastic proponent of generative AI as a pedagogical innovation. Eric offered several suggestions and prompts for anyone thinking about ChatGPT, along with ideas about its uses, benefits, and drawbacks (see slides shared below). He regularly has students use AI in his coding class of around 60 students, but explained that ChatGPT has limitations for more advanced applications, noting that “It is good at basic coding, but once you get into advanced programming topics, it starts being ‘subtly wrong.’” Eric also mentioned being inspired by Specifications Grading by Linda Nilson and Atomic Habits by James Clear (and suggestions from Clare Forstie from UMN’s Center for Educational Innovation). Eric wants students to experiment more with generative AI so they can practice identifying issues in advanced programming, rather than aiming for one perfect outcome.

KC Harrison - Youth Studies

The next panelist, KC Harrison, offered a critical lens on the value of generative AI as a learning tool. In KC’s home department of Youth Studies, the processes and practices of research are closely tied to relationship building and trust, and KC and her students found that generative AI interferes with the human connections vital to an interdisciplinary field. KC detailed how academic integrity concerns that emerged in the Spring semester turned into a critical assessment of AI in the subsequent semester of the same course. One assignment asked students to use ChatGPT (if they already had access) to define key terms (such as queerness, immigrant, social construct, neocolonial, and citizenship) and analyze the AI-generated definition in the context of what they had learned in class. They found that course concepts were often far more specific and nuanced in the hands of subject matter experts than in what ChatGPT generated. Student responses also showed mixed feelings about using AI-generated text in classes. KC compiled this AI pros and cons list based on student responses and then shared the co-created AI policy for their course (also in the slides linked below).

Department of Writing Studies

Lee Ann Breuch, Kathleen Bolander, Stuart Deets, Asmita Ghimire, Alison Obright, and Jessica Remcheck offered the final presentation. The Department of Writing Studies team conducted research on how undergraduate students use ChatGPT.  Their research questions included:

  • How are undergraduate students understanding ChatGPT as an academic writing tool?  
  • To what extent are students incorporating ChatGPT into their writing product(s)?  
  • How are students thinking about ChatGPT in their writing process?

They found that students had many questions about ChatGPT and could see both its benefits and drawbacks. For example, one key finding, “Students saw great potential for using ChatGPT as a generative tool that could assist writing processes,” suggests the need for instructors to build AI literacy with students. More key findings and advice are in their slides linked below.

Discussion

Post-panel discussion addressed how students can still learn the basic concepts in our classes when generative AI can do that work for them. One participant said

When they [students] say it is useful to learn something new or gain new content knowledge [through ChatGPT]…. this worries me because it doesn’t necessarily provide valid information. Wikipedia would be far superior for this purpose, and we remember how controversial that technology was!

Concern was also expressed about how often non-native English speakers are disproportionately flagged for perceived use of generative AI and academic misconduct, even when they have not used generative AI. The consequences of a false allegation make instructors reluctant to file formal accusations with the Office of Community Standards.

The panel concluded with a preview of additional UMN opportunities to continue exploring generative AI.

 Slides from presenters

Resources shared in the chat during the session

Acknowledgments

Thanks to Dan Emery and Clare Forstie and all of the panelists for their contributions and assistance with the panel and this post!