This fall and spring Extra Points will feature a series of conversations focused on how faculty and staff around the University of Minnesota are using Generative AI to do University work.
Colin McFadden, Technology Architect for the College of Liberal Arts, presented to the Emerging Technologies Faculty Fellows on the topic of GenAI in September. His comments are edited for clarity and length.

In your role as Technology Architect for CLA, how do you use Generative AI?
Colin McFadden: I use Generative AI day-to-day for both technical and non-technical work. When I’m working on software or hardware development, generative AI offers code suggestions and helps me think through complex requirements. I also love to have generative AI do an initial proofreading pass on documents or articles I’m working on. I’m still learning where these tools are useful and where they’re not - for example, some programming languages or frameworks still present a challenge, and you can easily spend a lot of time without making progress.
Change like the Industrial Revolution? Or change like smartphones?
CM: We're in a moment of really big change. I don't know if this is upheaval akin to the Industrial Revolution, or a flood of conveniences akin to the arrival of smartphones. Probably somewhere in between. The Industrial Revolution resulted in overall longer lives and higher GDPs…but many workers were left behind, starved, or otherwise suffered during that period of rapid change.
It's really not clear where we're going to land on the other side of all of this. We need to make sure that everyone in the University community is involved in this new AI conversation and is helping chart the course forward, since students will experience both the positive and negative impacts.
How do instructors navigate GenAI in the classroom?
CM: I'll share CLA's guidance for AI in the classroom.
Don’t fall for the false promise of AI detectors
The University does NOT offer a tool that promises to detect AI-generated text. The reality is that AI-generated text is just text, and the AI makers see the ability to distinguish human-generated text from AI-generated text as a bug they should fix. So this is an arms race between the most powerful corporations in human history and higher education.
To instructors using free tools to detect AI-generated text:
- The tool may sometimes get it right, but it will often falsely flag human-written text as AI-generated. Non-native English speakers who use AI tools for grammar and structural help can also be unfairly flagged by these detectors.
- Instructors do not have the right to share students’ intellectual property with an unvetted company that will likely use it to train its models.
So talk to your colleagues and tell them: please don't use those random, ad-laden third-party websites that promise to detect AI-generated papers.
Mind the AI technology gap
Start with your device expectations for your students. For example, students don't need to own a laptop or a mobile phone in order to take classes in CLA. Irrespective of the AI conversation, a few times a semester we will get an email from a student saying, “My teacher says I need to have a laptop in order to run this piece of software in the classroom, but I only have a desktop, and I can't afford a laptop right now. Can you help me? Or do I need to drop the class?” That's a really bad situation to put a student in.

If there's a technology requirement for the class, consider offering it through a computer lab where everyone has access to technology. Or you might set up your classroom activity so that students are able to get access to technology in another way, maybe by partnering with peers.

As the technology landscape shifts toward what’s called on-device AI - meaning that more of the AI runs right on the device instead of in a data center somewhere in the cloud - expect the technology gap to widen between students who have access to technology that can take advantage of generative functionality and students who don’t.
To access cloud-based Generative AI tools, you need to create an account. But not all of our students can sign up for accounts that require a license agreement. For example, not all of our students are over 18. Not all of our students are citizens of a country that is covered by the license agreements from the generative AI tools, and not all of our students, irrespective of those legal requirements, are comfortable with creating accounts in order to use these tools. When we're bringing tools into the classroom that are not supported by the University, we need flexibility. Is this a partnered activity? Is this something we can do as a group, working under a faculty member's account? Can I provide alternatives for my students so that they can have the same experience, even if they can't or are unwilling to create an account?
Have a conversation
A good starting point for communicating with your students about AI is the syllabus statements that are available from the Provost. But syllabus statements are kind of like those license agreements that we all click through without reading or processing. It’s important to have a conversation with your students. Talk with them about what you're expecting from them, and about the appropriate and inappropriate uses of AI in your class. Many faculty are excited to share with students their own relationship with these tools and where they're seeing the value in their discipline, research and work.
It’s important to keep in mind that you and your students approach AI very differently. Instructors bring a wealth of digital literacy, media literacy, search literacy, information literacy… all the literacies…we're bringing all of it to our relationship with generative AI. And we bring a ton of knowledge about our subject matters, about how we get work done, and about how we solve problems. All of this informs our relationship with Generative AI.
But our students are here to build those critical thinking skills and literacies. And that means they have a really different experience when they're faced with a chat window that gives them easy answers that seem definitive. I see this time and time again when working with students. I’ll jump into a screen-sharing session and find that a student has gone deep down an unproductive rabbit hole with ChatGPT, maybe for hours, dragged further and further along a path of inquiry because they haven't yet built important critical thinking skills and contextual knowledge. This is why we need to have a conversation with our students about why it's important to put AI aside a lot of the time and build those foundational skills in other ways: through practice, repetition, and struggle.
What’s on the horizon?
CM: Expect more change. Large language models are built with a very liberal interpretation of copyright. Over the next year we're going to see that play out in the courts. We could end up in a world in which there are tools we use in the United States, and different tools that are used in Europe, and maybe different tools that are used in China.
That's going to have implications for our student populations. Students who are not from the United States, for example, may fall under a different regulatory regime. And if it turns out that ingesting all of the world's knowledge without paying anyone is actually not allowed, we may have to rethink how these tools are built and used.
We’re also going to see these tools built into the platforms we use, often in subtle ways. Already, generative AI tools are being built into our mobile and desktop operating systems. The autocomplete tools in our word processing and email applications are using Generative AI to suggest sentences or paragraphs. Increasingly, we won’t be in a world where “Generative AI” or “no Generative AI” is a binary - these tools will be diffused throughout our computing ecosystem.
How do we keep up?
CM: I look for trusted voices that I think are aligned with my interests - people who can give me that longitudinal perspective. In my case, I found theverge.com. I appreciate that they bring a healthy level of skepticism to their reporting. Be sure to familiarize yourself with ai.umn.edu as well. It includes both policy information and information about groups around the University that are exploring how these tools impact our research, teaching, and administrative lives.