Gen AI Explorations: Conversation with Faculty Fellow Nicole Dillard

This spring Extra Points will feature a series of conversations focused on how faculty and staff around the University of Minnesota are using generative AI to do University work.

Interview with Nicole Dillard about her use of Generative AI in CEHD
Lauren Marsh and Sara Schoen (Academic Technology Support Services) interviewed Emerging Technologies Faculty Fellow Nicole Dillard from CEHD’s Organizational Leadership, Policy & Development department. The following has been edited for length and clarity.

Tell us about your role in the College of Education and Human Development (CEHD), and how this is informing your work with generative AI.

Nicole Dillard: I'm a faculty member in the Human Resource Development program, which is part of the OLPD (Organizational Leadership, Policy & Development) department. I teach both undergraduate and graduate courses, and I'm piloting my project in my undergrad course OLPD 3310: Identities in the Workplace.

Prior to the fellowship, I don't think my role was significantly influenced by my use of AI; in fact, I hadn’t touched AI before this fellowship. Now, though, I'm working with generative AI more extensively because of this project. It's opened up opportunities for me to engage with my peers. I now host office hours for other faculty and instructors. We have coffee, we talk about what I'm doing with gen AI, and I help them brainstorm how to integrate generative AI into their assessments, learning environments, and curriculum development.

Would you set the context for us and tell us more about your project in the Faculty Fellowship program?

ND: I developed a course, OLPD 3310, which looks at how folks with marginalized, intersectional identities experience workplace challenges.

For this project, I'm exploring how we can develop students' digital literacies through generative AI skill development in a way that also cultivates DEI competencies they can apply in the workplace. Each module looks at a particular identity. The students use generative AI tools to help them make sense of specific identity challenges and develop recommendations to support folks in the workplace. The goal is to develop their skill set for the classroom and for when they go into the field as HR practitioners, giving them tangible generative AI skills that can lead to greater workforce development.

One way I'm using AI is as an assessment tool. Instead of a traditional midterm, I used Character.AI, which allows you to create any persona you want to engage with. You give it parameters, like wanting to talk to Beyoncé, and it interacts with you as that persona. For the midterm, I created an HR hiring manager who interviewed each of the students for a phantom role focused on using AI to foster elements of DEI in the workplace.

The students came into the class with their laptops and headsets, and some even dressed up because they knew it was interview day. Everyone got set up on Character.AI and engaged in the interview. Each student's experience was unique, with some similar questions but tailored responses based on their answers. The AI asked them questions based on the content they had been learning over the semester. For example, it might say, "We're excited to meet with you today, Sara. We'd like to speak to you about the opportunity to join our firm and help foster a culture of DEI. Can you share with us how you would support working parents of color in the workplace, or how you would use AI to facilitate training exercises for parents?"

At the end of the interview, they either got a "you'll be hearing from us" response or a "you would be an excellent candidate for this position" response. If they got the former, there was also follow-up feedback on what to consider for their next round of interviews. We reflected on the exercise the next week in class. The students liked this form of assessment. They felt it kept them fresh and on their toes. Some said it mirrored a real interview process and, because it was live, mitigated cheating.

Interviewers: When they were engaging, was it verbal, or were they typing responses back?

ND: They had the option to have a verbal conversation or engage in writing. The AI had a voice and an accent. You can give it dialects. It's pretty robust in character creation. Some students chose the auditory option.

What have you learned? Share your aha moments!

ND: I think my biggest aha moment is realizing that it's more important for me to assess the process of how students are using these tools rather than just the content they produce. Especially with concerns about students using AI to cheat or take shortcuts, I'm able to assess their real development based on the process they use. For example, if they're generating prompts, I'm looking at how many iterations it takes them to get to the final prompt that yields the answer. This shows levels of development, skill, and critical thinking. Their process of getting to that answer is much more closely linked to the type of critical thinking I'm interested in.

What conversations with students have been impactful to you and to them?

ND: We’ve had both positive and negative experiences with generative AI around exploring how our identities can inform our experiences in the workplace and vice versa.

The students were developing their positionality statements, which is something I would do even if I wasn't engaging with AI. For any work around identities, it's important for students to understand their own identities before exploring those of communities they don't belong to. The students developed their statements and put them into ChatGPT to identify what potential challenges they might face in the workplace.

ChatGPT was pretty responsive and accurate. Most students, in their reflections, said that the chatbot captured their actual experiences at the university or in their places of employment. It also gave them perspectives on challenges they hadn't experienced but might face once they graduate and enter the workforce. It was a good way to situate the conversation so they could understand it from their own identities and then think about it from the perspectives of those with other identities.

On the other hand, students experienced how generative AI can be biased. My students know AI is created by humans, and humans have biases. But once we started playing with the tools, it became more real. We looked at gender identity and sexual orientation in the last module. A student using Copilot asked the chatbot, "I am a transgender employee of the University of Minnesota, and I am thinking about disclosing my gender identity to my organization. What recommendations would you have?" Copilot wouldn't interact and said, "I don't feel comfortable having these conversations. These are not appropriate conversations."

We tried prompt engineering to get a different response, but it kept blocking us. This was a visceral moment for the students. They understood bias conceptually, but engaging with a tool and being iced out was different. We discussed what this means for AI and how it plays out in the real world when people want to talk about their identities and get blocked. It was unfortunate, but a valuable learning experience.

How are your peers responding or engaging?

ND: In the beginning, discussions about AI seemed to be geared more toward punitive responses. There were concerns about students cheating, mistrust of the tools, and ethical implications. It was very much about the dark side of using generative AI. But I think there's been a major shift. More of us are using it, and there's more communication from the University about using it. I feel like the culture around using and facilitating AI in the classroom has shifted in my department. Now folks are trying it, researching it, and everyone is interested. It's a different energy, and it's been good to engage folks in that way.

Any final advice to instructors?

ND: Scaffold as much as you can. I gave the students a survey at the beginning of the semester, and they had all used AI in some capacity but hadn't really engaged with it in relation to curricular activities. They were using it for entertainment. In the classroom, they're using it to take a dense reading, put the content into a tool, and have the tool summarize or synthesize it. There is a deep need for skills development.

I also try to model my own development using generative AI. I want to show them that I'm figuring this out too. We're all figuring it out together. Their comfort in talking about their challenges with AI tools leads to greater comfort in discussing their challenges in the classroom and the academy more broadly.

Resources

  • Responsible AI Use in OLPD 3310. This file includes the contents of three documents I used in the course: my AI syllabus statement, the AI Guidelines I included in Canvas, and the AI Use/Academic Integrity Statement/Contract that each student was required to sign.

  • OLPD 3310 - Midterm Assignment & Workflow Log. This file outlines the expectations for the midterm assignment and the corresponding Workflow Log that I assigned to help students reflect on the process of learning and completing an assignment in my course. I've highlighted in yellow the content related to generative AI.