Valpo Faculty Respond to Generative AI

By Cynthia Rutz, Director of Faculty Development, CITAL

At the faculty workshop this August, our keynote speaker Derek Bruff spoke about four possible faculty responses to AI:

  • Red Light: I’ll prohibit the use of AI.
  • Yellow Light: I’ll permit the use of AI, but with limitations.
  • Green Light: Let’s see what AI can do!
  • No Light: I’m not sure yet.

Where do you fall on this spectrum? We interviewed four Valpo faculty who have different views on AI. Kelly Belanger (University Director of Writing) is comfortable being a yellow light, while Sedefka Beck (Economics), a fan of new technology, is a green light and hopes to lead a future FLC with others who plan to use AI in their classes. Sarah Jantzi (Communication and Visual Arts) considers herself a red light for introductory classes, while her colleague Ben Brobst-Renaud is a green light and is already planning a spring class on AI and Art.

Kelly Belanger (University Director of Writing): Yellow Light

Kelly is very comfortable as a “yellow light,” since she thinks that AI is going to be part of the toolbox for writers from now on. In fact, most writers are already using AI, whether they call it that or not: it started with spell check, then Grammarly, and now generative AI, and it will continue to advance. Therefore, students need to understand the ethical issues with AI so that they can use it responsibly.

In her current editing class, AI is used as a tool, but she emphasizes its limitations as well. She shared with her students an article about how employers, even tech companies, want workers with “soft skills” such as presenting ideas orally and responding to questions on the spot, asking questions based on critical thinking, and working as a team. These uniquely human skills can’t be duplicated by AI.

She asks her students to play with AI, to give it prompts. For example, she will give them a poorly written draft from an exercise book and ask them to edit it themselves and then to ask AI to edit it. Students will ask AI to tailor the text to different audiences and then assess how well it did. 

AI and CORE

Kelly works with Lisa Jennings on the major writing assignments for Core. At the Core faculty workshop this fall they talked about AI for over an hour, discussing Classroom Policies for AI Generative Tools, a document about how different faculty have handled AI in their classes.

Here is the default syllabus language about AI that Core will use this fall:

In the broadest sense, plagiarism is the passing off of someone else’s writing or ideas as your own. With this in mind, plagiarism now includes presenting “AI” or computer-generated text as your own work. Writing is more than a matter of the final product—a “piece” of writing. Writing is a process—of discovery, creation, analysis, and synthesis—that empowers us to think clearly and creatively. As it advances, AI will likely be part of that process, but it must not replace the work we do. For now, use of AI or other generative software when not explicitly allowed as part of the assignment, or at any time without citation, will be considered a violation of the Honor Code.

Some Core instructors will not permit AI; others will allow students to use it and cite it. However, citation is a problem because AI does not acknowledge its own sources. Some Core writing prompts will also ask students to write from their own specific perspective (though ChatGPT can be prompted to do that too!).

For Core this year, specific rubrics were created for each assignment, including a process rubric. AI is not built into these rubrics, but Core instructors can opt to add it. For example, they could have students run their thesis by AI or ask AI for feedback on their first few lines. But in every case, AI use must be cited, including having students provide the prompt they gave to AI.

Some Good Consequences of AI?

Kelly believes that AI could actually have a positive effect on the teaching of writing. For example, it might cause faculty to think more about crafting writing assignments that are very specific to the context of the class. 

AI might also prompt faculty to create and use rubrics more effectively, which will make us more careful readers. Instead of accepting glib, surface-level writing, which AI does well, we should look for evidence that a paper, for example, reflects specific classroom conversations. A good rubric will also cause us to look more closely at the logic of the paper, its use of sources, and its evidence. Thus AI could make us hold student papers to a higher standard.

Sarah Jantzi (Communication and Visual Arts): Red Light

Sarah is a “red light” when it comes to using AI in introductory classes. 

In her Drawing from Life class, for example, students must work to develop their ability to see in three dimensions and then translate what they see into a two-dimensional drawing. She does not let the students draw from photographs because she wants them to develop their eye-hand coordination and visual perception skills.

Sarah does teach some advanced classes, and she can picture some of those students using AI as a tool. She loves learning from students and is very open to them using all kinds of tools and sources of inspiration. However, there is a right way to borrow and use preexisting imagery, so she gives her students a list of things to consider when they are appropriating, that is, using the work of other artists in their own work. For example, they may not use more than 10% of another artist’s work, and they must also recontextualize the image. That would apply to AI as well.

For making her own art, Sarah prefers the cognitive benefits of not using computers to create. Similarly, in the classroom she plans to continue to place focus on students building the skills they need to interpret the world around them. Those skills will always be valuable.

Sedefka Beck (Economics): Green Light

Sedefka is excited about and motivated by new technology, so generative AI intrigues her. Moreover, she does not think we can stop our students from using AI any more than we can stop them from using Google or a calculator.

What is important to her is to teach her students critical thinking; she wants them to engage with AI in a way that will demonstrate both its benefits and its limitations. So for one assignment, she will have each student define an economic concept, and then ask AI to do the same. (In her own experiments with AI she has found that it provides good definitions, but does not apply them well.) In class discussion students will be asked to talk about the gaps between their own definition and AI’s and to explain how each could be improved.  In her class lectures Sedefka will then give the accepted definitions of these concepts.

Part of the point of this exercise is to teach students to use AI responsibly.  As she points out, you can prevent them from using AI (and Google and a calculator) during an in-class exam. But for any take-home assignments, they will have access to these tools, so she wants them to think critically about how to use them.

In her own field, Sedefka notes, AI is popping up everywhere. For example, there are new job positions for “AI professionals.” She herself is working on a paper on AI in the classroom and has found dozens of academic papers on AI posted in online repositories, bypassing the slower process of publication in official journals.

Next year Sedefka would like to organize a Faculty Learning Community (FLC) on “Using AI in the Classroom.”  Faculty would share specific AI-related assignments and activities. Participants would receive feedback on their own ideas and also benefit from learning about how other faculty are incorporating AI.

Ben Brobst-Renaud (Communication and Visual Arts): Green Light

Ben considers himself a “green light” on AI because he sees it as another language for his students to work with as they create. However, he views AI as neither a tool nor a collaborator, and as such it can easily cause you to stray from an assignment and go down a rabbit hole. He finds that AI works best if you use it to brainstorm rather than approaching it with a specific idea such as “I want to make this.”

He used AI in his course in Ideation, a class that deals with the creative process and includes non-majors. For example, students were asked to design a character with a costume.  Some students asked AI for images to start with. This was most helpful for students who were not comfortable drawing by hand. However, later on some students said they wished they had drawn it themselves after all.  

In his Media and Storytelling class, Ben has his students use AI to generate six-word stories, which it is pretty good at. For example, here is how AI summarized Moby Dick in six words: “Obsessed sea captain hunts vengeful whale.” He also asked his students to create a script for a fantasy book using several AI tools. They discovered that most are not good at creating stories.

Ben finds that student responses to AI range from total lack of interest, to negativity, to enthusiasm. Much like faculty, the majority of them are wary of AI.  Some of the art majors take pride in what they produce and they worry about whether AI will take jobs away from them.  

Ben plans to teach a course on Art and AI in the spring. The class will be open to all, not just art majors.  Students will work with AI, but the focus of the class will really be on thinking, iterating, and collaborating. 

Overall, Ben says he has not yet found the sweet spot for how best to infuse AI in his classes.

The future of AI and Art 

Ben expects that AI for artists and graphic designers will soon be monetized. He could imagine developers going the subscription route, charging as much as $200 per month to use AI as a high-end graphic tool. If that happens, the free versions will be comparatively inferior. He can also see AI generators working with existing image and photo libraries; since those require a fee, that cost will be passed on to users.

Brauer Exhibit: AI: Drawing Back and Forth

Ben has put together a body of work that he created using AI for a show at the Brauer Museum. He hopes that the show will stimulate discussion around AI. He especially wants to engage community members with ties to Valpo, as well as Core students. Look for information coming from the Brauer about the opening of this exhibit on Wednesday, October 4th.