The Daily Cardinal Est. 1892
Friday, November 07, 2025

‘A new era of learning’: Professors grapple with AI in the classroom

Professors at the University of Wisconsin-Madison are using artificial intelligence in their classrooms to aid student learning. They also grapple with teaching students to use AI ethically without circumventing their learning.

As artificial intelligence infiltrates nearly all facets of society, an education at the University of Wisconsin-Madison is no exception. While professors use AI to provide accessible materials, they want their students to understand how to use AI tools without circumventing learning.  

“Any class in 2025 that doesn’t use AI in some way, shape or form will do so within five years,” Dietram Scheufele, a life sciences communication professor and expert on misinformation and AI, told The Daily Cardinal.

While professors vary in their openness to AI tools, most agree on the need to teach students how to use AI ethically, identify misinformation and recognize the dangers of the technology.

“If we don’t have people who are informed about AI and don’t know how to use AI, we’re setting ourselves up for the next problem with technology,” David Williamson Shaffer, an education psychology professor, said.

UW-Madison partners with AI providers, giving students and staff free access to Microsoft Copilot and Google Notebook LM. Nearly every course now provides an AI policy on the syllabus, since current institution-wide policies allow professors and departments to develop their own guidelines.

UW-Madison released revised generative AI policies in August 2024 detailing general rules and regulations as well as providing trustworthy AI tools. 

“AI at UW is not any different than what any other schools are facing, but I think the more we can, as a university, I think, play a leadership role in really realizing that we’re entering a whole new different era of learning,” Scheufele said. 

AI in the classroom

Scheufele is teaching Life Sciences Communication 251 this semester, an introductory class on science, media and society. He uploads all his course materials including lectures, notes and readings to Google Notebook LM, allowing students an AI tool that relates directly to course materials. 

Scheufele said Notebook LM assists students better than other chatbots because of its direct connection to the course. “It will give you an unbelievably good answer because it has listened to every word I’ve said all semester and has read everything that I’ve assigned,” he said.

Notebook LM is a tool built on a large language model (LLM) that UW-Madison provides as a free, vetted resource for students. It assists in various ways, including generating audio overviews and creating visual aids.

“The system learns from the [resources] I upload. When students interact with Notebook LM, they can ask any question that you would ask a normal LLM, but they know that the content is instructor approved,” Scheufele said.


Scheufele has struggled to figure out how to test students with this system.

“How do we test students’ abilities to recognize patterns and basically put their degree to use 20 years from now, when everything will be AI?” Scheufele said.

Instructors are increasingly returning to bluebooks and pen and paper exams. 

Dr. Ann Wieben, assistant professor in the School of Nursing, said she utilized AI to reform an annual project she assigns to her students. Students interview healthcare professionals for the project, but some students found arranging interviews difficult. 

Wieben wanted to provide students with the option to use an interview transcript if they were not able to conduct live interviews. This year, she used Microsoft Copilot’s AI agent to help create a transcript based on a document that laid out the details of a healthcare process. 

Wieben noticed the AI output only needed a few minor adjustments to ensure factual integrity, but otherwise worked well. “It would have taken me a few hours to create a synthetic transcript from scratch, which may have prohibited me from providing this assignment alternative to students,” she said.

In other classrooms, professors more tightly regulate AI use as something to work with rather than rely on.

Rachel Truitt, an art professor, said design instructors came together at the beginning of the semester to create their AI policy. “We allow AI for the use of editing, idea generation and producing small amounts of design assets, but it absolutely must be disclosed so we can evaluate assignments appropriately,” she said.

Truitt is teaching graphic design this semester and believes “AI is a creative partner, not a producer.”

“As AI grows, we're going to have to push back on the idea that the bots can do everything our brain can do. It's just not true,” Truitt said. “We need to test things out, talk to others and collaborate with actual people. If we don't, we won't learn.”

Beyond aiding student learning and saving instructors time, AI can assist students whose first language isn’t English.

Multilingual learner students are able to ask AI to read an assigned article and discuss it with AI, Shaffer said. “They’re able to ask questions and clarify things in their first language,” he said.

“Sometimes, in many cases, they’re also composing in their first language and then getting a translation. Whatever the assignment is, they compose it and then AI can translate it, and they can read it over to make sure that it’s actually saying what they were intending to say,” he said. 

Shaffer acknowledged the additional challenge of predominantly English instruction that multilingual learners face. “[AI] helps close that gap.”

Despite AI’s usefulness in providing resources and facilitating learning, professors have urged caution in employing it.

Yonatan Mintz, an engineering professor and AI expert, told the Cardinal that reliance on AI can cause cognitive shortcutting.

“If students just put their assignment questions into [AI], then they will not learn the underlying material and will have a more difficult time in the future understanding the difference between good and bad output,” he said. 

Mintz has also noticed students are sometimes embarrassed to ask faculty for help and go to AI instead. “This is really detrimental and leads to a lot of confusion on the part of the student, and is harmful for both professional and academic development,” he said.

Excessive use of AI by staff can also lead to bad outcomes, Mintz said. “Faculty that use AI to grade assignments or take notes are not devoting the energy needed to properly assess students, which could be detrimental in the long run,” he said.

AI agents draw on sources from across the internet to answer questions, which can jumble information and spread misinformation. This can lead faculty to communicate faulty material in class or students to submit inaccurate work.

Scheufele said AI will summarize, attribute and cite incorrectly. “AI systems typically or routinely hallucinate, meaning they give what they think are correct answers because AI doesn’t know truth — it can only make probabilistic judgments,” he said.

Scheufele noted an instance where AI predicted events that hadn’t happened yet. He asked ChatGPT to tell him what the natural disasters were this year, and while it did complete that task, it also gave answers for what natural disasters “happened” in December. 

Shaffer said AI poses a huge challenge at every level, but not the challenge many are worried about. 

He said AI is often biased in its algorithms and that understanding AI comes before using it. “You get many instances where it’s reinforcing systemic racism and other biases. You need to learn how to use AI well, as opposed to just taking its biases [and] not worrying about what it misses when it summarizes things,” Shaffer said.

Mintz said courses need to be designed to give students the background they need to identify misinformation in AI outputs, not to avoid AI outright. “Since using LLMs is going to be a predominant thing in the workforce, students need to learn how to incorporate these tools into their workflows and be proficient in them,” he said.
