Local media outlets have a pressing question on their minds: Is artificial intelligence working for or against them?
Tomas Dodds, journalism professor and founder of the Public Media Tech Lab at the University of Wisconsin-Madison, hopes to help local journalists understand the implications of AI in the newsroom by providing a variety of resources, such as training sessions and workshops.
“In the newsroom, you don’t know how your colleagues are using AI,” Dodds said, adding that the Public Media Tech Lab would facilitate newsroom discussions where coworkers disclose how they use the technology and build AI usage policies from those conversations.
In facilitating these discussions and hosting these workshops, Dodds hopes to help local media organizations develop clear guidelines for AI use that align with their newsroom’s specific values. Dodds said it is problematic when media organizations lack newsroom-specific AI guidelines, largely because that gap creates what he describes as a sense of “professional dissonance.”
“There is this mismatch that happens to journalists between the way that they think they should be doing their job and the way that they actually get to do their job,” Dodds said, stressing how unregulated AI use can put a given newsroom’s values at risk.
The Public Media Tech Lab also wants to help local media companies develop newsroom-specific AI technology, such as personalized, in-house large language models — generative AI systems akin to ChatGPT — that draw from a given newsroom’s article database as a tool for journalists.
“Personalized LLMs means being in conversation with your own archival information, with the history of all the publications you’ve ever done,” Dodds said, adding that the Public Media Tech Lab would help local newsrooms create these resources when they wouldn’t otherwise have the means to develop them on their own.
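The “conversation with your own archival information” that Dodds describes is typically built as retrieval-augmented generation, where the system first pulls the most relevant past articles from the archive and then hands them to the language model alongside the journalist’s question. The sketch below illustrates only that retrieval-and-prompting step, under loose assumptions: the Article class, the tiny in-memory archive, and the keyword-overlap scoring are hypothetical stand-ins, not a description of any tool the Public Media Tech Lab has built.

```python
# Hypothetical sketch of an "archive-aware" newsroom assistant: retrieve
# relevant past coverage, then fold it into a prompt for a language model.
from dataclasses import dataclass


@dataclass
class Article:
    headline: str
    body: str


# Stand-in for a newsroom's article database; a real archive would live in a
# search index or database, not a Python list.
ARCHIVE = [
    Article("City council debates downtown redevelopment",
            "The council reviewed competing bids for the downtown parcel..."),
    Article("Local schools adopt new budget",
            "The district approved a spending plan that..."),
]


def score(query: str, article: Article) -> int:
    """Crude relevance score: how many query words appear in the article."""
    words = set(query.lower().split())
    text = (article.headline + " " + article.body).lower()
    return sum(1 for w in words if w in text)


def build_prompt(query: str, top_k: int = 2) -> str:
    """Pull the most relevant archive pieces and fold them into an LLM prompt."""
    ranked = sorted(ARCHIVE, key=lambda a: score(query, a), reverse=True)[:top_k]
    context = "\n\n".join(f"{a.headline}\n{a.body}" for a in ranked)
    return (
        "Answer using only the newsroom's own archive below.\n\n"
        f"ARCHIVE:\n{context}\n\nQUESTION: {query}"
    )


if __name__ == "__main__":
    # The assembled prompt would then be sent to whichever locally hosted
    # model the newsroom runs.
    print(build_prompt("What has the council said about downtown redevelopment?"))
```

In a production setup the keyword matching would usually give way to embedding-based search over the full archive, but the overall flow, retrieve first and generate second, is what lets an in-house model answer questions grounded in a publication’s own reporting.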
The Public Media Tech Lab has worked with a variety of local media organizations, including Isthmus, where Dodds led a workshop with the staff.
Isthmus Editor Judy Davidoff said she is hesitant about AI and still very much in an experimental stage, but she is open to its possibilities as a resource and believes many of her colleagues feel the same way.
“I haven’t asked it to generate content,” Davidoff said, though she uses it to help come up with headline ideas for her articles. Even then, she never uses AI-generated headlines verbatim, treating them at most as a brainstorming resource.
One way AI has proved useful to Davidoff and her colleagues is transcription software.
AI-based programs like Otter.ai create transcripts of audio recordings and allow for easy searching throughout the transcript to locate specific quotes — something that can save invaluable time in journalists’ frequently packed work schedules.
Even then, Davidoff conceded that relying on AI transcription services can backfire, and she doesn’t allow herself to lean on them too heavily. She said she typically takes handwritten notes in addition to making an audio recording, adding that Otter often doesn’t transcribe interviews fully or correctly and isn’t always reliable on its own.
Despite this, she still notes how useful these types of programs can be for her and her coworkers.
Davidoff said she started wondering more about AI in her newsroom after a July Wisconsin State Journal article by reporter Audrey Korte was taken down due to “unauthorized use of AI.” The article, about the redevelopment of Madison’s downtown Brayton Lot, featured factually inaccurate, AI-generated information and sourcing, leading to Korte’s dismissal.
Davidoff said that while she believes the incident was an unusual case, the experimental nature of AI in the newsroom, paired with a lack of understanding of the technology, poses risks.
“It’s an organizational failure of a media organization that is not having a conversation with their workers about how they’re going to use artificial intelligence,” Davidoff said, adding that if news organizations don’t have these conversations with their journalists, incidents like the Brayton Lot piece could become more than a journalistic anomaly.
“We have so few people in newsrooms all over,” Davidoff said. And as more newsrooms become understaffed, she said, AI misuse could be a growing threat.





