What first struck Powell Tribune reporter CJ Baker as a bit odd were comments attributed to the Wyoming governor and a local prosecutor; then some of the phrasing in the articles sounded almost robotic.
But the definitive reveal that a reporter from a competing news outlet was using generative artificial intelligence in their writing came in a June 26 article about comedian Larry the Cable Guy being selected as grand marshal for the Cody Stampede Parade.
“The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence led by one of comedy’s most beloved figures,” The Cody Enterprise reported. “This structure ensures the most important information is presented first, helping readers get the gist quickly.”
Baker, a reporter for more than 15 years, began investigating and eventually met with Aaron Pelzer, 40, a newcomer to journalism who, Baker said, admitted to using AI in his stories before leaving The Enterprise.
The publisher and editors of the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have apologized and vowed to take steps to ensure it never happens again. In an editorial published Monday, Enterprise editor Chris Bacon said the paper “failed to detect” the AI’s copying and false quotations.
“It doesn’t matter that the misquote was a clear mistake by a hurried rookie reporter who trusted an AI — it was my job,” Bacon wrote. He apologized, saying the AI “was allowed to put words into the story that were never spoken.”
Journalists were ruining their careers by fabricating quotes and facts long before the advent of AI, but this latest scandal illustrates the potential pitfalls AI poses to many industries, including journalism, as chatbots can spit out fake, if somewhat plausible, stories with just a few prompts.
AI has a role to play in journalism, such as automating certain tasks, and some newsrooms, including The Associated Press, are using AI to free up reporters to focus on more impactful work, but most AP staffers aren’t allowed to use generative AI to create publishable content.
The AP has been using technology to help with financial reporting since 2014 and more recently with sports reporting. It is also experimenting with an AI tool to translate some of its articles from English to Spanish. Each article includes a note at the end explaining the role of technology in producing it.
Being upfront about how and when AI is used has proven important. Sports Illustrated magazine was criticized last year for publishing AI-generated online product reviews that appeared to have been written by reporters who didn’t actually exist. After the story broke, Sports Illustrated said it was firing the company that produced the articles for its website, but the incident tarnished the once-powerful magazine’s reputation.
In the Powell Tribune article that broke the news of Pelzer’s AI use, Baker wrote that he had an awkward but friendly meeting with Pelzer and Bacon. During the meeting, Pelzer said, “Obviously, I have never intentionally tried to misquote anyone,” and promised to “correct, apologize, and state that I was wrong,” Baker wrote. Baker also noted that Pelzer insisted his mistakes should not reflect on the editors at the Cody Enterprise.
After the meeting, The Enterprise launched a full investigation into the articles Pelzer wrote during his two months with the paper. Bacon said Tuesday that the review had found seven articles containing AI-generated quotes attributed to six different people, and that he is still examining others.
“That’s a very credible statement,” Bacon said, noting that the people he contacted while reviewing Pelzer’s articles said the quotes sounded like something they would say, but that they had never actually spoken to Pelzer.
Baker reported that seven people told him they had been quoted in articles written by Pelzer, but that none of them had actually spoken to him.
Pelzer did not respond to an Associated Press phone message, left at a number listed as his, seeking to discuss what happened, and Bacon said Pelzer declined to discuss the matter with another Wyoming newspaper that contacted him.
Baker, who reads the Enterprise regularly because it’s a competitor, told The Associated Press that a combination of phrases and quotes in Pelzer’s articles raised his suspicions.
Pelzer’s article about the Yellowstone National Park shooting included the sentence, “This incident serves as a harsh reminder that human behavior is unpredictable, even in the most benign circumstances.”
Baker said the line sounds like a chatbot-generated summary of his story, with some kind of “life lesson” added at the end.
Another article, about a poaching conviction, quoted wildlife officials and a prosecutor as if the quotes came from a news release, Baker said, but there was no news release, and the agencies involved didn’t know where the quotes came from.
Two of the articles in question contained false quotes from Wyoming Governor Mark Gordon, but Baker said he only learned of them after contacting the governor’s staff.
“In one case, (Pelzer) wrote an article quoting the governor’s statements about new OSHA regulations that were completely fabricated,” governor spokesman Michael Pearlman said in an email. “In the second case, he appears to have fabricated portions of a statement and combined them with portions of a statement that were included in a news release announcing the new director of the Wyoming Game and Fish Department.”
The most obviously AI-generated copy appeared in an article about Larry the Cable Guy, which ended with an explanation of the inverted pyramid, a basic approach to writing breaking news stories.
Creating an AI story isn’t hard: Users can feed criminal affidavits into an AI program and have it write a story about the incident, complete with quotes from local officials, said Alex Mahadevan, director of the Digital Media Literacy Project at the Poynter Institute, a prominent journalism think tank.
“These generative AI chatbots are programmed to give you an answer, regardless of whether that answer is complete nonsense,” Mahadevan said.
Cody Enterprise publisher Megan Barton wrote an editorial calling AI “a new and advanced form of plagiarism, and in the media and writing world, plagiarism is something every outlet has had to correct at some point. It’s an ugly part of the job. But the outlets willing to correct (or quite literally write) these errors are the ones with good reputations.”
Barton wrote that the newspaper has learned its lesson and will implement a system to recognize AI-generated articles and “engage in longer discussions about why AI-generated articles are unacceptable.”
The Enterprise didn’t have an AI policy, in part because it seemed obvious that journalists shouldn’t use AI to write stories, Bacon said. Poynter offers a template that news organizations can use to build their own AI policies.
Bacon plans to have one in place by the end of the week.
“This will be a topic of pre-employment discussion,” he said.