News publishers’ horror stories about publishing error-riddled, AI-generated content are by now well known. Other uses of artificial intelligence have helped journalists make headlines for the right reasons – like The New York Times using computer vision on satellite images to count bomb craters in Gaza.
But beyond the most high-profile successes and stumbles, prudent but impactful applications of generative AI are quietly transforming newsrooms around the world. Generative AI creates new opportunities to improve both the reporting process and storytelling itself. The latest AI can equip journalists with powerful tools to elevate their craft – and in some cases already does – from helping them edit copy to extracting insights hidden in immense troves of data. As the technology evolves, its capacity to improve the accuracy, efficiency and depth of reporting could change the face of journalism and the news industry.
A 2023 survey of 105 news organizations in 46 countries found that most see potential benefits for journalism in generative AI tools like ChatGPT. In the survey, conducted by JournalismAI, a global initiative to inform news organizations about artificial intelligence, almost three-quarters of respondents said such applications of AI create new opportunities in the field. And 85% of respondents – including journalists, technologists and newsroom managers – had experimented with using AI for tasks such as creating images and generating story summaries.
Semafor, which launched six weeks before ChatGPT, is one of the newsrooms experimenting. The media startup uses two AI tools: an internal editing bot and MISO, which helps find the stories that power Signals, a breaking news feed produced by journalists with generative AI to highlight news from publications around the world. Semafor’s approach is to take a clear-eyed look at AI’s current capabilities and limit its applications to those areas, said Gina Chua, Semafor’s executive editor.
“Basically you have an English major that can do a lot of things,” Chua said.
Beyond these tools, she is experimenting with so-called retrieval-augmented generation, or RAG. The technique lets an AI chatbot pull in relevant information in real time, prioritizing a library of data that Chua supplies. This approach helps combat so-called hallucinations, instances in which generative AI models make things up.
Chua built one RAG tool using the Justice Department’s definition of a hate crime and fed the chatbot specific scenarios, such as a white man attacking an Asian woman. The chatbot would judge whether the scenario was a hate crime and explain the reasoning behind the call. For a second model, she loaded the style and coverage guidelines of the Trans Journalists Association, then asked the AI to give feedback on published articles and whether they contained misinformation about transgender issues. Both proved successful, and Chua wants to explore building RAG models that could help Semafor’s journalists figure out how to report on sensitive topics or quickly get up to speed on a subject.
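The RAG setup described above can be sketched in a few lines. This is a minimal illustration, not Semafor’s actual code: the function names and the toy word-overlap scoring are assumptions for the example; a production system would typically use embedding search and send the assembled prompt to a large language model.

```python
# Minimal sketch of retrieval-augmented generation (RAG): retrieve the most
# relevant passages from a trusted library, then ground the chatbot's prompt
# in them so the model answers from supplied facts rather than inventing them.

def _tokens(text):
    """Lowercase word tokens, used for a toy relevance score."""
    return set(text.lower().split())

def retrieve(question, library, k=1):
    """Return the k library passages sharing the most words with the question."""
    scored = sorted(
        library,
        key=lambda passage: len(_tokens(passage) & _tokens(question)),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question, library):
    """Assemble a grounded prompt: instructing the model to answer only from
    the retrieved context is what suppresses hallucinations."""
    context = "\n".join(retrieve(question, library, k=2))
    return (
        "Answer using only the context below. If the context is not "
        "sufficient, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

# Illustrative stand-ins for the curated library Chua provides.
library = [
    "A hate crime is a crime motivated by bias against race, religion, "
    "disability, sexual orientation, ethnicity, gender or gender identity.",
    "Skift covers business news about the travel industry.",
]
prompt = build_prompt("Was this attack motivated by bias a hate crime?", library)
```

The key design point is that the model’s answer is constrained to the supplied library (here, a definition standing in for the Justice Department text), rather than to whatever the model memorized in training.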
Chua built these chatbots in her spare time using easy-to-use tools. She said journalists should experiment with artificial intelligence rather than dismiss it for not being as good as they are at certain tasks.
“It’s probably not going to be as good as a human at some things,” she said. “It’s a mistake to say, ‘I want it to be a human.’ The trick is to say, ‘I want this to be as good a tool as it can be – how do I complement it, and how does it complement me?’”
Chatbots driving subscriptions
The most common way people interact with generative AI is through chatbots such as OpenAI’s ChatGPT or Google’s Gemini. Newsrooms are creating chatbots for their readers, too, including Skift, a business news site covering the travel industry.
ChatGPT had only been out for a few weeks when Skift CEO Rafat Ali told an audience of travel industry professionals that “this is going to be huge, we need to start working on it now,” recalled Jason Clampet, co-founder and president of Skift.
Clampet took the advice to heart, and Skift engineers soon created an AI assistant called “Ask Skift.” Trained on 30,000 Skift articles and reports, the chatbot can answer users’ travel questions and even suggest existing Skift stories for further reading.
“It’s a great way for us to learn how to cover stories,” Clampet said. “Another news outlet might just write about something as an experienced observer. We can say, ‘Oh, we get the idea.’”
Ask Skift now answers thousands of questions every week. The chatbot works like Skift’s paywall, letting users ask three questions for free before prompting them to become a paying subscriber. Clampet said the Ask Skift paywall already drives about $365 in annual subscriptions.
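The three-questions-then-subscribe meter described above is a common paywall pattern, and can be sketched as a simple per-user counter. All names here are hypothetical; Skift’s actual implementation is not public.

```python
# Toy sketch of a metered chatbot paywall: each user gets a fixed number of
# free questions, after which the subscription prompt is shown instead.

FREE_QUESTIONS = 3  # matches the three free questions described in the article

class QuestionMeter:
    def __init__(self, free_limit=FREE_QUESTIONS):
        self.free_limit = free_limit
        self.asked = {}           # user_id -> free questions used so far
        self.subscribers = set()  # user_ids with paid access

    def can_ask(self, user_id):
        """Subscribers always pass; free users pass until the limit."""
        if user_id in self.subscribers:
            return True
        return self.asked.get(user_id, 0) < self.free_limit

    def record_question(self, user_id):
        """Return True if the question is served, False if the paywall shows."""
        if not self.can_ask(user_id):
            return False
        if user_id not in self.subscribers:
            self.asked[user_id] = self.asked.get(user_id, 0) + 1
        return True

meter = QuestionMeter()
served = [meter.record_question("reader-1") for _ in range(4)]
# the first three questions are served; the fourth hits the paywall
```

Tying the meter to the same identity system as the site paywall is what lets the chatbot act as a subscription funnel rather than a free alternative to it.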
He said Skift monitors reader questions to identify trends and potential story ideas. That has led to new coverage, like an article on why travel costs have risen so much.
“It’s almost the same way someone might search Google Trends to find out what people are doing and what they’re looking for,” Clampet said. “We can see what people have been asking about, and here’s how we can take it further.”
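The Google Trends-style mining of reader questions that Clampet describes amounts to counting recurring terms across the question log. A minimal sketch, with a made-up stopword list and question log for illustration:

```python
# Sketch of surfacing story ideas from logged chatbot questions: strip common
# words, count what remains, and report the most frequent terms as "trends".
from collections import Counter

STOPWORDS = {"why", "what", "is", "are", "the", "a", "so",
             "how", "do", "did", "to", "in", "this"}

def trending_terms(questions, top_n=3):
    """Return the top_n most frequent non-stopword terms across questions."""
    words = Counter()
    for q in questions:
        words.update(
            w for w in q.lower().strip("?").split() if w not in STOPWORDS
        )
    return [term for term, _ in words.most_common(top_n)]

# Hypothetical reader questions, like those logged by a news chatbot.
questions = [
    "Why are flights so expensive?",
    "Why did hotel prices rise?",
    "Are flights expensive this summer?",
]
terms = trending_terms(questions)
```

A real pipeline would likely cluster questions by meaning rather than by exact words, but even this crude count would surface "flights" and "expensive" as a candidate story, much like the travel-costs article mentioned above.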
Ask Skift was just the first experiment in generative AI – the engineering team has also released an app that lets users ask questions in Slack, a widely used office chat platform. Clampet said that as the technology becomes easier to use, more ideas are under consideration.
“Most of it is just trial and error and figuring out how to do it a little better each time,” he said.
“A rush to move forward”
The high-profile errors at Sports Illustrated are just the tip of the iceberg. Nearly a dozen other sites have published AI-generated articles containing errors, an issue extensively documented by the technology site Futurism.
The missteps appear to be the result of companies rushing to be first to deploy generative AI without putting thorough verification processes in place, said Felix M. Simon, a communications researcher and doctoral student at the Oxford Internet Institute.
“From the outside, we see that in all cases there was a rush to make progress and implement artificial intelligence as quickly as possible,” he said.
Simon said it is imperative to involve journalists in checking AI output during production and again before publication. The need for human oversight is another reason journalists will have to learn how to use AI tools.
“There will be a learning curve,” Simon said. “We will have to get used to working with these systems if we want to work with them in the first place and identify their strengths, but also their weaknesses.”
“Artificial intelligence is a tool. It’s just a tool”
While human oversight remains crucial, Simon cautioned against alarmism about AI’s capacity to displace jobs. AI won’t be a reason to replace journalists in the near future – deeper systemic problems, including cost-cutting pressures and the lack of sustainable business models for media, led to mass layoffs of journalists long before generative AI.
Still, as the technology advances, it may require less human assistance and, even at current capabilities, reduce the need for staff. Simon said management could also use AI to justify further layoffs.
Others fear that reliance on giant tech companies could exacerbate the news media’s already significant problems. Rodney Gibbs, senior director of strategy and innovation at The Atlanta Journal-Constitution, wrote late last year in Nieman Lab that the chatter around ChatGPT echoes the “pivot to video” that led many newsrooms astray in the Facebook era of digital media.
It’s a concern that Nikita Roy, who directs the AI Journalism Lab at the Craig Newmark Graduate School of Journalism, has heard, but one she doesn’t give much weight. This time, newsrooms don’t depend on tech companies to reach their audiences, she said. Instead, newsrooms are customers buying a product, which gives news outlets more leverage in the relationship.
Importantly, AI tools are simple enough to use that even the smallest newsrooms can benefit from them, Roy said. Used properly, AI can bring gains to an ever-beleaguered industry.
“Artificial intelligence is a tool,” she said. “It’s just a tool. But it’s a tool that will help us do more work and reach more audiences, and those are two things we need to do.”