As artificial intelligence (AI) gains prominence, news agencies are exploring its potential in journalism, yet the industry grapples with trust and authenticity issues. The Associated Press (AP) recently outlined its stance on AI, signaling a cautious approach.

AP affirmed that it won’t use AI to create publishable content or images but will continue to experiment with the technology. This stance reflects concerns about audience trust in AI-generated news.

Conversely, Newsquest Media Group’s job listing for an “AI-assisted reporter” signals a more open embrace of AI. The role involves using AI to create national, local, and hyper-local content and integrating AI-generated material into newsrooms. This reflects the industry’s division over AI’s role in news creation, with dedicated courses now available to train journalists in working with the technology.

Charlie Beckett, who leads the LSE’s JournalismAI project, acknowledges that AI is reshaping the journalism landscape. However, he emphasizes that AI is a “language machine” rather than a “truth machine,” underlining the enduring importance of human involvement in journalism.

News organizations are approaching the AI revolution in markedly different ways. AP released guidelines for AI use, emphasizing careful vetting of AI-generated material and limiting its use to stories that are specifically about AI-generated content. The agency sees potential in AI for tasks like assembling story digests for newsletters.

Reuters, AP’s rival, has adopted a “responsible approach” to AI, focusing on accuracy and trust. It commits to using AI editorially only when it contributes to original journalism, always with human oversight and senior editor approval. AI is intended to aid journalists in tasks such as analyzing data, issuing corrections, and reducing workload.

The Guardian takes a similar approach, limiting AI use to editorial functions, especially when dealing with extensive data sets. Transparency, permissions, and fair compensation are crucial considerations when selecting AI tools.

While major news organizations remain cautious about AI, in part because of copyright concerns, smaller newsrooms are finding opportunities. News Corp Australia uses generative AI to produce thousands of local stories weekly, covering topics like weather, fuel prices, and traffic conditions. A local newspaper in Nottinghamshire, UK, is trialing AI-generated bullet-point summaries for some articles, maintaining transparency by adding a note explaining AI’s involvement.

The news industry’s foray into AI reflects its evolving nature, with organizations balancing innovation with credibility and accountability.

By Impact Lab