The difficulties with ethical guidelines for genAI in the newsroom


Most newsrooms have already experimented with generative AI technologies like ChatGPT, but not necessarily to create content. Setting ethical guidelines and applying debiasing techniques seem to be the most difficult areas for many newsrooms. The use cases are quite diverse: writing code, producing summaries, and enhancing headlines and SEO, professor Charlie Beckett, founding director of the LSE JournalismAI project, writes in a blog post.

These are preliminary results from a survey of small and large newsrooms in Latin America, Africa, the Middle East and North Africa, Asia Pacific, Europe, and North America. The final report is expected to be published this autumn.

A recent report by publisher organisation WAN-IFRA shows that almost half (49%) of newsrooms are using tools like ChatGPT, but only 20% have management guidelines on when and how to use generative AI tools. 70% said they expect generative AI tools to be helpful.

Read Also:  50% of newsrooms use artificial intelligence but only 20% have rules

One of the first newsrooms to use AI is US news agency Associated Press, which has for many years used AI to write short news stories on companies' financial reports. The agency has said the automated production allowed it to increase the number of news reports while also limiting mistakes.

Beckett writes that most respondents in the LSE survey agree that generative AI technologies present a new set of opportunities other AI technologies have not provided, but they are more divided as to whether they also bring a unique set of challenges.

He reports that more than half the respondents said they hoped to automate mundane tasks and simplify workflows to free up journalists to engage in “more creative, relevant, and innovative work.”

The survey shows that limited resources and technical expertise remain the most significant challenges to AI adoption in the newsroom, as respondents also reported in 2019 when the project conducted a similar survey. The answer, however, is not simply to hire more technical people.

“Many respondents told us that achieving interoperability and synchronisation with other departments was a challenge.” 

“It’s also about bridging the knowledge gap between technologists and journalists, in terms of tech skills and journalistic skills as well. This requires a nuanced understanding by the newsroom leadership of the types of training needed for each department.” 

“For non-English speaking journalists, the challenges are more pronounced. Respondents highlighted the limitations of AI tools, such as transcription tools, in languages other than English, and algorithmic bias is experienced at seemingly higher margins than in English.”

This has pushed some to develop their own tools in-house, which takes considerable time and resources, Beckett says.

“Despite those challenges, the enthusiasm – as well as the scepticism – for AI in the newsroom remains high!”, Beckett writes.

Read Also:  AI-generated content farms attracting ad money from major companies
Read Also:  Is using generative AI moving too fast and risk breaking things?


Moonshot News is an independent European news website for all IT, Media and Advertising professionals, powered by women and with a focus on driving the narrative for diversity, inclusion and gender equality in the industry.

Our mission is to provide top and unbiased information for all professionals and to make sure that women get their fair share of voice in the news and in the spotlight!

We produce original content, news articles, a curated calendar of industry events and a database of women IT, Media and Advertising associations.


