Deepfakes of special concern for women journalists

Deepfakes are among the most prominent examples of manipulated media today. Advances in technology are making them simpler to produce with less training, allowing perpetrators to manipulate audio clips, photos and videos, as well as combinations of multimedia content. This is especially concerning for women journalists, as the most common form of manipulated media is falsified sexual imagery, which can be used to silence them.

This is what Sam Gregory, program director of WITNESS, an organization that uses video and technology to defend human rights, told a seminar organised by the International Center for Journalists and reported on the organisation's website, IJNet.

It is an area where countermeasures currently fall short. "Detection is inadequate," he said. "It's problematic and there aren't a great range of solutions in the area. We need to look at how this contributes to existing challenges for journalists who are under-resourced."

The same tools and techniques used to create synthetic media can also be leveraged to detect it. Journalists should look for glitches in videos and apply existing verification and forensics techniques to spot manipulated media. They can also use encryption and emerging AI-based tactics such as infrared detection.


As they engage in efforts to combat deepfakes, journalists should also keep ethical considerations in mind. Deepfakes can be used for satire, or to protect identities, for instance, leading to some hesitation around efforts to stem their use, IJNet notes.

In addition, journalists should ask themselves:

  • How can we teach people to spot deepfakes?
  • Do tools for detection exist, and who has access?
  • How do we build on existing journalistic skills and coordination?
  • Do tools for authentication exist, and who may lack access to them?

When they publish, journalists should show their readers evidence that the content is authentic. They should also incorporate information about synthetic media into media literacy efforts for their audiences, Gregory recommended. One helpful approach to share is SIFT: "Stop, Investigate the source, Find better coverage, and Trace claims, quotes and media to the original context."

Organizations like Google, Adobe and The New York Times are developing tools to help journalists identify and counter deepfakes, and to ensure their work includes evidence of legitimacy, IJNet says.


“It is important to center journalists as one of the key groups who need to really identify what they need in this landscape,” said Gregory.

"Most of our work is helping people to create trustworthy information. When we look at deepfakes, part of the solution is how we reinforce an ecosystem of trustworthiness."

One conclusion: as technology makes deepfakes easier and cheaper to create, especially on mobile devices, journalists need to remain vigilant against future threats.

Moonshot News is an independent European news website for all IT, Media and Advertising professionals, powered by women and with a focus on driving the narrative for diversity, inclusion and gender equality in the industry.

Our mission is to provide high-quality, unbiased information for all professionals and to make sure that women get their fair share of voice in the news and in the spotlight!

We produce original content, news articles, a curated calendar of industry events and a database of women IT, Media and Advertising associations.

[email protected]

