
AI-made fake nude pictures of celebrities – cases for Meta's oversight board

AI-produced fake nude pictures on social media are a growing problem. Two cases involving fake pictures of celebrities are now up for scrutiny: one shows a well-known person naked, the other is a pornographic image of a woman and a man. The persons who posted the pictures on Meta-owned Facebook and Instagram have turned to Meta's oversight board to complain that the pictures were taken down.

The board says it selected these cases to assess whether Meta's policies and its enforcement practices are effective at addressing explicit AI-generated imagery.

Meta took down both posts for violating its Bullying and Harassment policy, which prohibits "derogatory sexualized photoshops or drawings". For one piece of content, Meta also said it violated the Adult Nudity and Sexual Activity policy.

The oversight board has decided to address both cases together. For each case, the board will decide whether the content should be allowed on Instagram or Facebook.


The first case involves an AI-generated image of a nude woman posted on Instagram. The image was created using AI to resemble a public figure from India. The account that posted this content shares only AI-generated images of Indian women.

“The majority of users who reacted have accounts in India, where deepfakes are increasingly becoming a problem”, the oversight board says.

In this case, a user reported the content to Meta for pornography. As a result of the board selecting this case, Meta decided that its earlier decision to leave the content up was in error and removed the post.

The second case concerns an image posted to a Facebook group for AI creations. It is an AI-generated image of a nude woman with a man groping her breast. The image was created with AI to resemble an American public figure, who is also named in the caption. The majority of users who reacted have accounts in the United States.

"In this case, a different user had already posted this image, which led to it being escalated to Meta's policy or subject matter experts who decided to remove the content as a violation of the Bullying and Harassment policy, specifically for 'derogatory sexualised photoshop or drawings'," the oversight board explains.

The image was added to a Media Matching Service Bank – part of Meta’s automated enforcement system that automatically finds and removes images that have already been identified by human reviewers as breaking Meta’s rules. Therefore, in this case, the image was already considered a violation of Facebook’s Community Standards and removed. 
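For readers curious about the mechanics, the sketch below illustrates the general pattern such a matching bank implies: perceptual hashes of images that human reviewers have judged violating are stored, and new uploads are compared against the bank so near-duplicates can be removed automatically. This is a simplified illustration, not Meta's actual system; the `MediaMatchingBank` class, the average-hash scheme, the distance threshold, and the file names are all assumptions for demonstration, and the example relies on the Pillow imaging library.

```python
# Hypothetical sketch of a media-matching bank. NOT Meta's implementation;
# it only illustrates the pattern described above: hash reviewer-flagged
# images, then auto-match new uploads against the banked hashes.

from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, threshold each pixel on the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits  # a 64-bit fingerprint for the default 8x8 grid

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

class MediaMatchingBank:
    """Stores hashes of images reviewers have already judged violating."""

    def __init__(self, max_distance: int = 5):
        self.banked: dict[int, str] = {}  # hash -> violated policy label
        self.max_distance = max_distance  # tolerance for near-duplicates

    def bank(self, path: str, policy: str) -> None:
        """Called after a human reviewer confirms a violation."""
        self.banked[average_hash(path)] = policy

    def match(self, path: str) -> str | None:
        """Return the violated policy if an upload resembles a banked image."""
        h = average_hash(path)
        for banked_hash, policy in self.banked.items():
            if hamming(h, banked_hash) <= self.max_distance:
                return policy
        return None

# Illustrative usage (file names are placeholders):
# bank = MediaMatchingBank()
# bank.bank("flagged_by_reviewer.jpg", "Bullying and Harassment")
# if (policy := bank.match("new_upload.jpg")):
#     print(f"Auto-removed: matches banked content under {policy}")
```

The hash-based design explains why the second image could be removed without fresh human review: once one copy was banked, re-uploads of the same or near-identical picture matched automatically.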

