Deep-fake video ads a threat in the big 2024 election year
The 2024 elections are at risk of manipulation by a large volume of high-quality AI-generated falsehoods. More than 100 deep-fake video advertisements impersonating Prime Minister Rishi Sunak were promoted as paid ads on Meta’s platform over the past month, according to a study by London-based social media monitoring company Fenimore Harper Communications. 2024 is a gigantic election year, with more than four billion people in 76 countries going to the polls.
“Over the last year, Fenimore Harper has been monitoring advances in generative artificial intelligence and how it enables misinformation. Since the middle of 2023, we have observed dramatic advancements in voice cloning, generative video AI tools and ‘deep-fakes’.
“With the advent of cheap, easy-to-use voice and face cloning, it takes very little knowledge and expertise to use a person’s likeness for malicious purposes.”
“Unfortunately, this problem is exacerbated by lax moderation policies on paid advertising. These adverts are against several of Facebook’s advertising policies. However, very few of the ads we encountered appear to have been removed,” the company says in its study of AI-generated fake video advertising.
The deep-fake video advertisements impersonating Prime Minister Rishi Sunak may have reached over 400,000 people, despite explicitly breaking several of Meta’s ad policies, the report says.
A total of 143 individual ads were uncovered on Meta’s platform, and up to £12,929 was spent on them over the past month, December 8 – January 8. They were published by what appears to be a widespread network of Facebook pages.
Funding for the ads originates from 23 different countries including Turkey, Malaysia, the Philippines and the United States.
“It appears to be the first widespread paid promotion of a deep-faked video of a UK political figure. This is likely to be the first of many AI-enhanced misinformation campaigns of 2024, raising questions about social media platforms’ ability to moderate election interference at scale.”
“It is unclear whether the page owners are aware of the scam they are publishing. Many of the pages promoting the scam appear to be legitimate businesses, such as ‘Plural Brother Filmes’, a São Paulo-based film production company, and ‘Binimoy Printers’, a printing company based in Bangladesh. This indicates that their Meta advertising credentials may have been compromised.”
“Meta does not provide specific numbers for how many people have been reached by political advertising, only estimated ranges. According to those ranges, up to 462,000 people may have been reached by this malicious advertising.”
“Most achieved only a couple of thousand ‘impressions’, but some stretched into the tens of thousands. Taken as a whole, that adds up to a large number of people reached by a deep-faked endorsement from Prime Minister Rishi Sunak.”
To protect against being targeted, the company advises:
1) Invest in social media monitoring of public figures connected to your organisation (see the sketch after this list).
2) Teach your organisation how to spot ‘deep fakes’. While the platforms themselves don’t appear able to identify deep-faked content, a human can still discern real humans from bots quite well.
3) Be mindful of AI-powered tools in your workflow. Many popular platforms for campaigning, such as Adobe’s Creative Suite or Canva, have AI built into their features. Ensure these aren’t used by people on your team to deceive the public, and define acceptable use within your teams.
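For the monitoring recommendation above, one possible starting point is Meta’s public Ad Library API, which exposes the same estimated impression and spend ranges cited in the report. The following is a minimal sketch, not the tooling Fenimore Harper used; the endpoint, field names and API version follow Meta’s published Ad Library documentation at the time of writing and should be verified before use, and the access token is a placeholder you must obtain yourself.

```python
# Sketch: poll Meta's Ad Library API for paid ads that mention a public figure.
# Assumes a valid Ad Library access token and that the endpoint/fields below
# still match Meta's published documentation (check the current API version).
import requests

ACCESS_TOKEN = "YOUR_AD_LIBRARY_ACCESS_TOKEN"  # placeholder, not a real token
ENDPOINT = "https://graph.facebook.com/v18.0/ads_archive"

params = {
    "search_terms": "Rishi Sunak",            # public figure to monitor
    "ad_type": "POLITICAL_AND_ISSUE_ADS",      # restrict to political/issue ads
    "ad_reached_countries": '["GB"]',          # ads delivered in the UK
    "fields": "page_name,ad_delivery_start_time,impressions,spend",
    "access_token": ACCESS_TOKEN,
}

response = requests.get(ENDPOINT, params=params, timeout=30)
response.raise_for_status()

for ad in response.json().get("data", []):
    # Meta reports impressions and spend only as estimated ranges, not exact figures.
    impressions = ad.get("impressions", {})
    print(
        ad.get("page_name"),
        ad.get("ad_delivery_start_time"),
        f"{impressions.get('lower_bound')}-{impressions.get('upper_bound')} impressions",
    )
```

A scheduled run of a query like this, filtered to pages that have never advertised on your behalf, is one simple way to surface suspicious paid promotion of a public figure early.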