Fake news manipulators benefit from global crises and breaking news
Times of crisis and breaking news are when we have the weakest defences against fake news and other forms of manipulated information. Information consumers are more open to deceptive content when they are scared, or during fast-breaking events when the veracity of reported information is not yet clear, a study from Microsoft shows.
Manipulators equipped with generative AI tools will be well-positioned to deceive audiences in the run-up to elections, the company says in a report on threats to elections. Russia, Iran, and China are expected to increase the pace of their influence and interference activity as the US presidential election in November approaches.
“Foreign malign influence in the US presidential election got off to a slower start than in 2016 and 2020 due to the less contested primary season. Russian efforts are focused on undermining US support for Ukraine while China seeks to exploit societal polarization and diminish faith in US democratic systems.”
The report says fears that sophisticated AI deepfake videos would succeed in manipulating voters have not yet been borne out.
“Simpler ‘shallow’ AI-enhanced and AI audio fake content will likely have more success.”
The report says Russian influence operations have increased in the past two months. Microsoft has tracked at least 70 Russian actors engaged in Ukraine-focused disinformation, working across traditional and social media through a mix of covert and overt campaigns.
“China is using a multi-tiered approach in its election-focused activity. It capitalizes on existing socio-political divides and aligns its attacks with partisan interests to encourage organic circulation.”
“China’s increasing use of AI in election-related influence campaigns is where it diverges from Russia. While Russia’s use of AI continues to evolve in impact, People’s Republic of China and Chinese Communist Party-linked actors leverage generative AI technologies to effectively create and enhance images, memes, and videos.”
The report estimates that Iran will likely launch acute cyber-enabled influence operations closer to US Election Day.
“Tehran’s election interference strategy adopts a distinct approach: combining cyber and influence operations for greater impact. The ongoing conflict in the Middle East may mean Iran evolves its planned goals and efforts directed at the US.”
Microsoft says the use of high-production synthetic deepfake videos of world leaders and candidates has so far not caused mass deception or broad-based confusion.
“In fact, we have seen that audiences are more likely to gravitate towards and share simple digital forgeries, which have been used by influence actors over the past decade. For example, false news stories with real news agency logos embossed on them.”
“Audiences do fall for generative AI content on occasion, though the scenarios that succeed have considerable nuance.”
Highlights from the report:
- AI-enhanced content is more influential than fully AI-generated content
- AI audio is more impactful than AI video
- Fake content purporting to come from a private setting, such as a phone call, is more effective than fake content from a public setting, such as a deepfake video of a world leader
- Disinformation messaging has more cut-through during times of crisis and breaking news
- Impersonations of lesser-known people work better than impersonations of very well-known people such as world leaders