Facebook has removed more than 20 million pieces of content and more than 3,000 accounts, pages, and groups from Facebook and Instagram for repeatedly violating its rules against spreading COVID-19 and vaccine misinformation between the start of the pandemic and the end of June, the company said in its Q2 content-enforcement report.
Warning labels were added to 190 million pieces of COVID-related content on Facebook that fact-checking partners rated false or misleading, the company said in the report, responding to criticism from President Joe Biden and others that it was not doing enough to stop the spread of misinformation.
“Prevalence of hate speech has decreased for three quarters in a row since we first began reporting it. This is due to improvements in proactively detecting hate speech and ranking changes in News Feed,” the company wrote in the report.
“Hate speech content removal has increased over 15X on Facebook and Instagram since we first began reporting it.”
The company said it removed 31.5 million pieces of hate speech content from Facebook in Q2, up from 25.2 million in Q1, and 9.8 million pieces from Instagram, up from 6.3 million in Q1.
“This is due to continued improvement in our proactive detection,” the report said. “Our investments in AI enable us to detect more kinds of hate speech violations on Facebook and Instagram.”