A third of users of video-sharing platforms (VSPs) say they have witnessed or experienced hateful content; a quarter say they have been exposed to violent or disturbing material; and one in five have seen videos or content that encouraged racism, according to research by Ofcom, the UK media regulator. The regulator has published guidance for platforms that are now required by law to protect people from harmful content.
Under the new law, VSPs established in the UK must take measures to protect under-18s from potentially harmful video content, and to protect all users from videos likely to incite violence or hatred, as well as from certain types of criminal content.
Ofcom said appropriate measures may include:
- terms and conditions for sharing videos;
- reporting and flagging functions to mark disturbing content;
- viewer rating systems;
- age verification;
- parental control functions;
- complaints procedures;
- media literacy tools and information.
“Our guidance is aimed at helping these platforms to understand their new obligations and judge how best to protect their users from this kind of harmful material,” Ofcom said.
If Ofcom finds that a VSP has broken the rules, it can fine the platform up to 5% of its qualifying revenue or £250,000, whichever is greater.
The legislation states that users should be protected from videos that:
- might impair the physical, mental or moral development of under-18s;
- are likely to incite violence or hatred based on particular grounds such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation; and/or
- directly or indirectly encourage acts of terrorism; show or involve conduct that amounts to child sexual abuse; or show or involve conduct that incites racism or xenophobia.