Harmful content online goes unreported


Young people are active online, but few report harmful content. Two thirds of teens and young adults have recently encountered at least one potentially harmful piece of content online, yet only around one in six go on to report it, according to Ofcom, the UK’s media and communications regulator. Ofcom is now joining forces with an influencer and a media psychologist to change that.

Parliament is on course to approve the Government’s Online Safety Bill, and it will be Ofcom’s responsibility to enforce the new laws. The regulator has already begun overseeing video-sharing platforms established in the UK, such as TikTok, Snapchat and Twitch.

To encourage more young internet users to report potentially harmful content, Ofcom has announced that it is joining forces with social media influencer Lewis Leigh and behavioural psychologist Jo Hemmings to launch a new campaign. The social media campaign aims to reach young people on the sites and apps they use regularly, highlighting the importance of reporting posts they may find harmful.

“Ofcom’s Online Experiences Tracker shows that most younger people aged between 13 and 24 (65%) believe the overall benefits of being online outweigh the risks. But around the same proportion – 67% – have encountered potentially harmful content”, Ofcom says.

“Younger people told us that the most common potential harms they came across online were: offensive or ‘bad’ language (28%); misinformation (23%); scams, fraud and phishing (22%); unwelcome friend or follow requests (21%) and trolling (17%).”

“A significant number of young people (14%) also encountered bullying, abusive behaviour and threats; violent content; and hateful, offensive or discriminatory content, targeted at a group or individual based on their specific characteristics.”

“Our research reveals a worrying gap between the 67% of young people who experience harm online and those who flag or report it to the services. Fewer than one in five young people (17%) take action to report potentially harmful content when they see it.”

Ofcom reports that the main reason younger users give for not reporting is that they didn’t see the need to do anything (29%), while one in five (21%) don’t think it will make a difference. Over one in ten (12%) say they don’t know what to do or whom to inform.

“User reporting is one important way to ensure more people are protected from harm online. For example, TikTok’s transparency report shows that of the 85.8 million pieces of content removed in Q4 2021, nearly 5% were removed as a result of users reporting or flagging content. In the same period, Instagram reported 43.8 million content removals, of which about 6.6% were removed as a result of users reporting or flagging content”, Ofcom says.

“With young people spending so much of their time online, the exposure to harmful content can unknowingly desensitise them to its hurtful impact. People react very differently when they see something harmful in real life – reporting it to the police or asking for help from a friend, parent or guardian – but often take very little action when they see the same thing in the virtual world,” says behavioural and media psychologist Jo Hemmings.

“What is clear from the research is that while a potential harm experienced just once may have little negative impact, when experienced time and time again, these experiences can cause significant damage. Worryingly, nearly a third of 13- to 17-year-olds didn’t report potentially harmful content because they didn’t consider it bad enough to do something about. This risks a potentially serious issue going unchallenged.”

Moonshot News is an independent European news website for all IT, Media and Advertising professionals, powered by women and with a focus on driving the narrative for diversity, inclusion and gender equality in the industry.

