
Teenagers exposed to 'horrific' content online - and this survey reveals the scale of the problem

Teenagers are routinely seeing inappropriate violent or sexual content, "doom-scrolling" and being contacted by strangers online, according to an exclusive survey for Sky News.

More than 1,000 young people aged 14 to 17 in Darlington schools told us what they see and experience online when looking at apps commonly used by teenagers.

Their answers raise troubling questions about whether government and tech companies are doing enough to protect children online amid a growing debate among parents and campaigners about how far to restrict children's access to smartphones and social media.

Of those surveyed, 40% spent at least six hours a day online - the equivalent of a school day. One in five said they spent upwards of eight hours a day on their phones.

Some of the findings in the under-16 group were striking, including that 75% had been contacted by strangers through social media and online gaming.

Over half (55%) of the Year 10 students, aged 14 to 15, had seen sexually explicit or violent content that was inappropriate for their age.

Concerningly, a large proportion of them (50%) said this always or usually came up on social media apps without them searching for it - suggesting it is driven by algorithms.

Doom-scrolling is the act of spending an excessive amount of time online consuming negative news or social media content, often without stopping.

The survey represents a snapshot of teenagers in one town in the UK, but resonates more widely.

The teenagers said they wanted their voices to be heard in the debate about online safety. While they did not favour a social media or smartphone ban, many wanted tougher controls on the content they see.

Asked whether social media companies should do more to protect under-16s from seeing explicit or harmful content, 50% were in favour and 14% against.

'It's quite horrific'

Sky News was invited to film a focus group of under-16s from different schools discussing the results at St Aidan's Academy in Darlington, hosted by Labour MP Lola McEvoy, whose office carried out the research.

Jacob Lea, who is 15, said among the things he had seen on social media were "gore, animal abuse, car crashes, everything related to death, torture".

He said: "It's quite horrific. A lot of the things that I've seen that I shouldn't have, have not been searched by me directly and have been shown to me without me wanting to.

"Most of this stuff pops up on social media, Instagram Reels, TikTok, sometimes on YouTube.

"It's like a roulette, you can go online and see entertainment, because there's always a risk of seeing racism, sexism and 18+ explicit content."

Matthew Adams, also 15, said he spends six to seven hours a day online, before school and late into the evening - and up to nine hours on weekends, gaming and messaging with friends.

"After school, the only time I take a break is when I'm eating or talking to someone. It can turn into addiction," he said.

He also said inappropriate content was unprompted. "I've seen a varied spectrum of things - sexually explicit content, graphic videos, gory photos and just upsetting images," he added.

"Mostly with the violence it's on Instagram Reels, with sexually explicit content it's more Snapchat and TikTok."

Read more:
Boy's mental health 'severely impacted' after pornography shared
Instagram unveils new feature to let users reset algorithms

'It can be sexual stuff'

Summer Batley, 14, said: "I see unwanted content about getting into a summer body and how you should starve yourself.

"It just pops up randomly without searching anything. I reported it, but it keeps coming up."

Many of the group had been contacted by strangers. Summer said: "I have, and a lot of my friends have as well. They can just randomly come up on Snapchat and TikTok and you don't know who they are, and it's quite worrying, they're probably like 40 years old."

Olivia Bedford, 15, said: "I've been added to a group chat with hundreds of people sending images like dead bodies, gore.

"I try to leave but there's so many people, I don't know who has added me, and I keep getting re-added. It can be sexual stuff or violent stuff. It can be quite triggering for people to see stuff like that quite damaging to your mental health."

Asked what she disliked online, Briony Heljula, 14, said: "Involvement with older people, people who aren't my friends and that I don't know. It's very humiliating when other people are commenting and being rude; and it's quite horrible."

Fewer than a third of those surveyed (31%) said they were always asked their age before viewing inappropriate content.

When asked about their age on social media, around a third said they usually pretended to be older. But in the focus group, teenagers were clear that they had seen upsetting and disturbing content when they used their real age.

Parents 'can't tackle this alone'

Ms McEvoy described the findings as "shocking" and said "the safety of our children online is one of the defining issues of our time".

"Parents and teachers are doing their best, but they can't tackle this alone," she added.

"We need enforceable age verification, better content controls, and stronger legislation to ensure children can go online without fear."

The Online Safety Act, which was passed by MPs in October 2023, is intended to protect users - particularly children - from illegal and harmful content.

It is being implemented this year, with tough fines for platforms which do not prevent children from accessing harmful and age-inappropriate content coming in this summer.

A private members' bill debated by MPs earlier this month proposed that the internet "age of consent" for giving data to social media companies be raised from 13 to 16, but it was watered down after the government made clear it would not support the move.

Snapchat, Instagram and TikTok were contacted for comment, but did not provide an on-the-record statement on the comments by the teenagers.

The companies insist they take issues of safety and age-appropriate content seriously.

Instagram is rolling out Teen Accounts, which it says will limit who can contact teenagers and the content they can see.

Snapchat and TikTok say on their websites that accounts for under-16s are set to private.


(c) Sky News 2025: Teenagers exposed to 'horrific' content online - and this survey reveals the scale of the problem
