‘Taking innocence from children’: Two-thirds of youngsters exposed to harmful online content – but only 16% report it


The UK’s communications regulator is calling on young people to help protect each other by reporting harmful content online.

Ofcom says two-thirds of youngsters aged between 13 and 17 see harmful content online but only 16% report it.

As it waits for the government’s Online Safety Bill to go through parliament, it has called on young people to help tackle online harms themselves.

At Soap Box Youth Centre in Islington, north London, Sky News met a group of young people all involved in digital creation.

Within seconds of going on their social media platforms, harmful content appears: a racist post about ‘white power’; critical messages about body image; fake promises; disinformation; and a video of police brutally arresting a 16-year-old.

“There’s always a lot of violence on social media,” says Braulio Chimbembe. “I saw one video where someone was getting shot which takes the innocence from children.”

“That stuff shouldn’t be on social media and shouldn’t be so easily accessible.”

But Braulio says reporting it isn’t always effective: “I think the person has to be reported twice before anything actually gets done.”

Marcus Austin agrees: “I’ve seen footballers completely insulted on social media and nothing really happens.”

His friend Joshua Lyken adds: “You think it’s normal after a while so you’re not going to really report it or tell someone about it, it just becomes normal to you.”

Ofcom says the most common online potential harms encountered by young users include generally offensive or bad language (28%), misinformation (22%), unwelcome friend or follow requests (21%), trolling (17%), bullying, abusive behaviour and threats (14%), content depicting violence (14%), and hateful, offensive or discriminatory content that targets a group based on specific characteristics (14%).

It says more than three quarters (77%) of those who were bothered or offended enough took some form of action, the most common being unfollowing, unfriending or blocking the poster or perpetrator, and clicking the report or flag button or marking the content as junk. However, 51% said nothing had happened after they reported the content, while a fifth (21%) said it had been removed.

Image: The Soap Box Youth Centre in Islington

“If there’s a fake page and someone’s not happy about it and they ask me to report it then yeah, I report it,” says Michelle Akpata.

“Someone can always create another account and minutes later message you again,” says Coltrane Chead, warning that reporting incidents often does not bring an end to online threats.

Ofcom says not all potentially harmful online content or behaviour has the same degree of negative impact. Some harms may have a cumulative negative effect, while some people may become desensitised to them after repeated exposure.

The findings come as the Online Safety Bill continues to make its way through parliament. Ofcom will enforce the new laws and has already started regulating video-sharing platforms established in the UK – such as TikTok, Snapchat and Twitch.

“Platforms already have systems and processes in place to keep users safe,” explains Ofcom’s online safety principal Anna-Sophie Harling.

“User flagging tools are an option that is already available. We know that platforms use proactive content moderation to find harmful content and remove it, but we don’t know enough about how well it’s working.

“There are going to be really important changes coming in. Platforms will have to do things like risk assessments on products, they’ll have to produce transparency reports and publish data.”

Image: Molly Russell was exposed to harmful digital content

In 2017, Ian Russell’s teenage daughter Molly took her own life after being exposed to harmful digital content. He wants to see urgent improvements in the way digital content is regulated.

“It’s really important that something changes because the period of self-regulation that tech platforms have enjoyed obviously hasn’t worked,” Mr Russell told Sky News.

“Nearly five years ago we lost our youngest daughter Molly who seemed a very normal, lovely, adorable young person with such a bright future in front of her.

“Somehow she was persuaded to think that she was worthless and life ahead of her was worthless. I could never understand how someone with so much potential thought that. And it wasn’t long before we looked at what she’d been looking at on her social media accounts and what we saw shocked us because she’d been exposed to harmful digital content that I have no doubt helped kill my daughter.”


