Social Justice

Campaigners vow to ‘keep fighting’ for young people as Online Safety Act ‘fails to go far enough’

New Online Safety Act measures have been criticised for failing to close ‘loopholes’ that allow harmful content to stay online. But regulator Ofcom has pledged that the new rules are ‘just the beginning’

Campaigners have claimed Online Safety Act (OSA) measures “fail to go far enough” to protect vulnerable young people, with not enough done to “tackle self-harm and suicide material online”. 

Ofcom, the regulator enforcing the UK’s internet safety law, published its Illegal Harms Codes of Practice on Monday (16 December), meaning online platforms must assess the risk of users being exposed to illegal material by March 2025 or face fines. 

The set of rules focuses on “illegal harms”, including terrorism, fraud, hate offences, child sexual abuse, and encouraging suicide. The codes also clarify requirements to remove intimate image abuse content, and certain platforms must use hash-matching technology to detect child sexual abuse material on their sites. 
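In practice, hash-matching compares a compact digital fingerprint of each uploaded file against a database of fingerprints of known illegal images, so material can be flagged without anyone having to view it. The sketch below is illustrative only, assuming a simple exact SHA-256 comparison against a hypothetical hash list; deployed systems such as Microsoft’s PhotoDNA use perceptual hashes, which also catch re-encoded or slightly altered copies.

```python
import hashlib

# Hypothetical fingerprint database of known illegal images, of the kind
# supplied to platforms by bodies such as the Internet Watch Foundation.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb924",  # placeholder entry, not real data
}

def fingerprint(file_bytes: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file."""
    return hashlib.sha256(file_bytes).hexdigest()

def is_known_illegal(file_bytes: bytes) -> bool:
    """Flag an upload if its fingerprint matches the database."""
    return fingerprint(file_bytes) in KNOWN_HASHES
```

An exact hash only matches byte-identical files, which is why real deployments favour perceptual hashing; the underlying principle of comparing fingerprints against a shared database is the same.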

Ofcom stated the codes of practice are “just the beginning”: it aims to publish draft guidance on protecting women and girls in February 2025, followed in April 2025 by additional protections for children from harmful content such as material promoting suicide, self-harm and eating disorders.

Under the OSA, Ofcom can now fine companies up to £18 million or 10% of their qualifying global turnover, whichever is greater, and in very serious cases can apply for sites to be blocked in the UK.

Dame Melanie Dawes, chief executive of Ofcom, explained that the authority will be “watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.”


“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them,” she added. 

“For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.”

Technology secretary Peter Kyle added that the publication of the first set of codes under the OSA is a “significant step” in making online spaces safer. 

“This government is determined to build a safer online world where people can access its immense benefits and opportunities without being exposed to a lawless environment of harmful content,” he said. 

“Ofcom’s illegal content codes are a material step-change in online safety meaning that, from March, platforms will have to proactively take down terrorist material, child and intimate image abuse, and a host of other illegal content, bridging the gap between the laws which protect us in the offline and the online world.

“If platforms fail to step up the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites.”


Campaigners have claimed the new rules do not go far enough, however, and that the OSA contains “loopholes” for platforms to keep harmful content online. 

“Despite meeting with bereaved families including my own and hearing our stories and the recommendations we put to them, Ofcom’s code fails to go far enough to tackle self-harm and suicide material online,” Adele Walton, online safety campaigner and author of Logging Off: The Human Cost of Our Digital World, told the Big Issue.

“There is not a single targeted measure for platforms to tackle suicide and self-harm material that meets the criminal threshold. These rules shield platforms from real accountability.”

Walton added that the legislation is “failing to keep young people safe online” and that online platforms which “profit from young people’s vulnerabilities” aren’t being sufficiently held to account.

“If a shop opened on the high street advertising and encouraging suicide it would be shut down in seconds – online, nothing will happen. If platforms find harm that isn’t in the codes, they don’t have to do anything, meaning that this lack of targeted measures leaves huge loopholes in the act,” she said. 

“This isn’t the Online Safety Act parents want, it isn’t the Online Safety Act young people want – it’s worth asking who this decision serves. These rules are a real disappointment for online safety campaigners and bereaved families today, but we will keep fighting.”


The Molly Rose Foundation, which focuses on suicide prevention for young people, added that it was “astonished and disappointed” by the lack of regulation on suicide and self-harm content online. 

“We are astonished and disappointed there is not one single targeted measure for social media platforms to tackle suicide and self-harm material that meets the criminal threshold in today’s codes despite this being a priority harm,” the charity said in a statement on Twitter/X.

Imran Ahmed, CEO of the Center for Countering Digital Hate (CCDH), welcomed the new regulations but described them as only “step one” in strengthening online safety, saying he hopes to see more over the coming months. 

“This is part one, we hope, of a series of significant and bold steps being taken by Ofcom in the coming months,” Ahmed told the Big Issue.

He said Ofcom’s fines for companies that do not comply with the regulations were a positive, arguing that because hate and disinformation online can be profitable for social media companies, “we have to create costs for the production and distribution of hate and disinformation”.

“This is a significant step in creating the accountability and the responsibility, the economic responsibility, that is required to persuade platforms to work to reduce hate and disinformation, not as they are currently doing to increase it and monetise it,” he said.


Ahmed said CCDH had published its own reports illustrating how vital protections are for young people online, calling the need to shield them from harmful content a “crisis issue”.

“We just put out a report last week looking at YouTube, where we modelled being a 13-year-old girl on that platform who looked at one video about disordered eating,” he said.

“YouTube had promised parents that they will never, ever recommend self-harm or eating disorder content. We found that 700 of the 1,000 recommendations that YouTube had up next in the 100 different iterations of the experiment that we tried… recommended harmful content after they watched a single video,” he said, adding that future updates to the OSA should extend regulation to “algorithms, AI and advertising”.

“We need to start thinking about Online Safety Act 2.0… that will have to address a series of issues which weren’t fully covered by Online Safety Act 1.0,” he said. “There are really big, substantial issues which aren’t covered right now.”

Ahmed added that if the OSA is to be effective, it should “create an online experience which is entertaining, sometimes edifying, and always enjoyable for young people, in which they’re not exposed to as much threat as they are right now”.

Do you have a story to tell or opinions to share about this? Get in touch and tell us more.
