Opinion

Why the internet needs to be safer for women

Safer Internet Day invites us to reflect on what online safety should look like. But does the current regulatory framework work for women?

As Safer Internet Day encourages reflection on how digital spaces can be made safer, recent controversies surrounding generative AI tools have reignited urgent questions about how and when the UK takes technology-facilitated violence against women and girls (TFVAWG) seriously.

While one high-profile episode involving an AI chatbot hosted on a major platform drew intense media and political attention earlier this year, it did not reveal a new or isolated problem. Rather, it exposed a familiar pattern in which harms that disproportionately affect women and girls are allowed to persist until public outrage and headlines make regulatory inaction untenable.

Responsibility for overseeing technology companies’ compliance with their legal duties under the Online Safety Act 2023 lies with Ofcom. In recent months, Ofcom has opened and progressed multiple investigations into platforms’ compliance with the act, including in relation to the use of AI tools and the hosting of illegal and harmful content.

These investigations followed days of intense public and political debate, including correspondence from parliamentary committee chairs to the secretary of state for science, innovation and technology and to Ofcom’s chief executive, raising concerns about regulatory gaps and the need for a more robust, system-wide response. The prime minister’s public condemnation further underscored the political discomfort generated once the issue reached a tipping point.


A long-signalled problem, not a sudden crisis

The suggestion that these harms emerged suddenly does not withstand scrutiny. The misuse of generative AI tools to create abusive, sexualised and violent imagery of women and girls has been documented for years. Framing recent controversies as unexpected risks undermines the work of victims, campaigners, journalists, researchers and civil society organisations who have repeatedly warned about the foreseeable misuse of these technologies.


The gendered nature of these harms is clear. The Internet Watch Foundation’s 2024 report found that over 98% of the AI-generated child sexual abuse material it identified involved girls. This was not framed as a speculative future risk, but as an existing and escalating harm.

Seen in this context, platform-specific scandals should not be treated as anomalies. They are part of a wider and well-established pattern of technological misuse that regulators and lawmakers have been warned about repeatedly.

Why does it still take a tipping point?

This leads to a familiar and troubling question: why does it take a high-profile moment to catalyse change for women and girls?

Online abuse, non-consensual intimate images and other forms of TFVAWG have been the subject of sustained critique for more than a decade. Gaps in the law and in regulatory guidance have been openly acknowledged. Yet reform has remained slow and incremental.

The act itself was years in the making, and its implementation continues to be phased. While Ofcom’s guidance on violence against women and girls has improved, concerns remain about whether it is sufficiently agile to respond to rapidly evolving technologies such as generative AI, particularly where enforcement relies heavily on platform cooperation. Importantly, this guidance remains voluntary despite long-standing calls from women’s organisations and academics for a mandatory code of practice.

What recent controversies illustrate is not a lack of evidence or awareness, but a lack of urgency. Routine abuse experienced by women online has rarely been sufficient to push change over the line. Instead, decisive action appears most likely when harm becomes visible, politically uncomfortable and impossible to ignore.


Regulation after harm

Ofcom’s current investigations are necessary and welcome. However, their timing raises legitimate questions about regulatory posture. A system that intervenes primarily after crises emerge risks entrenching a culture of harm tolerance, where abuse is effectively permitted until it attracts sufficient attention.

If the act is to fulfil its promise, Ofcom must be willing to act preventatively, not merely responsively. This means using its existing powers to anticipate and mitigate risks posed by emerging technologies, rather than waiting for harms to be incontrovertibly demonstrated in public.

Recent law reform reflects similar challenges. The forthcoming commencement of the new offence introduced in the Data Use and Access Act 2025 marks an important step forward, recognising that AI-generated abuse can be as harmful as material produced by other means. Yet the delay in bringing this provision into force, despite the new VAWG strategy and the government’s commitment to halving violence against women and girls, raises further questions about urgency.

Even once in force, gaps remain around enforcement pathways, criminal justice readiness and the burdens placed on victims seeking redress. As with previous reforms, the law risks arriving after harm has already become entrenched.

Responsibility, prevention and what comes next

It is essential not to obscure human agency. AI systems do not independently decide to generate abusive imagery. They respond to prompts, which are overwhelmingly input by men. Digital spaces do not create misogyny, but they provide new avenues through which it can be expressed, normalised and amplified. Platform responses that minimise or monetise these harms, or frame regulatory scrutiny as an attack on free expression, further undermine claims of commitment to safety.

The government’s recent strategy to tackle violence against women and girls acknowledges the links between online and offline harm, but its treatment of technology-facilitated abuse remains limited. As Safer Internet Day invites reflection on what online safety should mean in practice, the challenge is whether the current regulatory framework and its regulator can respond with the urgency the scale of harm demands.


A safer internet should be grounded in prevention, not reaction. Yet if online safety regulation continues to operate primarily in response to scandal, harm to women and girls will remain an accepted cost of technological progress. A genuinely safer internet requires regulators to act before abuse becomes visible, widespread and politically uncomfortable, not only once it can no longer be ignored.

A practical way forward is the creation of an independent observatory on TFVAWG. This body would provide specialist, centralised oversight of online gender-based harms, offering victims clearer reporting routes, generating real-time evidence for regulators and supporting proactive enforcement across platforms. By combining expert insight with operational authority, an observatory could ensure that technology-facilitated abuse is addressed systematically rather than only after scandals erupt.

As Ofcom’s investigations continue, the question is not simply what action will be taken in individual cases. It is whether the UK can move beyond a cycle of reactive regulation, where women and girls must wait for the next moment of outrage before their harms are taken seriously.


Leyla Buran (Research Fellow in Policy and Practice) and Professor Olga Jurasz (Director) at the Centre for Protecting Women Online

