The Digital Crossroads: A Defining Moment in Human History

Protecting Digital Lives: The Online Safety Act & Ethical Tech Regulation

Our commitment to online safety goes beyond compliance—it’s about fostering a digital environment where individuals feel secure. With the Online Safety Act 2025, we champion responsible technology use, advocating for transparency, accountability, and the protection of vulnerable users. By engaging with policymakers and digital platforms, we ensure that online spaces prioritize safety over profit-driven algorithms. Through education and regulation, we’re shaping a digital landscape that safeguards freedom of expression while eliminating harmful content.

Online Safety Act 2025: Enforcing Accountability in the Digital Age

In a world where online harms are increasingly sophisticated, the Online Safety Act 2025 is a pivotal step towards greater accountability for tech companies. We push for stringent enforcement mechanisms, ensuring platforms adhere to their legal obligations to remove illegal content, combat misinformation, and protect children from online exploitation. By supporting rigorous compliance measures and public awareness initiatives, we empower individuals to navigate digital spaces confidently and securely.

Defending Digital Rights: How the Online Safety Act 2025 Protects Users

Online safety is a fundamental right, not a privilege. The Online Safety Act 2025 introduces groundbreaking measures to shield users from digital threats, from cyberflashing to algorithmic manipulation. We advocate for user-centric policies that uphold privacy, mitigate online harms, and enhance digital literacy. By collaborating with industry leaders and regulatory bodies, we drive the conversation on ethical online governance, ensuring that technology remains a tool for empowerment—not exploitation.

Introduction

The Online Safety Act 2025 is one of the most significant legal interventions in the digital space in recent years. Designed to protect both children and adults online, it introduces sweeping regulations that hold social media companies, search services, and other online platforms accountable for the safety of their users. This legislation is set to reshape the online landscape, impacting businesses, content creators, and everyday users alike.

But what exactly does the Online Safety Act 2025 do, and how will it be enforced? More importantly, how does it affect you? Let’s explore the details.

What Does the Online Safety Act 2025 Do?

At its core, the Online Safety Act 2025 places new responsibilities on online platforms to protect users from harmful content, prevent illegal activities, and provide transparency about the types of content they allow. Platforms are now required to:

  • Reduce risks associated with illegal activity on their services.
  • Remove illegal content swiftly.
  • Implement safeguards to prevent children from accessing harmful or inappropriate material.
  • Provide tools that allow adults to control the content they see online.
  • Address online threats such as cyberflashing, intimate image abuse, and self-harm promotion.
These regulations aim to create a safer digital environment while balancing free speech and the rights of internet users.

Who Does the Online Safety Act 2025 Apply To?

The Online Safety Act 2025 is broad in scope, covering:

  • Social media platforms.
  • Search engines.
  • Online forums and discussion sites.
  • File-sharing services.
  • Online messaging platforms.
  • Any service that allows user-generated content.
Importantly, this legislation also applies to companies outside the UK if they provide services accessible to UK users. If a platform has a significant number of UK users, targets UK users, or presents a material risk to UK citizens, it falls under the jurisdiction of the Online Safety Act 2025.

How the Online Safety Act 2025 is Being Implemented

The Online Safety Act 2025 became law on 26 October 2023, and implementation has been phased to allow companies time to comply. Ofcom, the UK’s independent communications regulator, is responsible for overseeing enforcement and ensuring compliance.

Key Implementation Phases:

  • Illegal content duties – As of 17 March 2025, platforms must assess the risks of illegal content appearing on their services and take steps to mitigate these risks.
  • Age verification for online pornography – From January 2025, websites that publish pornographic content must implement robust age verification measures.
  • Child safety – Platforms must assess their risk to children and implement safeguards by April 2025.
  • Categorised services – Large platforms with significant influence (Category 1, 2A, and 2B services) will face additional transparency and accountability requirements, with final codes expected by early 2026.

The New Criminal Offences Introduced by the Online Safety Act 2025

The Online Safety Act 2025 has criminalised several harmful online behaviours, including:

  • Encouraging or assisting serious self-harm.
  • Cyberflashing (sending unsolicited explicit images).
  • Spreading false information intended to cause harm.
  • Threatening communications (such as doxxing or targeted harassment).
  • Intimate image abuse.
  • Epilepsy trolling (sending flashing images to trigger seizures).
Convictions have already been secured under the cyberflashing and threatening communications offences, underlining how seriously these new regulations are being enforced.

Tackling Harmful Content Under the Online Safety Act 2025

The Online Safety Act 2025 categorises online harms into two main types:

  • Illegal Content – Platforms must remove content linked to serious offences such as child sexual abuse, fraud, terrorism, and sexual exploitation.
  • Harmful Content to Children – Platforms must ensure children do not encounter content promoting self-harm, eating disorders, suicide, or other dangerous activities.

How the Online Safety Act 2025 Tackles Harmful Algorithms

A significant feature of the Online Safety Act 2025 is its focus on harmful algorithms. Platforms must assess whether their recommendation systems increase exposure to harmful or illegal content, particularly for children. Ofcom will require platforms to:

  • Conduct risk assessments of their algorithms.
  • Implement changes to prevent harm.
  • Publish transparency reports detailing how their algorithms impact users.
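
The requirements above can be pictured with a minimal sketch. This is purely illustrative: the field names, the 0.1 demotion factor, and the idea of a single `harm_flagged` attribute are invented for this example and are not taken from the Act, Ofcom's codes, or any real platform's recommendation system.

```python
# Hypothetical sketch: a recommendation pipeline that suppresses flagged
# content for child accounts and demotes it for adults. All names and
# scoring rules are invented for illustration.

def rank_feed(items, user_is_child):
    """Return items sorted by score; flagged content is never shown to
    children and is heavily demoted for adults."""
    visible = []
    for item in items:
        if item["harm_flagged"]:
            if user_is_child:
                continue  # excluded entirely from a child's feed
            item = {**item, "score": item["score"] * 0.1}  # demoted for adults
        visible.append(item)
    return sorted(visible, key=lambda i: i["score"], reverse=True)

feed = [
    {"id": "a", "score": 0.9, "harm_flagged": True},
    {"id": "b", "score": 0.5, "harm_flagged": False},
    {"id": "c", "score": 0.7, "harm_flagged": False},
]

print([i["id"] for i in rank_feed(feed, user_is_child=True)])   # ['c', 'b']
print([i["id"] for i in rank_feed(feed, user_is_child=False)])  # ['c', 'b', 'a']
```

The point of the sketch is the asymmetry the Act demands: the same content pool produces different feeds depending on whether the account belongs to a child.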

Increased Controls for Adults Under the Online Safety Act 2025

While protecting children is a key focus, the Online Safety Act 2025 also empowers adults by offering more control over their online experience. Large platforms must provide tools that allow users to:

  • Filter out unwanted interactions.
  • Block messages from non-verified accounts.
  • Reduce the likelihood of encountering harmful but legal content (e.g., content promoting self-harm).
These measures aim to combat anonymous online abuse while maintaining user choice.
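
As a rough sketch of how such a user control might behave, the snippet below filters an inbox so that messages from non-verified senders are held back when the user opts in. The field names and the opt-in flag are hypothetical, invented for illustration only.

```python
# Hypothetical sketch of the non-verified-account filter described above.
# Message structure and setting names are invented for illustration.

def filter_inbox(messages, block_unverified):
    """Split messages into (shown, hidden) based on the user's setting."""
    shown, hidden = [], []
    for msg in messages:
        if block_unverified and not msg["sender_verified"]:
            hidden.append(msg)  # held back, not deleted: user choice is preserved
        else:
            shown.append(msg)
    return shown, hidden

inbox = [
    {"from": "friend", "sender_verified": True},
    {"from": "anon123", "sender_verified": False},
]

shown, hidden = filter_inbox(inbox, block_unverified=True)
print(len(shown), len(hidden))  # 1 1
```

Note the design choice implied by the Act: the filter is opt-in and reversible, so the control stays with the adult user rather than the platform.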

Enforcing the Online Safety Act 2025: What Happens if Companies Don't Comply?

Ofcom has been granted significant enforcement powers under the Online Safety Act 2025, including the ability to:

  • Fine companies up to £18 million or 10% of their global revenue (whichever is higher).
  • Hold senior executives personally liable for non-compliance.
  • Order payment providers and advertisers to cut ties with non-compliant platforms.
  • Block access to platforms that fail to meet their obligations.
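
The maximum fine is the greater of a fixed £18 million and 10% of global revenue, which makes the penalty scale with company size. A one-line sketch of that arithmetic:

```python
# Maximum fine under the Act: the higher of £18m or 10% of global revenue.

def max_penalty(global_revenue_gbp):
    """Return the statutory maximum fine for a given global annual revenue."""
    return max(18_000_000, 0.10 * global_revenue_gbp)

print(max_penalty(50_000_000))     # 18000000 — small firm: the fixed floor applies
print(max_penalty(1_000_000_000))  # 100000000.0 — large firm: 10% of £1bn
```

For any company with global revenue above £180 million, the percentage term dominates, so the largest platforms face the largest exposure.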

Strengthening Online Safety: Ofcom’s Latest Enforcement Actions

As of 17 March 2025, online platforms must start putting in place measures to protect UK users from criminal activity. Ofcom has launched an enforcement programme to ensure compliance, with a focus on tackling child sexual abuse material (CSAM).

Tackling CSAM on File-Sharing Services

File-sharing and storage services are at high risk of hosting CSAM. To combat this, Ofcom has recommended that these services adopt automated moderation technologies such as perceptual hash-matching to detect and remove illegal content swiftly. Platforms failing to comply may face severe penalties, including fines of up to 10% of their global revenue or being blocked in the UK.

Working with Child Protection Experts

Ofcom has collaborated with organisations such as:

  • The Internet Watch Foundation (IWF)
  • The Canadian Centre for Child Protection (C3P)
  • The National Center for Missing & Exploited Children (NCMEC)
These partnerships help identify high-risk platforms and ensure robust safety measures are implemented.
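
To make "perceptual hash-matching" concrete, here is a minimal sketch of the idea. Real deployments use systems such as PhotoDNA or PDQ against hash lists supplied by bodies like the IWF; the 64-bit hash values and the distance threshold below are invented for illustration.

```python
# Illustrative sketch of perceptual hash-matching. Unlike a cryptographic
# hash, a perceptual hash of a slightly altered image (re-compressed,
# resized) differs from the original in only a few bits, so matching is
# done by bit distance rather than exact equality.

def hamming_distance(h1, h2):
    """Number of differing bits between two hash values."""
    return bin(h1 ^ h2).count("1")

def matches_blocklist(upload_hash, blocklist, threshold=6):
    """True if the upload is within `threshold` bits of any known hash."""
    return any(hamming_distance(upload_hash, h) <= threshold for h in blocklist)

known_hashes = {0xDEADBEEFCAFEF00D}  # invented example value

print(matches_blocklist(0xDEADBEEFCAFEF00D, known_hashes))  # True  (exact match)
print(matches_blocklist(0xDEADBEEFCAFEF00F, known_hashes))  # True  (1 bit differs)
print(matches_blocklist(0x0123456789ABCDEF, known_hashes))  # False (unrelated)
```

The tolerance for small bit differences is what lets this technique catch re-encoded copies of known illegal images that an exact hash comparison would miss.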

Combatting Misinformation and Disinformation Under the Online Safety Act 2025

While respecting freedom of expression, the Online Safety Act 2025 introduces specific measures to combat dangerous false information, particularly:

  • Illegal disinformation (such as state-sponsored propaganda or fraudulent schemes).
  • Misinformation harmful to children.
  • False information banned under a platform’s own terms of service.


Connect with Our CEO – James Vincent

As a leader in digital safety, AI governance, and online regulation, James Vincent is dedicated to shaping the future of internet security, content moderation, and responsible technology use.


Join the Digital Resistance Team

Be part of a movement shaping the future of digital freedom, ethical AI, and cybersecurity. At The Digital Resistance, we are building a team of forward-thinkers, innovators, and advocates committed to protecting online rights, educating the next generation, and holding technology accountable.


The Need for Stronger Online Protections

Over 70% of children and young people encounter harmful content online before adulthood. With the rise of AI-driven misinformation, data exploitation, cyber threats, and algorithmic manipulation, urgent action is required to protect the next generation and equip them with the skills to navigate the digital world safely.

The Fight for Safer Digital Spaces

Children are among the most vulnerable in today's unregulated online world. From AI-generated deepfakes to online scams, grooming, and data breaches, they face threats they are not yet equipped to recognize. The Online Safety Act 2025 mandates that platforms take proactive measures to reduce exposure to illegal and harmful content, enforce stricter age verification, and prioritize digital well-being.
