London, UK – The United Kingdom's Online Safety Act (OSA), which became law in October 2023, is progressively coming into full effect, imposing stringent new obligations on technology companies worldwide. Designed to enhance user safety, particularly for children, the legislation grants the UK's communications regulator, Ofcom, significant powers to enforce compliance, including substantial financial penalties and the ability to block non-compliant services within the UK. This regulatory push has sparked considerable debate, especially among large, often US-based, tech firms concerned about its implications for free speech and operational feasibility.
The Act requires platforms to proactively address illegal content, such as child sexual abuse material and terrorism, and to implement robust measures, including effective age verification, to protect children from harmful content such as pornography and material promoting self-harm. Ofcom is tasked with developing codes of practice and guidance, with duties regarding illegal content already enforceable and child safety provisions set to be fully in effect by summer 2025. The legislation's reach extends to any service with significant UK users or a target market in the UK, regardless of where the company is based.
Failure to comply can result in fines of up to £18 million or 10% of a company's qualifying worldwide revenue, whichever is greater; for a company with £1 billion in qualifying revenue, for example, the maximum fine would be £100 million. In severe cases, Ofcom can seek court orders to block access to services in the UK or to restrict their access to payment providers and advertisers. Criminal action against senior managers is also a possibility for non-compliance with information requests or child safety duties.
However, the Act has drawn criticism from some quarters, including prominent US tech executives, who argue it risks stifling free expression and imposes disproportionate burdens. Tech entrepreneur Preston Byrne characterized the standoff in a recent post:
"UK: You need to obey our censorship law. Americans: No. UK: We'll fine you! Americans: No you won't. UK: We'll... uh... block your website! Americans: Go ahead. North Korea looks good on you. UK: ..." This sentiment reflects concerns that the broad scope and severe penalties could lead to over-blocking of legitimate content.
BBC Verify has reported instances where content, including news about the wars in Ukraine and Gaza, has been restricted on platforms like X (formerly Twitter) and Reddit for users who have not completed age verification checks, raising questions about unintended consequences. Elon Musk, owner of X, has publicly criticized the Act, describing its purpose as "suppression of the people." The ongoing implementation will test the balance between online safety and maintaining open digital environments, with global tech companies navigating a complex regulatory landscape.