EU Digital Services Act: Platforms Face 6% Global Turnover Fines Amidst 'Censorship' Concerns


The European Union's Digital Services Act (DSA), a sweeping regulation governing online content, is facing significant criticism from some quarters, with commentators labeling it a "censorship law." Samo Burja, a prominent voice, recently stated in a tweet: "Shameful day for the European Union: Censorship law. This will be abused and I'd argue it is intended to be abused, designed for it since day one." The legislation, which became fully applicable to all in-scope services in February 2024, empowers the EU to levy substantial fines of up to 6% of a company's global annual turnover for non-compliance.
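To get a feel for the scale of that penalty cap, here is a minimal sketch of the arithmetic, assuming a hypothetical turnover figure rather than any company's actual revenue:

```python
def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Theoretical ceiling on a DSA fine: 6% of global annual turnover."""
    return 0.06 * global_annual_turnover_eur

# Hypothetical example: a platform with EUR 100 billion in global annual
# turnover would face a fine ceiling of EUR 6 billion.
print(f"Maximum fine: EUR {max_dsa_fine(100e9):,.0f}")
```

Actual fines are set case by case; the 6% figure is only the upper bound the regulation permits.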

The DSA imposes stringent content moderation obligations on online platforms, particularly targeting Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) with more than 45 million average monthly active users in the EU. These requirements include implementing robust systems for identifying and acting against illegal content, conducting annual assessments of systemic risks such as disinformation, and increasing transparency regarding content moderation decisions and targeted advertising. The overarching goal is to foster a safer digital environment and enhance platform accountability.
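As a rough illustration of the designation threshold (the user counts below are invented for the example, and formal designation is made by the European Commission, not by the number alone):

```python
VLOP_THRESHOLD = 45_000_000  # average monthly active users in the EU

def meets_vlop_threshold(monthly_active_eu_users: int) -> bool:
    """Check whether a service crosses the user threshold that can
    trigger VLOP/VLOSE designation under the DSA."""
    return monthly_active_eu_users > VLOP_THRESHOLD

# Hypothetical services and user counts, for illustration only
for name, users in [("ServiceA", 52_000_000), ("ServiceB", 12_000_000)]:
    status = "meets threshold" if meets_vlop_threshold(users) else "below threshold"
    print(f"{name}: {status}")
```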

Despite its stated aims, the DSA has drawn accusations that it compels platforms to over-moderate content, given the broad notion of "harmful content" and the threat of severe penalties. Critics argue that this framework could inadvertently suppress lawful speech, as platforms might adopt the EU's stricter speech standards globally to simplify compliance, potentially restricting free expression in other regions as well. The significant compliance costs and administrative burdens fall disproportionately on large U.S. technology companies, which operate the majority of services designated as VLOPs and VLOSEs.

The European Commission has already initiated proceedings against major platforms for alleged breaches of DSA obligations, including investigations into Meta and TikTok concerning transparency failures and inadequate tools for reporting illegal content. Furthermore, the voluntary EU Code of Practice on Disinformation, which encourages fact-checking, is slated for integration into the DSA framework as a co-regulatory instrument by July 1, 2025. However, platforms like X (formerly Twitter), Meta, and Google have recently challenged or altered their commitments to third-party fact-checking, highlighting ongoing tensions.

The implementation of the DSA continues to evolve: the European Commission has issued guidelines for VLOPs and VLOSEs on mitigating systemic risks to electoral processes and has expanded the regime's reach by designating additional services, such as WhatsApp, as VLOPs. The law mandates detailed transparency reports from service providers, requires clear statements of reasons for content moderation decisions (sketched below), and obliges platforms to operate effective internal complaint-handling systems for users. This regulatory landscape marks a significant shift in online content governance, carrying profound implications for global tech companies and the future of online discourse.
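For a sense of what a "statement of reasons" involves in practice, the sketch below models the core information Article 17 of the DSA requires platforms to provide when they restrict content. The field names are illustrative, not the official schema of the Commission's DSA Transparency Database:

```python
from dataclasses import dataclass

@dataclass
class StatementOfReasons:
    """Illustrative model of the information a DSA statement of
    reasons must convey; not an official schema."""
    restriction_type: str         # e.g. removal, demotion, account suspension
    facts_and_circumstances: str  # what triggered the decision
    automated_detection: bool     # whether automated means flagged the content
    automated_decision: bool      # whether the decision itself was automated
    ground: str                   # legal provision or terms-of-service clause relied on
    redress_information: str      # internal complaint, out-of-court body, or court

sor = StatementOfReasons(
    restriction_type="removal",
    facts_and_circumstances="User report flagged the post as illegal hate speech.",
    automated_detection=False,
    automated_decision=False,
    ground="National hate speech law / Terms of Service clause on hateful conduct",
    redress_information="Internal complaint system, then out-of-court dispute settlement.",
)
print(sor)
```

Structured records like this are what feed the transparency reporting the DSA demands, which is why the law's critics and defenders alike focus so heavily on how moderation decisions are documented.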