On September 2, 2025, Adam Townsend, a prominent commentator, publicly criticized X (formerly Twitter) for its alleged failure to enforce its own rules against platform manipulation. In a tweet, Townsend stated: "accounts like 👇 this are not only allowed to game the system, but by X’s neglect to enforce their own rules of platform manipulation, are encouraged." This statement underscores ongoing concerns regarding the platform's integrity and its ability to combat inauthentic activity.
X maintains official policies designed to prevent platform manipulation and spam, explicitly prohibiting activities intended to "artificially amplify or suppress information" or disrupt user experience. These rules address issues like coordinated inauthentic behavior, mass account registration, and the manipulation of content visibility. The company has previously reported suspending hundreds of millions of accounts for violating these rules.
However, recent research and regulatory actions suggest a significant gap between policy and enforcement. A study from the University of Notre Dame, published earlier this year, found that X, alongside Reddit and Mastodon, made it "trivial" to launch bots, indicating a lack of effective enforcement mechanisms. This research highlights how easily malicious actors can operate and manipulate discussions on the platform.
The concerns extend to content moderation, as evidenced by X's legal dispute with the media watchdog Media Matters. The lawsuit, initiated by X, arose after Media Matters reported that advertisements from major companies were appearing next to white nationalist content, raising questions about X's content filtering and brand safety measures. X acknowledged that the screenshots were authentic but claimed the ad placements had been "manufactured" by deliberately curating a feed to produce them.
Internationally, X is also under intense scrutiny from regulators, particularly in the European Union. The EU has opened formal proceedings under the Digital Services Act (DSA) examining X's content moderation practices and its handling of information manipulation. Penalties under the DSA can reach up to 6% of a company's global annual turnover, reflecting growing global concern over the platform's adherence to its stated safety and integrity standards. These collective criticisms from individual commentators, researchers, and regulatory bodies point to persistent challenges in X's efforts to maintain a healthy and authentic online environment.