A recent tweet from "Reddit Lies" highlighted a growing conflict among Redditors over whether children should be moderating adult content in various subreddits. The tweet stated: "Redditors are infighting about whether or not children should be moderating adult content on their subreddit." This public discourse underscores long-standing debates about platform moderation, user safety, and the volunteer nature of Reddit's community governance.
Reddit's moderation system relies heavily on volunteer moderators who enforce community-specific rules in addition to the platform's sitewide policies. While Reddit's user agreement generally requires users to be at least 13 years old, there is no explicit minimum age requirement for becoming a moderator. This policy gap has led to situations where minors, some as young as 13, have taken on moderation roles, potentially exposing them to a wide array of content, including explicit or disturbing material.
The issue is especially sensitive for "Not Safe For Work" (NSFW) subreddits, which are designated for adult content. Reddit has strict policies against child sexual abuse material (CSAM) and actively uses automated tools and human review to detect and remove such content, as detailed in its Transparency Report for January-June 2024. The broader category of adult content, however, can still be graphic or psychologically taxing for young moderators. A moderator of r/AccidentalRenaissance, for example, previously described being exposed to "the worst photos they could imagine, including child pornography" during their duties, highlighting the severe mental toll moderation can take.
The ongoing "infighting" reflects a fundamental tension between Reddit's decentralized, community-driven moderation model and the need to protect vulnerable users, including its young volunteers. As regulatory bodies such as the UK's Ofcom increasingly push for age verification to shield minors from inappropriate online material, Reddit faces mounting pressure to re-evaluate who can moderate content, especially in adult-oriented communities. The debate raises ethical questions about relying on potentially underage volunteers for such demanding and sensitive work.