Elon Musk's xAI has launched "Grokipedia," an AI-generated online encyclopedia, sparking a renewed debate about information bias, the role of artificial intelligence in knowledge creation, and Wikipedia's long-standing editorial practices. The platform, powered by xAI's Grok large language model, debuted on October 27, 2025, with approximately 885,000 articles, positioning itself as a "less biased" alternative to the established Wikipedia. This move follows years of criticism from figures like entrepreneur Brian Roemmele, who argues Wikipedia suppresses dissenting views and innovation.
In a recent social media post, Roemmele asserted that "Propaganda operates through narrative control, selective emphasis, and repetition of aligned viewpoints, as outlined in Ellul’s (1965) framework." He claimed Wikipedia's reliance on "reliable sources," often mainstream media, "systematically dismisses alternative evidence, such as user-generated data or emerging AI methodologies," and further contended that Wikipedia's "tone and editing norms quash innovation by framing novel approaches as inherently flawed, discouraging experimentation in knowledge production."
Grokipedia's launch comes after Elon Musk repeatedly accused Wikipedia of harboring a "non-trivial left-wing propaganda machine" and urged its defunding. However, early analysis of Grokipedia's content has revealed its own set of biases, frequently aligning with conservative viewpoints and promoting narratives favorable to Musk. For instance, its entry on Elon Musk is notably laudatory, and articles on politically sensitive topics often reflect a particular ideological slant, leading to accusations that it merely substitutes one bias for another.
Wikipedia's core content policies, including "Verifiability," "No original research," and "Neutral point of view," mandate that all information be attributable to published, reliable sources with a reputation for accuracy and independent fact-checking. Self-published sources, such as blogs and social media posts, are generally not considered reliable. This sourcing standard, while aimed at factual accuracy, is precisely what critics like Roemmele argue leads to the exclusion of "alternative evidence" and new methodologies.
Regarding AI, Wikipedia's community has developed guidelines for using large language models (LLMs), acknowledging both opportunities and challenges. While LLMs could aid in content creation and improvement, Wikipedia emphasizes "human oversight" for all AI-generated content to ensure accuracy, neutrality, and reliability. The Wikimedia Foundation, which oversees Wikipedia, has reiterated its commitment to human collaboration, stating, "Wikipedia’s knowledge is – and always will be – human," implying that AI-driven platforms like Grokipedia still rely heavily on human-curated data, including Wikipedia itself.