Sydney, Aug 13 (The Conversation) Last month, the Wikimedia Foundation, the non-profit organization managing Wikipedia, issued draft guidelines for researchers studying its neutrality. However, these guidelines reveal a significant lack of awareness of the Foundation's own influence and seem to curtail open inquiry. Researchers, including those from universities and non-profit organizations, are being told not only how to study Wikipedia’s neutrality but also what to focus on and how to interpret their findings. This could potentially stifle independent research into one of the most critical knowledge repositories globally.
As someone who has engaged in Wikipedia research for over 15 years and previously served on the Wikimedia Foundation’s Advisory Board, I am concerned about the implications of such guidelines. These could discourage independent scholarly work.
The guidelines have emerged during a period when Wikipedia is facing scrutiny. Prominent figures like tech billionaire Elon Musk have accused Wikipedia of biased representation against American conservatives. On X, formerly known as Twitter, Musk urged users to stop donating to what he called “Wokepedia.” Additionally, a conservative think tank in the US was recently exposed for planning to target Wikipedia volunteers over perceived antisemitic content.
Traditionally, the Wikimedia Foundation has refrained from influencing how research is conducted or how articles are authored, focusing instead on providing guidance around privacy and ethics, without intruding into editorial decisions made by its volunteer community. However, this is changing.
In March, the Foundation formed a working group to standardize the “neutral point of view” policy across Wikipedia’s 342 language versions. Now, it seeks direct involvement in research.
The guidelines instruct researchers on conducting neutrality research and interpreting results. They also specify what the Foundation considers open and closed research questions.
While universities already have guiding principles for research, these new Wikipedia guidelines matter because the Wikimedia Foundation wields significant control over who it collaborates with, funds, promotes, and grants data access to. As a result, it can subtly influence which research gets prioritized.
The guidelines are lacking in three main areas.
1. Singular View on Neutrality: They assume Wikipedia’s definition of neutrality is the sole valid interpretation. English Wikipedia’s rules suggest neutrality is attained when all significant viewpoints are covered proportionally via reliable sources. Yet, researchers like Nathaniel Tkacz argue this idea is neither perfect nor universal. The reliability and consensus of sources are often contested.
2. Treating Debates as Settled: The guidelines treat ongoing debates about neutrality as resolved. They present factors like language and article type as crucial to neutrality, while claiming Wikipedia becomes more neutral over time. However, neutrality can regress, particularly during political disputes or coordinated attacks, as seen in cases like Gamergate or nationalistic editing campaigns. The guidelines neglect influential factors such as politics, culture, and the state.
3. Limited Research Focus: The guidelines require researchers to share results with the Wikipedia community and to strengthen the platform through constructive criticism. This narrow framing constrains research freedom. In our wikihistories project, for instance, we prioritize educating the public about Australian bias while also supporting improvements to the site. Researchers should share findings openly, even when those findings are uncomfortable.
Neutrality Spotlight: Critics often target Wikipedia not for a lack of neutrality but out of disagreement with its content. The platform's influence is vast, extending to search engines and AI systems.
The Wikimedia Foundation might view critical research as a threat, yet such work is pivotal for maintaining Wikipedia’s integrity. Constructive research identifies where neutrality efforts fall short, without proposing defunding or targeting editors. It acknowledges better representation methods exist without discarding neutrality as a goal. Achieving neutrality requires ongoing effort, transparency, self-awareness, and room for independent critique.
(Only the headline of this report may have been reworked by Editorji; the rest of the content is auto-generated from a syndicated feed.)