New from my team at Yale’s Digital Ethics Center: Counter-Misinformation Dynamics: The Case of Wikipedia Editing Communities during the 2024 US Presidential Elections

Wikipedia is a vital source of information, heavily trafficked and relied upon by individuals and large language models alike. However, its open editing nature makes it susceptible to misinformation, especially during politically sensitive times. Our study delved into these dynamics to understand how misinformation manifests and how effective Wikipedia’s mechanisms are in combating it.

Key Findings:

1. Editing Patterns Correlated with Political Events:

During significant political events, such as elections and legal proceedings, there was a noticeable spike in Wikipedia edits on pages related to politicians. This trend reflects heightened user engagement during these periods, both to contribute valuable information and, regrettably, to introduce bias or misinformation.
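As a rough illustration of how such spikes can be surfaced, here is a minimal sketch (not our study's actual pipeline) that pulls revision timestamps for a politician's article from the public MediaWiki Action API and counts edits per day. The article title, date window, and "3x the average" spike threshold are illustrative assumptions, not parameters from the paper.

```python
# Sketch: count daily edits to a Wikipedia article via the MediaWiki Action API
# and flag days whose edit count far exceeds the period average.
from collections import Counter

import requests

API_URL = "https://en.wikipedia.org/w/api.php"


def daily_edit_counts(title: str, start: str, end: str) -> Counter:
    """Count revisions per UTC day for `title` between ISO timestamps start/end."""
    counts: Counter = Counter()
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp",
        "rvlimit": "max",
        "rvstart": end,   # the API enumerates newest-to-oldest by default
        "rvend": start,
        "format": "json",
        "formatversion": "2",
    }
    while True:
        data = requests.get(API_URL, params=params, timeout=30).json()
        for page in data["query"]["pages"]:
            for rev in page.get("revisions", []):
                counts[rev["timestamp"][:10]] += 1  # "YYYY-MM-DD"
        if "continue" not in data:
            break
        params.update(data["continue"])  # follow the pagination cursor
    return counts


if __name__ == "__main__":
    # Hypothetical example: edits to one candidate's page around the election.
    counts = daily_edit_counts(
        "Kamala Harris", "2024-10-01T00:00:00Z", "2024-11-15T00:00:00Z"
    )
    baseline = sum(counts.values()) / max(len(counts), 1)
    for day in sorted(counts):
        flag = "  <-- spike" if counts[day] > 3 * baseline else ""
        print(f"{day}: {counts[day]}{flag}")
```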

2. Prevalence of Misinformation:

Despite Wikipedia’s robust mechanisms to detect blatant misinformation, our study found that subtle biases and unsupported claims often seep through. Among the 3,000 analyzed edits, biased language and unsupported claims were the most frequent types of misinformation, followed by misleading information and factual inaccuracies.

3. Effectiveness of Current Mechanisms:

Wikipedia employs several automated tools to flag and revert potentially problematic edits. While these tools are effective at catching overt misinformation, subtle and insidious changes often go undetected. Our analysis revealed that edits marked as "high risk" by the revert risk model had a higher likelihood of containing misinformation than those marked as "medium" or "low" risk.
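To make the comparison concrete, the sketch below shows the kind of cross-tabulation behind this finding: bucket each edit's revert-risk score into low, medium, or high and compare the share of misinformation in each bucket. The data, the bucket cut-offs (0.33 and 0.66), and the field names are illustrative assumptions, not the study's actual thresholds or dataset.

```python
# Sketch: compare misinformation rates across revert-risk tiers
# using hypothetical hand-labelled edits paired with model scores.
from collections import defaultdict


def risk_bucket(score: float) -> str:
    """Map a revert-risk probability in [0, 1] to a coarse risk tier (assumed cut-offs)."""
    if score >= 0.66:
        return "high"
    if score >= 0.33:
        return "medium"
    return "low"


def misinformation_rate_by_bucket(edits: list[dict]) -> dict[str, float]:
    """edits: [{"revert_risk": float, "is_misinformation": bool}, ...]"""
    totals = defaultdict(int)
    flagged = defaultdict(int)
    for edit in edits:
        bucket = risk_bucket(edit["revert_risk"])
        totals[bucket] += 1
        flagged[bucket] += int(edit["is_misinformation"])
    return {bucket: flagged[bucket] / totals[bucket] for bucket in totals}


if __name__ == "__main__":
    # Made-up example edits, not data from the study.
    sample = [
        {"revert_risk": 0.91, "is_misinformation": True},
        {"revert_risk": 0.72, "is_misinformation": False},
        {"revert_risk": 0.48, "is_misinformation": True},
        {"revert_risk": 0.40, "is_misinformation": False},
        {"revert_risk": 0.12, "is_misinformation": False},
        {"revert_risk": 0.05, "is_misinformation": False},
    ]
    for bucket, rate in sorted(misinformation_rate_by_bucket(sample).items()):
        print(f"{bucket}: {rate:.0%} of edits contained misinformation")
```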

4. Recommendations for Improvement:

Based on our findings, we propose several recommendations, detailed in the full study, to enhance Wikipedia’s ability to combat misinformation.

These measures aim to bolster Wikipedia’s role as a reliable information source, particularly during politically sensitive periods.

I am proud of this effort and believe that our study not only sheds light on the current challenges but also paves the way for more robust defenses against misinformation. I invite you to read our full study (available above on SSRN for free) for a comprehensive account of our methodology and detailed findings.

By ensuring the integrity of platforms like Wikipedia, we contribute to the broader fight against misinformation, fostering a more informed and discerning public.