Wikimedia is writing new policies to fight Wikipedia harassment
Wikipedia plans to crack down on harassment and other “toxic” behavior with a new code of conduct. The Wikimedia Foundation Board of Trustees, which oversees Wikipedia among other projects, voted on Friday to adopt a more formal moderation process. The foundation will draft the details of that process by the end of 2020, and until then, it’s tasked with enforcing stopgap anti-harassment policies.
“Harassment, toxic behavior, and incivility in the Wikimedia movement are contrary to our shared values and detrimental to our vision and mission,” said the board in a statement. “The board does not believe we have made enough progress toward creating welcoming, inclusive, harassment-free spaces in which people can contribute productively and debate constructively.”
The trustee board gave the Wikimedia Foundation four specific directives. It’s supposed to draft a “binding minimum set of standards” for behavior on its platforms, shaped by input from the community. It needs to “ban, sanction, or otherwise limit the access” of people who break that code, as well as create a review process that involves the community. And it must “significantly increase support for and collaboration with community functionaries” during moderation. Beyond those directives, the Wikimedia Foundation is also supposed to put more resources into its Trust and Safety team, including more staff and better training tools.
The trustee board says its goal is “developing sustainable practices and tools that eliminate harassment, toxicity, and incivility, promote inclusivity, cultivate respectful discourse, reduce harms to participants, protect the projects from disinformation and bad actors, and promote trust in our projects.”
Wikipedia’s volunteer community can be highly dedicated but intensely combative, launching edit wars over controversial topics and harshly enforcing editorial standards in a way that may drive away new users. The Wikimedia Foundation listed harassment as one factor behind its relative lack of female and gender-nonconforming editors, who have complained of being singled out for abuse. At the same time, the project grew out of a freewheeling community-focused ethos — and many users object to the kind of top-down enforcement you’d find on a commercial web platform.
These problems came to a head last year, when the Wikimedia Foundation suspended a respected but abrasive editor whom other users had accused of relentless harassment. The intervention bypassed Wikipedia's normal community arbitration process, and several administrators resigned during the backlash that followed.
The board of trustees doesn’t mention that controversy, saying only that the vote “formalizes years of efforts by individual volunteers, Wikimedia affiliates, Foundation staff, and others to stop harassment and promote inclusivity on Wikimedia projects.” But on a discussion page, one editor cited the suspension to argue that the Wikimedia Foundation shouldn’t interfere with Wikipedia’s community moderation — while others said a formal code of conduct would have reduced the widespread confusion and hostility around it.
Amid all this, Wikipedia has become one of the internet’s most widely trusted platforms. YouTube, for instance, uses Wikipedia pages to rebut conspiracy videos. That’s raised the stakes and created a huge incentive for disinformation artists to target the site. Friday’s vote suggests the Wikimedia Foundation will take a more active role in moderating the platform, even if we don’t know exactly how.