Wikipedia will launch its first global code of conduct, seeking to address criticism that it failed to combat harassment.
“We need to be much more inclusive,” said María Sefidari, the chair of the board of trustees for the non-profit Wikimedia Foundation. “We are missing a lot of voices, we’re missing women, we’re missing marginalized groups.”
Online platforms have come under intense scrutiny for abusive behavior, violent rhetoric and other forms of problematic content, pushing them to revamp content rules and more strictly enforce them.
Unlike Facebook Inc and Twitter Inc, which take more top-down approaches to content moderation, the online encyclopedia, which turned 20 years old last month, largely relies on unpaid volunteers to handle issues around users’ behavior.
Wikimedia said more than 1,500 Wikipedia volunteers from five continents, working in 30 languages, participated in the creation of the new rules after the board of trustees voted in May last year to develop new binding standards.
“There’s been a process of change throughout the communities,” Katherine Maher, the executive director of the Wikimedia Foundation, said in an interview. “It took time to build the support that was necessary to do the consultations for people to understand why this is a priority.”
The new code of conduct bans harassment on and off the site, barring behaviors like hate speech, the use of slurs, stereotypes or attacks based on personal characteristics, as well as threats of physical violence and ‘hounding,’ or following someone across different articles to critique their work.
It also bans deliberately introducing false or biased information into content. Wikipedia is a relatively trusted site compared to major social media platforms which have struggled to curb misinformation.
Maher said some users’ concerns that the new rules meant the site was becoming more centralized were unfounded.
Wikipedia has 230,000 volunteer editors who work on crowdsourced articles and more than 3,500 ‘administrators’ who can take actions like blocking accounts or restricting edits on certain pages. Sometimes, complaints are decided by panels of users elected by the communities.
Wikimedia said the next phase of the project would be working on the rules’ enforcement.
“A code of conduct without enforcement…is not going to be useful,” said Sefidari. “We’re going to figure this out with the communities,” she said.
Maher said there would be training for communities and interested task-forces of users.
Wikimedia has no immediate plans to beef up its small ‘trust and safety’ team, a group of about a dozen staff which currently acts on urgent matters such as death threats or the sharing of people’s private information, she said.