User:Julle/Essays/The Patroller's Dilemma
This personal essay is an attempt to explain why the community reacts to changes and technical development the way it does. My background is, on the one hand, working for the Wikimedia Foundation as part of the software development process, and, on the other, being a long-term patroller and admin.
On verifiability
The battle of Klingenthal wasn’t fought between Swedish troops and forces of the Holy Roman Empire in 1642. The German forces weren’t vanquished, the battle wasn’t the final part of the Swedish conquest of Saxony and it didn’t lead to the golden age of the Swedish Empire. As far as we can tell, the battle never took place at all.
But for eight years English Wikipedia had an article on it. Since 2014, the article had not just informed readers of the fictional event; it had helped the battle of Klingenthal make its way into the occasional book, been translated into Swedish and led to the battle being mentioned on German Wikipedia. There were so many battles in the Thirty Years’ War – who can keep track of them all? It took until 2022 for someone to say, hey, this battle isn’t mentioned in any of the significant works on the war. It’s not in the source it’s referencing, nor in any work we could find which predated the Wikipedia article. The Swedish article, I found out, had been translated based on the assumption that the English article was correct, and the mention of the battle on German Wikipedia was removed as soon as people started asking about it.
This came as no major surprise to any of the editors involved. “Likely hoax”, we commented in discussion. It was merely the latest one, and we’ve all seen similar endeavours before. Articles about things that never happened or never existed, people who were never born. People inserting themselves into history for laughs, or adding their names to the list of actors of a long-forgotten movie. Or the constant attempts to improve the world in hindsight: individuals claiming successes they never had, companies attempting to make exceedingly favourable interpretations of past events. Or the other way around: those who have never edited Wikipedia, but suddenly find articles about themselves making claims of misdeeds they never perpetrated, misconduct which exists on the wiki but nowhere else.[1] Some editors have spent significant time building a reputation on the wikis, while amusing themselves by inserting small falsehoods into the texts.
Wikipedia started out as a largely empty space and we filled it with words and references, norms and rules. Initially, most content was welcome, and if the sources weren’t there from the beginning, they could be found later: the important thing was that we had any articles at all. There was so much that needed to be done, and writing a new article was easy. My first pieces were a few sentences long and didn’t reference any sources, even though many of them were biographies of living persons. Today, they would have been removed immediately. Gradually, the focus shifted. We started worrying slightly less about not having the content, and slightly more about not having the right content. About quality and what information we were serving our readers.
Partly, this is a result of more than twenty years of debating what Wikipedia is and should be. What should an encyclopedia contain? What is kept and what is not is decided largely by two factors: notability and verifiability. They are intertwined and difficult to fully separate from each other – an abundance of verifiability helps notability in itself: look, see how much effort has been spent on writing about this thing – surely that makes it notable. To the degree notability can be decoupled from verifiability, it is primarily based on our understanding of what we want to be, our vision of the encyclopedia – an internal process.
But our relationship to sources – verifiability – has, like marble or gneiss, largely formed under external pressure, as reactions to vandalism, controversies and mistakes. To patrol Wikipedia is to feel like one is under constant siege: there’s a never-ending onslaught of destruction, mischief and misinformation to keep at bay. When someone points out that certain topics are more likely than others to be shut out of the encyclopedia, patrollers typically agree that it would be good if something could be done about it. But the solution “change the standards we use for verification” tends to sound like “open a breach in the wall and let the attackers in”.
The patroller’s burden
In the research paper “The Rise and Decline of an Open Collaboration System: How Wikipedia’s Reaction to Popularity Is Causing Its Decline” from 2012,[2] Aaron Halfaker et al. showed that English Wikipedia started bleeding editors in 2007 – a decline which, as it turned out, didn’t flatten out until 2014. No fewer people than before made their first edit, but a far smaller number stayed around to make ten edits. This correlated with how likely an edit was to survive. Pseudonymously editing an encyclopedia is a pastime for a minority, and pseudonymously seeing your edits reverted is a hobby which appeals to very few indeed. One of the reasons, the paper argued, was the rise of tools which made it very easy to revert edits to the previous state.
To what degree the specific tools shaped this development can be debated – many of the other language versions which enthusiastically adopted Wikipedia in the early days saw very similar trends, with or without the tools of English Wikipedia. But as the articles grew more difficult to write, as we demanded more and more from the first version, patrolling the edits became more of a binary decision: yes or no, let be or revert. As much as we talk about the raised bar for those who write, it’s easily forgotten that the burden on the patroller who wants to help instead of remove has been made equally heavy. Instead of making some simple corrections, you might need to hunt down sources for an article about a subject you know nothing about. Even assuming they are easily available to you in your language, that might be a commitment of half an hour for a simple article.
This steers us towards letting someone else, typically the newcomer who wrote it, fix the problem. Revert the edit and briefly let them know that it didn’t correspond to Wikipedia’s norms. Move an article to a user subpage and let them know they need to keep working on it. The problem, obviously, is that we put the burden on someone who doesn’t understand what they need to do for the text to live up to the standards we’re referring to.
A matter of trust
I’m first and foremost a Swedish Wikipedian, but I try to be of service on other language versions as well, not least English Wikipedia, where I haunt the Articles for Deletion discussions that are in some way related to Sweden, trying to help with both sources – some not easily accessible – and context. It’s a good exercise in understanding the limitations of how we handle sources we’re not personally familiar with. I’ve seen editors argue for deletion of something I’ve painstakingly referenced beyond the minimum requirements, because the sources I’ve used are neither accessible nor known to them. When I’ve pointed out that something about Sweden is obviously a misunderstanding and why, I’ve been told that Major American Newspaper is a respectable source, even when it’s been laughably apparent to anyone who speaks Swedish that this is an American rewrite of a British rewrite of an article in a less reputable Swedish newspaper – as if there were no difference between the investigative journalism of The New York Times and a text in the same newspaper which is a repackaging of tabloid content, two steps of Telephone later, by someone who can’t read the original text.
I’ve been around for a while. I have strategies for this, and I know what keywords to use to signal that I understand the game. It can be a frustrating exercise even for me. But someone who is new? Who doesn’t know how to present the sources to explain that they are reliable? Who doesn’t know how to describe something so it’s clear that it’s not just been mentioned in passing? Most give up.
Some articles are scrutinised more than others: I regularly come across drafts on English Wikipedia which have been declined but which, I’m fairly certain, would have survived the Articles for Deletion process. Good enough to be kept, but not good enough to be created. All of this is very difficult to navigate for those who are not already familiar with the norms and politics of editing Wikipedia.
This creates gaps in an encyclopedia with the ambition to cover the entire world equally.
To a certain degree, Wikipedia runs on trust. In the paper “Do I Trust this Stranger? Generalized Trust and the Governance of Online Communities”,[3] Jérôme Hergueux et al. described an experiment on English Wikipedia showing that administrators were more active and took more administrative actions if they generally had a lower level of trust in strangers. Trust is probably relevant outside of the specific context of their study, too. We trust editors we have encountered before and we trust sources we are familiar with. We might trust editors who have internalised the Wikipedian vocabulary and signal that they understand the Wikipedia norms, or sources that we recognise as fitting into a familiar system.
Working within the system
Our tools do shape our behaviour. If we could make it easier for patrollers to make something better – somehow find ways to save them time, instead of merely having to decide whether or not to let something through – that would make it more likely that good-faith additions which fall short of our standards could be improved and kept, and that their editors would no longer be discouraged from further editing. If we could make our editing tools surface Wikipedia norms and criteria for newcomers, that could help them create articles more likely to survive – more content for the encyclopedia, time saved for the patrollers, and a better experience for the new editors, one far less likely to turn them off from editing in the future.
If we could help explain and verify sources, this could make it easier for those who want to reference publications others are less used to. If we could assign a mentor to someone every time we draftify or move an article to a user subpage, this could help them understand what is actually necessary, as opposed to having to interpret our comments without the benefit of having internalised Wikipedia criteria. If we could assign a mentor – and had mentors to assign – when we revert ambitious good-faith edits, this could help them more than our current guidance.
We could probably do better.
References
- ↑ See e.g. w:en:Wikipedia Seigenthaler biography incident.
- ↑ Halfaker, Aaron; Geiger, R. Stuart; Morgan, Jonathan T.; Riedl, John. “The Rise and Decline of an Open Collaboration System: How Wikipedia’s Reaction to Popularity Is Causing Its Decline”. American Behavioral Scientist, 2013.
- ↑ Hergueux, Jérôme; Algan, Yann; Benkler, Yochai; Fuster-Morell, Mayo. “Do I Trust this Stranger? Generalized Trust and the Governance of Online Communities”. Conference paper, 2021.