If you read anything this weekend, this should be it. It’s long, but not as anxiety-inducing as I feared. Despite all the trouble Wikipedia has faced since its start, reading about its process and its struggle to maintain integrity is strangely uplifting.
Wikipedia is one of the few platforms online where tremendous computing power isn’t being deployed in the service of telling you exactly what you want to hear.
When I was translating my book into English, I had to replace pretty much all of my Wikipedia citations with other sources, because the publisher didn’t consider Wikipedia academic or trustworthy enough for an academic book. Never mind that my own content hadn’t been fact-checked, let alone peer-reviewed. And considering that I was writing about American television, even in 2010 Wikipedia would have been a better reference than, say, IMDB. I wonder whether that perception among traditional publishers has changed. Either way, Wikipedia’s role, and the example its internal procedures set, are only becoming more important given the ongoing attacks.
A lie might be more plausible or useful than a fact, but it lacks a fact’s dumb arbitrary quality of being the case for no particular reason and no matter your opinion or influence. History once rewritten can be rewritten again and becomes insubstantial. Rather than believe the lie, people stop believing anything at all, and even those in power lose their bearings.
One of the stranger things I took from the article is that for Wikipedia to be useful and interesting, it has to be fundamentally plain and level, almost boring, at least on the surface:
In 2016, researchers published a study of 10 years of Wikipedia edits about US politics. They found that articles became more neutral over time — and so, too, did the editors themselves. When editors arrived, they often proposed extreme edits, received pushback, and either left the project or made increasingly moderate contributions.
Of course, as other links I posted make abundantly clear, “AI” is probably the top threat to Wikipedia’s survival:
The more people who get information from AI summaries of Wikipedia rather than the site itself, the fewer people who will wander down a rabbit hole, encounter an error that needs correcting, and become editors themselves.
Maybe I should start posting my own rabbit holes—but that’s a project for another week.