Wikipedia was founded 25 years ago today, on January 15, 2001. In that time, it’s become one of the most important websites on the internet. So today I wanted to share some of my favorite parts from this article on The Verge: “How Wikipedia survives while the rest of the internet breaks.”

Wikipedia is the largest compendium of human knowledge ever assembled, with more than 7 million articles in its English version, the largest and most developed of 343 language projects. Started nearly 25 years ago, the site was long mocked as a byword for the unreliability of information on the internet, yet today it is, without exaggeration, the digital world’s factual foundation.

This is a fantastic article about how Wikipedia was created, how it works, and why it’s under attack. It began with Jimmy Wales and Larry Sanger’s attempt to build an online encyclopedia called Nupedia; only experts could contribute, and there was a rigorous review process before anything could be posted. Unsurprisingly, it took them a year to produce just 20 articles. To speed things up, they opened the project to anyone who wanted to contribute, and thus Wikipedia was born. More than 20,000 articles were created in its first year.

There were few rules at first, but one that Wales said was “non-negotiable” was that Wikipedia should be written from a “neutral point of view.” The policy, abbreviated as NPOV, was imported from the “nonbias policy” Sanger had written for Nupedia. But on Wikipedia, Wales considered it as much a “social concept of cooperation” as an editorial standard. If this site was going to be open for anyone to edit, the only way to avoid endless flame wars over who was right was, provocatively speaking, to set questions of truth aside. “We could talk about that and get nowhere,” Wales wrote to the Wikipedia email list. “Perhaps the easiest way to make your writing more encyclopedic is to write about what people believe, rather than what is so,” he explained.

I think everyone would agree that the NPOV policy is a good one, and that setting aside questions of truth in favor of documenting what people believe is a practical way to achieve it. If people can’t agree on an issue, at least they should be able to agree on what it is they’re disagreeing about. But of course, people believe all sorts of silly things. So how do you handle climate change deniers or flat-Earthers?

In response, the early volunteers added another rule. You can’t just say things; any factual claim needs a citation that readers can check for themselves. When people started emailing Wales their proofs that Einstein was wrong about relativity, he clarified that the cited source could not be your own “original research.” Sorry, Wales wrote to an Einstein debunker, it doesn’t matter whether your theory is true. When it is published in a physics journal, you can cite that.

Instead of trying to ascertain the truth, editors assessed the credibility of sources, looking to signals like whether a publication had a fact-checking department, got cited by other reputable sources, and issued corrections when it got things wrong.

At their best, these ground rules ensured debates followed a productive dialectic. An editor might write that human-caused climate change was a fact; another might change the line to say there was ongoing debate; a third would restore the original claim, backed by surveys of climate scientists, and demand peer-reviewed studies supporting any alternate theory. The outcome was a more accurate description of the state of knowledge than many journalists were offering at the time by giving “both sides” equal weight, though it took a lot of work to arrive at. A 2019 study published in Nature Human Behaviour found that Wikipedia’s most polarizing articles (eugenics, global warming, Leonardo DiCaprio) are the highest quality, because each side keeps adding citations in support of its views. Wikipedia: a machine for turning conflict into bibliographies.

It’s not Wikipedia’s job to ascertain truth; its job is to present information from credible sources. So each article acts like a summary, and you always have the option to check the citations and explore the topic in more detail. And if you think important information is missing, you can add it as long as you follow the rules: neutral point of view, no original research, and verifiability.

If each article is a summary, the Talk page is the history of how we got to that summary. If you ever wonder why an article is written a particular way or why some piece of information is included, you can check the Talk page to see the discussion. There may be fierce disagreement, but everyone must follow the rules, and it’s all documented so that anyone can read the arguments and see how the editors arrived at the current version.
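That transparency is easy to check for yourself. As a quick illustration (mine, not something from the article), the public MediaWiki Action API exposes the revision history of any page, Talk pages included; the article title and the small helper below are just examples, and the parameters shown are the standard ones for the query/revisions module:

```python
# Fetch the most recent edits to an article's Talk page via the MediaWiki
# Action API, showing that the discussion record is publicly inspectable.
import requests

API = "https://en.wikipedia.org/w/api.php"

def recent_talk_revisions(article: str, limit: int = 10) -> list[dict]:
    """Return recent revisions (timestamp, user, edit summary) of Talk:<article>."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": f"Talk:{article}",
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,
        "format": "json",
    }
    # Wikimedia asks API clients to identify themselves with a User-Agent.
    headers = {"User-Agent": "talk-page-demo/0.1 (example script)"}
    data = requests.get(API, params=params, headers=headers, timeout=10).json()
    # The result is keyed by page ID; we only requested one title.
    (page,) = data["query"]["pages"].values()
    return page.get("revisions", [])

if __name__ == "__main__":
    for rev in recent_talk_revisions("Climate change"):
        print(rev["timestamp"], rev.get("user", "?"), "-", rev.get("comment", "")[:80])
```

Every entry carries a timestamp, a username, and an edit summary, which is exactly the audit trail described above.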

In 2009, law professors David A. Hoffman and Salil K. Mehra published a paper analyzing conflicts like these on Wikipedia and noted something unusual. Wikipedia’s dispute resolution system does not actually resolve disputes. In fact, it seems to facilitate them continuing forever.

These disputes may be crucial to Wikipedia’s success, the researchers wrote. Online communities are in perpetual danger of dissolving into anarchy. But because disputes on Wikipedia are won or lost based on who has better followed Wikipedia process, every dispute becomes an opportunity to reiterate the project’s rules and principles.

In 2016, researchers published a study of 10 years of Wikipedia edits about US politics. They found that articles became more neutral over time — and so, too, did the editors themselves. When editors arrived, they often proposed extreme edits, received pushback, and either left the project or made increasingly moderate contributions.

This is obviously not the reigning dynamic of the rest of the internet. The social platforms where culture and politics increasingly play out are governed by algorithms that have the opposite effect of Wikipedia’s bureaucracy in nearly every respect. Optimized to capture attention, they boost the novel, extreme, and sensational rather than subjecting them to increased scrutiny, and by sending content to users most likely to engage with it, they sort people into clusters of mutual agreement. This phenomenon has many names. Filter bubbles, epistemological fragmentation, bespoke realities, the sense that everyone has lost their minds. On Wikipedia, it’s called a “point of view split,” and editors banned it early. You are simply not allowed to make a new article on the same topic. Instead, you must make the case for a given perspective’s place amid all the others while staying, literally, on the same page.

That last sentence is my favorite in the whole article. Wikipedia is designed to force everyone to literally get on the same page. And the content of each page lives or dies based on who did a better job of following the rules. It doesn’t matter what your personal opinion is; you must maintain NPOV and verifiability if you want to be taken seriously. Unfortunately, some people can’t handle that.

There’s a whole section of this article about all the people who complain because they think Wikipedia is biased. A lot of the complaints are about a page called Reliable sources/Perennial sources. This is where editors have compiled a massive list of sources and recorded a consensus judgment about the credibility of each one, so they don’t need to keep having the same arguments about the same source on every Talk page. And of course, as with all of Wikipedia, you can see the historical record of the arguments that led to each determination.
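To make the idea concrete, here is a toy sketch (my own, not Wikipedia’s software or data) of the kind of lookup table the page amounts to: a source maps to a consensus status plus pointers to the discussions that produced it. The status names roughly mirror the categories the page uses; the entries and the helper function are hypothetical, and the real list is the on-wiki page itself.

```python
# Toy model of a "perennial sources" table: source -> consensus status,
# a one-line summary, and pointers to the discussions behind the status.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Status(Enum):
    GENERALLY_RELIABLE = "generally reliable"
    NO_CONSENSUS = "no consensus / additional considerations apply"
    GENERALLY_UNRELIABLE = "generally unreliable"
    DEPRECATED = "deprecated"

@dataclass
class SourceEntry:
    status: Status
    summary: str            # one-line summary of the consensus
    discussions: list[str]  # pointers to the RfCs / noticeboard threads

# Hypothetical entries for illustration only.
PERENNIAL = {
    "Example Journal": SourceEntry(
        Status.GENERALLY_RELIABLE,
        "Peer-reviewed, issues corrections.",
        ["(link to the RfC that set this status)"],
    ),
    "Example Blog": SourceEntry(
        Status.GENERALLY_UNRELIABLE,
        "Self-published, no editorial oversight.",
        ["(link to the noticeboard discussion)"],
    ),
}

def lookup(source: str) -> Optional[SourceEntry]:
    """Check the table before relitigating a source's reliability on a Talk page."""
    return PERENNIAL.get(source)
```

The point of such a table is exactly what the paragraph above describes: the expensive part (the argument) happens once, gets documented, and everyone else just consults the result.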

But to Wikipedia’s critics, the page has become a symbol of the encyclopedia’s biases. Sanger, the briefly tenured cofounder, has found a receptive audience in right-wing activist Christopher Rufo and other conservatives by claiming that Wikipedia has strayed from its neutrality principle in making judgments about the reliability of sources. Instead, he argues, it should present all views equally, including things “many Republicans believe,” like the existence of widespread fraud in the 2020 election and the FBI playing a role in the January 6th Capitol attack.

Last spring, the reliable source page collided with one of the most intense political flashpoints on Wikipedia, the Israel-Palestine conflict. In April, an editor asked whether it was time to reevaluate the reliability of the Anti-Defamation League in light of changes to the way it categorizes antisemitic incidents to include protests of Israel, among other recent controversies. About 120 editors debated the topic for two months, producing text equal to 1.9 The Old Man and the Seas, or “tomats,” a standard unit of Wikipedia discourse. The consensus was that the ADL was reliable on antisemitism generally but not when the Israel-Palestine conflict was involved.

Unusually for a Wikipedia administrative process, the decision received enormous attention. The Times of Israel called it a “staggering blow” for the ADL, which mustered Jewish groups to petition the foundation to overrule the editors. The foundation responded with a fairly technical explanation of how Wikipedia’s self-governing reliability determinations work.

The article goes on to talk about how people will cherry-pick examples of deleted content to argue the site is biased, but when you read the Talk page explaining why something was deleted, it’s usually because it didn’t follow the rules. One person used a GPT-based language model to try to prove bias, but that study wasn’t peer-reviewed, and those who looked into it explained why it doesn’t hold up.

There are biases on Wikipedia, but they’re not about the politics of left versus right. They’re more about the site as a whole and which topics even get a page in the first place. Sometimes this reflects bias in other media; as former Wikimedia CEO Katherine Maher said, “We’re a mirror of the world’s biases, not the source of them. We can’t write articles about what you don’t cover.” Other biases could come from the fact that most editors are men in Europe and America, so coverage of the topics that interest them is much more comprehensive.

Then there’s a large section of the article that explains in detail all of the ways that authoritarian governments around the world have attempted to manipulate Wikipedia. There have even been threats from the US government, and American editors are getting concerned:

In April, the Trump administration’s interim US attorney for DC, Edward Martin Jr., sent a letter to the Wikimedia Foundation accusing the organization of disseminating “propaganda” and intimating that it had violated its duties as a tax-exempt nonprofit.

From a legal perspective, it was an odd document. The tax status of nonprofits is not generally the jurisdiction of the US attorney for DC, and many of the supposed violations, like having foreign nationals on its board or permitting “the rewriting of key, historical events and biographical information of current and previous American leaders,” are not against the law. Sanger is quoted criticizing editor anonymity. In several cases, the rules Martin accuses Wikipedia of violating are Wikipedia’s own, like a commitment to neutrality. But the implied threat was clear.

“We’ve been anticipating something like this letter happening for some time,” a longtime editor, Lane Rasberry, said. It fits the pattern seen in India and elsewhere. He has been hearing more reports of threats against editors who work on pages related to trans issues and has been conducting security trainings to prevent their identities being revealed. Several US-based editors told me they now avoid politically contentious topics out of fear that they could be doxxed and face professional or legal retaliation. “There are more Wikipedia editors getting threats, more people getting scared,” Rasberry said.

Most people don’t really understand how Wikipedia works, which can lead them to be skeptical of it or to mistrust it completely. But the great thing about it is that anyone can participate. One of the guidelines is to assume good faith, so as long as you’re engaging in good faith, you will be welcomed. The article mentions people who first got involved by vandalizing pages but who, after talking to the community and learning the rules, became valuable editors. The best defense of Wikipedia is to explain how it works.

As for the letter from the interim DC attorney: Trump withdrew Martin’s nomination in May, though Martin still has a position leading the Justice Department’s retribution-oriented “task force on weaponization.” In any case, the Wikimedia Foundation responded promptly.

“The foundation staff spent a lot of passion writing it,” Wales said of the reply. “Then they ran it by me for review, and I was ready to jump in, but I was like, actually, it’s perfect.”

“It’s very calm,” Wales said. “Here are the answers to your questions, here is what we do.” It explains how Wikipedia works.