Why Jimmy Wales Allows "Vulva" on Wikipedia


Since last Wednesday's Wikipedia blackout, face time with the site's founder, Jimmy Wales, has been a hot commodity.

It might seem curious, then, that when we sat down with Jimmy on Thursday evening -- just hours after Wikipedia came back online -- we used our prized time with him to revisit a rather well-worn, five-year-old debate. We wanted to talk about the Muhammad cartoons.

Jimmy obliged us.

More than five years after the cartoon controversy erupted, Wales appeared less than bullish on the issue of publishing controversial content. If I expected Wikipedia's "benevolent dictator" to venerate information sharing above all else, I was mistaken.

Instead, Wales, who sat behind a wooden podium, facing out on a crowd of mostly graduate students, backed a system that would allow Wikipedia users to filter images. After all, he said, you don't want "to ram [information] down people's throats."

***

On Thursday, Jimmy Wales came to Oxford University to launch www.freespeechdebate.com: an ambitious, multilingual website for the discussion of free speech in the Internet age. (Full disclosure: I am a contributor.)

Walking into the lecture hall at Oxford's Clarendon Laboratory, Wales -- a small and compact man, who sports a few more wrinkles than he does in Wikipedia's "Personal Appeal" ads -- looked a little disoriented. Sitting down beside the event's moderator, the historian and writer Timothy Garton Ash, Wales dubbed the website "cool," and glanced up at a live Twitter feed, where questions poured in from as far away as China and Sweden.

We did the requisite back and forth on SOPA and PIPA, which Wales predictably and rather flippantly dismissed as Washington's "Something must be done. This is something. Therefore, we must do it" approach.

But soon we were on to the Prophet Muhammad.

We had wanted to discuss the Muhammad cartoons because they are a test of one of Free Speech Debate's core principles: "We allow no taboos in the discussion and dissemination of knowledge."

Wales agreed with that idea. And he dismissed the notion that individuals have a right not to be offended. Being offended is sometimes part of the game, he conceded, when you live in a society that sanctifies free speech.

Indeed, Wikipedia.org's entry on the cartoon controversy displays the pictures proudly. And its editors have defended that decision stoically -- with the help of a standard "Muhammad cartoon" email rebuttal that they send to anyone who kicks up a fuss.

But that's not true across all of Wikipedia's language editions. The French and German Wikipedias, for instance, link to the cartoons but don't display the images directly. The Arabic, Urdu, and Turkish pages show neither images nor links. (Wikipedia's language editions are governed by local communities, who make their own editorial calls.)

Back in 2006, the publication of the cartoons was debated actively in Wikipedia.org's digital back rooms. In February of that year, editors held a poll on the issue. The primary question was straightforward: to publish or not to publish. Wikipedians were also asked how published images should be positioned, how many images should be used, and where images should appear on the Wikipedia entry page.

In fact, the broader debate over whether to filter potentially offensive images has been one of Wikipedia's most bellicose. Today, controversial pictures are published all the time. But this practice was hardly preordained.

In 2004, some Wikipedia editors began to advocate for an across-the-board image-filtering system. Similar proposals have been made -- and voted down -- again in recent years. Last year, editors held a community referendum; some 24,000 Wikipedians voted, most of them against an image filter.

Similar debates are taking place around the world. In 2010, one kicked off in Germany, after Wikipedia.de published an entry on "Vulva" (complete with in-between-the-legs photographs) on its main site.

On Thursday, Wales cast his own vote: He argued that users should have the ability to "control their own experiences" online, by filtering out images they don't like.

It's not censorship, Wales cautioned. It's personalized filtering: controlled entirely by individuals.

***

Back again to Muhammad.

Last Thursday, Wales's line was that you really need to see the Muhammad images to "think thoughtfully" about the controversy. I agree, which is why I linked to the cartoons above.

Yet that seems somewhat at odds with Wales's pro-image-filter stance.

This gets at the heart of a critical tension: between Wikipedia's goal of disseminating knowledge and its mandate "to be educational in nature."

A 2011 statement by a member of the Wikimedia Foundation's Board of Trustees revealed the thin gap between those aims: "...We believe there is a problem. The purpose of the Wikimedia movement is to make information freely available to people all around the world, and when material on the projects causes grave offence, those offended don't benefit from our work. We believe that exercising editorial judgement to mitigate the offence is not censorship."

When the educational merit of information is not obvious, that information is subject to scrapping. That's not surprising; an encyclopedia, after all, is more than a data dumping ground.

As Wales said on Thursday (employing a rather icky mixed metaphor): "Simply puking up every bit of information you can isn't very helpful and doesn't help the reader digest it."

Naturally, the problem comes in determining what to classify as "educational."

The tension between knowledge and education also comes out in Wikipedia's concern about alienating users. It seems that some trustees are willing to delete (or, at least, filter) especially controversial content to avoid losing potential readers.

In front of the Oxford crowd, Jimmy Wales waxed poetic about the plight of an archetypal young boy in an Arab country who wants to learn more about the cartoon controversy, but is nervous that he will offend Allah by looking at the pictures. Shouldn't the boy be granted protection from the images?

I have sympathy for the boy.

But it also strikes me that when Wales agreed, at the start of his talk on Thursday, that "we must allow no taboos in the discussion and dissemination of knowledge" -- and when he firmly declared that we don't have the right to not be offended -- he was speaking in the aggregate.

On an atomized level, Wales wants to protect a reader's right to avert his or her eyes, to opt out of offense, to hide from taboo. And he supports giving us the technological capability to do that.

In a roundabout way, Wales can back his proposal with an appeal to "education," as distinct from information.

Decisions made by Wikipedia editors are not law. But when it comes to determining what is 'normal' online -- and how much taboo we're willing to stomach on our Internet -- a Wikipedia referendum can count for just as much.

Debate these issues and more at www.freespeechdebate.com, a project of Oxford University. Follow the discussion on Twitter: @onfreespeech.
And watch our entire interview with Jimmy Wales.
