09/29/2012 03:17 pm ET Updated Nov 29, 2012

Why Facebook and YouTube Should Err on the Side of Free Speech

I certainly understand why Huffington Post blogger Soraya Chemaly wants Facebook to take down the page entitled "Boobs breasts and boys who love em," and of course there are many reasonable people who feel that Google (which owns YouTube) should remove that horrible anti-Muslim video that has been associated with sometimes violent demonstrations in Arab countries.

The Facebook "boobs" page (Chemaly links to it, but I won't) is incredibly sexist and distasteful. Yet, as offensive as these pictures are, they do not violate Facebook's Community Standards, which prohibit hate speech, graphic content, and nudity and pornography, among other categories.

The images on the page are definitely not nude and don't pass any reasonable test for pornography. One could argue that they're hateful towards women, but that's not an opinion everyone would share and it certainly doesn't "attack a person based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition."

YouTube has blocked access to "The Innocence of Muslims" in Libya and Egypt (where it is illegal) but won't ban it in the U.S. and most other countries. In response to pressure from the White House and other quarters, Google issued a statement (as reported by AP) that the video "is clearly within our guidelines and so will stay on YouTube. However, given the very difficult situation in Libya and Egypt we have temporarily restricted access in both countries. Our hearts are with the families of the people murdered in yesterday's attack in Libya."

In its Community Guidelines, YouTube doesn't permit hate speech, which it defines as "speech which attacks or demeans a group based on race or ethnic origin, religion, disability, gender, age, veteran status, and sexual orientation/gender identity." I suppose it's open to interpretation whether this video "attacks or demeans a group." As The New York Times pointed out, "Under YouTube's terms of service, hate speech is speech against individuals, not against groups. Because the video mocks Islam but not Muslim people, it has been allowed to stay on the site in most of the world."

Tough call

Regulating nasty and offensive speech is a tough call even for private companies, which (unlike government entities in the U.S.) have a legal right to ban just about anything they want. On one hand, banning offensive Facebook pages and hateful YouTube videos would be popular among many people. Yet making decisions on a page-by-page or video-by-video basis, without adhering to guidelines, sets a dangerous precedent, because it would require censors at the companies to make value judgments about the nature of specific content their users post. It's one thing to enforce guidelines; it's something different to make exceptions just because a piece of content is offensive.

Solemn responsibility

Combined, Facebook and Google have a reach that's bigger than any of the world's governments, so decisions made by these companies carry enormous weight, even though they have nothing to do with the rule of law. As a result, it's incumbent on these companies to treat speech with an enormous amount of reverence. They do have a right to set limits stricter than the speech rules of democratic countries like the U.S., but they also have a solemn responsibility to take their roles seriously and not arbitrarily censor content unless it clearly violates their stated guidelines.
