Eli Pariser and the Annoying, Aggressive Algorithms

06/02/2011 04:24 pm ET | Updated Aug 02, 2011

Eli Pariser's great online success was as executive director of MoveOn.org, taking a single-issue Web campaign (censure then-President Bill Clinton and move on) and turning it into a potent, multi-issue political force for progressive causes. Now, in his book, The Filter Bubble: What the Internet Is Hiding From You, he looks at the Web from the opposite direction, noticing that the choices in his online experience are narrowed without his knowledge, much less his consent, by the wizards (human or otherwise) behind the curtain, and pondering how dangerous that trend is now and could become in the not-too-distant future.

His book is about the fierce personalization going on behind the screens -- the Filter Bubble -- whether you know it is happening or not. It's a fine inspection of how two people can type the same search term into Google and get two different sets of results, based on the preferences each person has established through Web habits tracked by deeply embedded online data collection. That data collection is by itself tremendously intrusive, and it is the subject of a number of bills in Congress that would restrict it and protect consumer privacy. And yet it is that very data which makes the personalization possible.

And now for an Irony Alert.

A couple of nights ago, while reading Pariser's book, I decided to put on some music in the background and so called up my stations on Pandora, the online music service. I "dialed up" (a quaint term that once referred to a radio dial) one of my favorite "stations" that I created on Pandora. Little did I know that in listening to Pandora, I was in the middle of what could be another case study for Pariser. That's because even though two Pandora listeners may ask for a "station" with the same artist, say Adele, the Adele station of Listener 1 will be different from the Adele station of Listener 2, based on listener preference.

At least the songs for those stations are chosen by actual humans with a background in music, not by some anonymous algorithm, even if there is some mechanization involved in matching those songs to individual listener preferences. (There are limits: the service plays no more than four songs by a single artist within a three-hour period, and users can give a thumbs-down to only a limited number of songs per hour.)

It's that concept of a "station" for one that distinguishes Pandora from a regular broadcast radio station. Everyone listening to the same broadcast radio station at the same time hears the same thing. It's a shared experience, and there's something beneficial to that.

The same goes for television. Everyone who watches 30 Rock or American Idol or The Good Wife can discuss with friends the same episode, from having the same shared experience. It's one of the social benefits of a mass medium. Everyone can talk about the same show, which helps promote the show. Advertisers pay for the privilege of reaching the people watching the show, hoping their target audience will cut through the clutter to see (or hear, in the case of radio) their ad among all the others.

Pandora is different. Because there are so many musical choices, and because the service has information on each listener who sets up favorite "stations," ads can be targeted to the exact type of person an advertiser is trying to reach. And with only three or four ads per hour, each ad gets more attention than a broadcast ad would. The service can be this specialized because it starts with information each user gives it: musical tastes, where you live, where you work, where you go to school.

Pariser's epiphany came on Facebook, when he noticed that he was seeing posts from some of his FB friends but not from others, depending on which ones he read. The liberal friends were kept; the conservative friends dropped off. He wonders whether the Web will turn into a one-size-fits-one Web, in which everyone has a different experience, right down to the colors of a Web site.

The larger question is: does this personalization increase the value of the product to a consumer, or decrease it? Steven Levy's new book about Google, In The Plex, tells the story of Google engineer Joe Kraus, who was trying to find an anniversary gift for his wife. Using Google search didn't cut it, so he went to the closest thing he had to a social network -- his tag line in Google Talk -- and asked for gift ideas. He got a bunch of good ones and, as Levy reports, "It was a sobering revelation for Kraus that sometimes your friends could trump algorithmic search." Maybe that's one reason Google hasn't grasped social media.

It's also obvious, however, that the algorithms won, as Pariser's Facebook experience showed. Algorithmic alchemy has diluted the value of friends by choosing your BFFs for you, making results and advice at once more relevant and much more limited. This is the "Filter Bubble" that Pariser argues needs to be popped.

It's one thing to choose your filters. If you are going to be limited in your interests and world outlook, then at least you are doing the choosing. It's quite another to have your interests assumed and accounted for by software without your knowledge, the situation Pariser describes.

At the end of the day, the humans at the keyboard can do some things to outwit the machines, should they choose to do so. You do it by adding Cee Lo Green or Beethoven to your Adele station, or by adding foreign news to the usual domestic political articles you tag on a news site. Such tactics won't disable the annoying algorithms, but they might slow them down for a millisecond or two as they ponder such diverse tastes.

There are bills floating around in Congress that would prohibit Web sites from tracking where users go online without permission, and would impose other data-collection restrictions. The result would be more privacy and less personalization -- certainly less personalization without permission.

The debates over personalization and privacy, pro and con, are better informed because of Pariser's timely and important book.