Facebook is in hot water after news of a psychological experiment it conducted was revealed this weekend. As reported in The Verge:

According to a new paper in the Proceedings of the National Academy of Sciences, Facebook altered the News Feeds for hundreds of thousands of users as part of a psychology experiment devised by the company's on-staff data scientist. By scientifically altering News Feeds, the experiment sought to learn about the way positive and negative affect travels through social networks, ultimately concluding that "in-person interaction and nonverbal cues are not strictly necessary for emotional contagion."

The co-author of the study defended the research, saying:

The reason we did this is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.

The study was conducted for a very short period of time on a statistically small sample. No kittens or puppies were harmed.
I'm not a psychologist, so I'll admit that I don't fully understand the study (how do they actually measure "positive and negative affect" anyway?) or its conclusions. I am, however, a business owner, salesman and marketer, so I'll also admit that I'm a supporter of firms using our data to provide us with a better experience.
Large corporations like Samsung, Amazon, Procter & Gamble, Intuit and, yes, even Facebook are known for using their customer data for marketing and research. Good for them. I get it. In fact, I'm constantly looking for more data to do a better job selling my company's products and services too. As a consumer, as long as I give permission, I'm more than happy to share my viewing, eating, spending and online habits with corporate America if it means that they can come up with more services and products that will make my life better, easier and more convenient. I have no problem with this.
We enjoy the benefits of these mostly free services. They open up our lives and businesses to new friends and customers around the world. In return, the people running those services get advertising dollars, maybe a small monthly fee, the possibility of collecting billions in an IPO and... our data. Take a look at your terms-of-use agreement with Facebook and you'll find that the company did not break any laws. They can pretty much use your information however they'd like, even conduct scientific studies with it. We allow this. (If you don't believe me, you can wade into the company's Data Use Policy here. And let's admit it: this is the first time you, like me, have ever been to this page, right?)
But sometimes, like now, a line gets crossed.
And in this instance, Facebook crossed it. They didn't just use our data for us. They used it against us. They changed the way their service was provided to a random group of their customers for the purposes of a scientific experiment. This is like AMC running an alternate ending of Breaking Bad for a sample of viewers to see how they would react if Walter White lived (uh... I hope you've seen it already?). Or Twitter sending out a fake "follow" from Kanye West to a sample group of users to gauge their reaction (Mine? Meh...). This wasn't a study. This was a violation of an agreement. Customers didn't get the news feeds they should have. Now, let's all calm down -- it's not the end of the world. But it's still a violation. Enough for a lawsuit? Probably not. Deserving of a lot of bad PR and abuse in the media? Absolutely.
So what's the lesson? The era of big data is upon us, and this is a good thing. The more companies know about me, the better and more tailored the services they can deliver to me. But don't change the service you promised just for the purpose of collecting my data! And don't include me in an "experiment" without getting my permission. Whether you're collecting our data or conducting a scientific study, disclose your intentions. Be transparent. Tell your customers what you're doing. Some may not like it, so they can decline. But most customers, like me, probably won't care. Especially if we understand how this can help us.
No, Facebook didn't break the law. But they did break trust with their customers. And that's worse.
A version of this column previously appeared in Forbes.com.