I just read in today's New York Times business section that Facebook admitted last week to doing "psychological testing" on its readers: during a week in January 2012, it tried to manipulate the feelings of 689,003 randomly selected users by changing the number of positive and negative posts they saw. "It was part of a psychological study to examine how emotions can be spread on social media," according to The Times.
Tell the truth -- if you saw a lot of negative posts on Facebook would this bring you down and cause you to write more negative posts? And if you saw upbeat, positive news on Facebook would that lift your spirits? Of course it would!
That's what the Facebook study discovered, according to The Times.
The researchers found that moods were contagious. The people who saw more positive posts responded by writing more positive posts. Similarly, seeing more negative content prompted the viewers to be more negative in their own posts.
So when this news about Facebook came out last week, there was a lot of outcry, as might be expected. "I wonder if Facebook KILLED anyone with their emotion manipulation stunt," tweeted one commentator, Lauren Weinstein, according to The Times.
This is a valid question. I happen to be a news junkie who reads three newspapers first thing every morning, and I admit to checking Facebook about a zillion times a day to see what my children and friends are up to. Lately, there have been so many headlines about children being abused, kidnapped, shot, stricken with deadly diseases or locked in hot cars that I'm seriously considering cutting out the newspapers in the morning. And every time I see an item on Facebook that appears to chronicle a child's injury or disease or abusive childhood or tragic death, I avert my eyes and quickly scroll on by.
Part of the reason I've become hypersensitive to bad news about kids is the arrival of a granddaughter in my life three years ago. You forget how vulnerable and small and easily harmed your children were when they were new.
My daughter, the baby's mother, had the same reaction. She and her husband used to enjoy watching the TV show "Dexter," about a serial killer, but since the baby was born, she can't watch violence of any kind. As you can imagine, we both avoid shows such as "Game of Thrones" like the plague. (They'll probably incorporate that into the script, too, if they haven't already.) And while I'd really like to see the Oscar-winning film "12 Years a Slave," I know I couldn't manage to sit through all the violence. I'd probably run out of the theater, the way I did when I was seven and my crusty old grandmother would take me to Bible films like "Samson and Delilah."
Back to Facebook manipulating the posts we saw to find out what lots of negative or positive news would do to us. In view of the public outcry, the Facebook people now seem to be feeling sorry and trying to explain themselves.
"I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused," posted Adam D. I. Kramer, who led the study.
"Ultimately, we're just providing a layer of technology that helps people get what they want," said Chris Cox, chief product officer of Facebook, talking to The Times.
All the excuses of the Facebook executives are, for lack of a more pungent phrase, a bunch of hooey. I'm not a researcher or internet genius, but I do know that, when you feel happy, you're much more likely to react to ads, like the ones on Facebook, and buy something. When you're depressed, you're not.
Whenever I manage to diet off those pesky ten pounds of excess weight, I happily rush out and buy clothes in my new size -- clothes that then hang in my closet, price tags still attached, silently rebuking me once they no longer fit. And that's exactly when, sadly, I have no urge to buy anything new.
A happy Facebook reader is more likely to respond to the ads on Facebook than a depressed Facebook reader, and that's the whole reason for their little foray into psychological testing and emotional manipulation. The Facebook executives should confess this and be ashamed.
But unless they throw me out for badmouthing the site, I suspect that's still not going to cure my Facebook addiction.