"Imagine... that you knew which sites -- or what news stories -- people you trust found useful and which they disliked," David Kirkpatrick wrote in the June 11, 2007 issue of Fortune magazine. "This isn't fantasy. Facebook might make it possible, and soon. Yes, the social-networking site college kids spend so much time on -- the one you thought was just about hooking up -- could turn out to be more important than any of us thought."
Kirkpatrick, who was then Fortune's Senior Editor for Internet and Technology, went on to write the best-selling The Facebook Effect: The Inside Story of the Company That Is Connecting the World, the definitive book on the company. He was prescient. In a startlingly short period of time, Facebook did make it possible for you to find those trusted and useful news sites and stories -- along with much, much more.
Now, with Facebook facing growing scrutiny in advance of its IPO next month, which is expected to value the Internet giant at $100 billion, the question of trust looms even larger. True, the social networking giant has made it easier than ever before to find trusted friends and followers, who can now create, curate, aggregate and distribute news and information with an unprecedented ease, as I detail in my new book Friends, Followers and The Future: How Social Media are Changing Politics, Threatening Big Brands and Killing Traditional Media.
But is Facebook itself, the billion-dollar baby whose rapid growth has yet to be slowed by continuing controversy over the privacy of its more than 800 million users, worthy of our trust? Can we rely on its wunderkind CEO Mark Zuckerberg, who has repeatedly pronounced privacy to be outmoded and argued that we are living in a new era beyond it, to safeguard our interests? Despite our differing -- some would say competing -- concerns, should we regard Facebook and Zuckerberg as our friends?
After all, the online social network, which offers its tools, technologies, and services at no cost, makes its profit primarily by using heretofore private information it has collected about you to target advertising. And Zuckerberg has repeatedly made sudden, sometimes ill-conceived and often poorly communicated policy changes that have resulted in once-private personal information becoming instantly and publicly accessible. As a result, once-latent concerns over privacy, power and profit have bubbled up and led both domestic and international regulatory agencies to scrutinize the company more closely.
In one case, the Electronic Privacy Information Center (EPIC) and fourteen other consumer protection groups filed a 2009 unfair-trade complaint with the Federal Trade Commission (FTC), accusing Facebook of unfair and deceptive trade practices that "violate user expectations, diminish user privacy, and contradict Facebook's own representations." The complaint charged that Facebook's decisions to disclose previously restricted "personal information to the public" had done exactly that, and it asked the FTC to investigate the company's trade practices, order it to "restore privacy settings that were previously available," and force it to "give users meaningful control over personal information."

Facebook settled in November 2011 by agreeing to refrain from making any further deceptive privacy claims, to obtain consumers' approval before changing the way it shares their data, and to undergo independent third-party auditing for 20 years. Shortly after the uproar subsided, however, renewed concerns over privacy and trust began to shake the brand again. This time the blunder centered on Facebook's belated admission that it was still tracking the web pages its members visited, even after they had logged out of the Facebook site. As Daniel Bates reported for the Daily Mail,
The social networking giant says the huge privacy breach was simply a mistake -- that software automatically downloaded to users' computers when they logged in to Facebook 'inadvertently' sent information to the company, whether or not they were logged in at the time. Most would assume that Facebook stops monitoring them after they leave its site, but technology bloggers discovered this was not the case.
Instead, the tracking information -- worth billions of dollars to advertisers -- was being sent back to Facebook's servers. Even after you logged out, Facebook still tracked every page you visited. As Bates noted, "The admission is the latest in a series of privacy blunders from Facebook, which has a record of only correcting such matters when they are brought to light by other people."

As its executives struggled to explain the "inadvertent" privacy row over its "creepy" web-tracking practices, user trust was shaken once again "by criticism and speculation regarding how it uses browser cookies to get data about users," as Josh Constine posted on Insidefacebook.com.
A lack of thorough documentation explaining what each of its cookies does has led some observers to assume that the company is tracking offsite browsing behavior in order to target ads. Facebook needs to provide explanations for both the average user and privacy researchers about how exactly its cookies work in order to prevent these press flare-ups from giving users a negative impression and bringing on regulatory scrutiny from governments.
The company's growing stature and importance only magnify such concerns. As Facebook profile pages morph more and more into overall online identities, the inherent tension between our individual desire to protect personal information and the company's need for that information comes into ever-sharper focus. Last week, for example, Facebook sought once again to address the persistent criticism of its privacy practices by instituting a new policy providing greater transparency about the types of data it stores about you. Yet critics like Max Schrems, an Austrian law student who filed a complaint leading to the agreement, remain unsatisfied with the company's response. "We welcome that Facebook users are now getting more access to their data, but Facebook is still not in line with the European Data Protection Law," Schrems told Kevin J. O'Brien of the New York Times. "With the changes, Facebook will only offer access to 39 data categories, while it is holding at least 84 such data categories about every user." In 2011, when Schrems requested his own data from Facebook, he learned that the company was keeping information he had previously deleted from the site, as well as data on his location.
None of that sounds too friendly to me, so I really can't recommend that you trust Zuckerberg, or Facebook, or indeed any corporation that makes its money by selling you -- down the river or anywhere else. And as Nielsen's latest Global Trust in Advertising survey shows, we trust "word-of-mouth recommendations from friends and family" above all other forms of communication. (At least that's what 92 percent of respondents in 56 countries said.)
At the same time, our trust in paid traditional media (including television, magazine and newspaper ads) has steadily declined since 2009. (Trust in television is down 24 percent; magazines, down 20 percent; and newspapers, down 25 percent, according to the survey.) "Consumers around the world continue to see recommendations from friends... as by far the most credible," said Randall Beard, global head of Advertiser Solutions at Nielsen.
Trust is essential to the success of any brand. Mark Zuckerberg may think that Facebook's recurrent privacy flaps haven't much affected the sometimes anti-social social network, but they represent a huge potential threat to what he has built. The high-handed manner in which members' personal information has been treated, the lack of consultation or even communication with them beforehand, Facebook's growing domination of the entire social networking sphere, Zuckerberg's constant and very public declarations of the death of privacy and his seeming imposition of new social norms -- all feed the growing fear that he and Facebook simply cannot be trusted. As Zuckerberg's fellow CEOs from the legacy media should have already learned, losing the trust of your audience is the first step in losing your audience itself -- and eventually the power of your brand.
Follow Rory O'Connor on Twitter: www.twitter.com/rocglobal