Facebook Controls What News People Read. That’s A Problem

It's Facebook's world. We just live in it.

Facebook has come a long way from the little social site launched to help us chat and flirt and cyber-stalk friends. It directs web traffic to a large portion of the Internet, it commands a huge share of mobile advertising and, increasingly, it has become the arbiter of information.

New research released Tuesday by the Pew Research Center with the Knight Foundation suggests that the social network’s hold on how we find news is continuing to grow. The problem here is that Facebook's News Feed uses an algorithm to select the posts it thinks you're most likely to read, which is worrisome for those who like to control their information diet.

In a survey of 2,035 social media users, Pew found that 63 percent of Twitter users and 63 percent of Facebook users say they use their accounts to find and read news articles. It's been a long time since Facebook was primarily a place to interact with friends, but this is the first study to suggest that most people who use social media are consuming news there -- and it marks a large increase from 2013, the last time Pew comprehensively studied news consumption on Facebook.

In a short survey, published by Pew last August, just under half of the Facebook users surveyed said they’d used the platform to read a news article in the last week.

(Chart: Pew Research Center)

And this growth in news reading is happening across demographics. On Twitter, users both under and over 35 are more likely to read news on the site than they were in 2013; on Facebook, both men and women say they get news from the site in greater numbers than before.

As the new study points out, these numbers track with campaigns over the last year by both Twitter and Facebook to become not just filters through which users access journalism, but sources of news in their own right.

Facebook’s "Trending" sidebar, released in June, spits out a constantly updated feed of news stories, directing users to the most-shared news content. That news is more likely to be hosted entirely within Facebook, now that news outlets like The New York Times, Buzzfeed and National Geographic publish articles directly to Facebook Instant. Twitter purchased Periscope, a live streaming app, to encourage users to record content directly to Twitter. And in June Buzzfeed broke news of Project Lightning, a Twitter experiment to curate feeds of text, video and images around news events, giving readers a “newsroom experience” of breaking news.

But even though both Facebook and Twitter are trying to get into the news game, it's Facebook that has the greater influence over what you read.

Only 17 percent of American adults use Twitter, a fraction of the 66 percent of Americans who now have Facebook accounts. That means that while about one in 10 adults gets news from Twitter, roughly four in 10 get news from Facebook. And while Twitter offers an unfiltered, reverse-chronological feed of everything the people you follow have posted, Facebook relies on its algorithm to filter that content and surface the stuff it thinks you'll enjoy.
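As a rough check on those figures, multiply each platform's adoption rate among U.S. adults by the 63 percent of its users who say they get news there. A minimal sketch in Python, assuming the rounded percentages cited above (the report's exact numbers may differ slightly):

```python
# Back-of-the-envelope reach estimate from the Pew figures cited above.
twitter_adoption = 0.17   # share of U.S. adults who use Twitter
facebook_adoption = 0.66  # share of U.S. adults who use Facebook
news_share = 0.63         # share of each platform's users who get news there

print(f"Twitter news reach:  {twitter_adoption * news_share:.0%}")   # ~11%, about 1 in 10 adults
print(f"Facebook news reach: {facebook_adoption * news_share:.0%}")  # ~42%, about 4 in 10 adults
```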

Often, that stuff is not hard news.

Readers were significantly more likely to come across stories on business, international news or national politics if they were reading Twitter, according to the Pew report. As John McDermott reported in Digiday, stories about Ferguson and Michael Brown generated far less traffic from Facebook than Ice Bucket Challenge stories did: the average Ice Bucket Challenge story drew 2,106 Facebook-referred visitors to news outlets, versus 257 for the average Ferguson article.

“Relying too heavily on Facebook’s algorithmic content streams can result in de facto censorship,” McDermott wrote. “Readers are deprived a say in what they get to see.”

But it's not just Facebook's algorithm withholding news from us. What we're willing to share with our friends and family on Facebook, and what we choose to click on, is different from what we might otherwise read away from the site.

A much-discussed study, published by Facebook researchers in the journal Science, found that if you identify your political affiliation on Facebook, the News Feed filters out some posts with differing political views. But both Republicans and Democrats self-filter as well: they were less likely to click on content that didn't align with their beliefs.

After all, Facebook, unlike a newspaper, is a public space with its own set of cultural norms, which shape the way we present ourselves on the platform.

In one of the most comprehensive studies of Facebook sharing, two professors at the University of Pennsylvania's Wharton School found that positive content -- the kind of stuff we'd like our personal brand to be aligned with -- goes viral on Facebook more often than negative content.

Another Pew study, on social media and the "Spiral of Silence," released last August, found that Facebook users were less likely to share or engage with content that their friends on the site might take issue with. The researchers also unearthed a more chilling finding: this disengagement followed users into their offline lives. The average Facebook user was half as likely as someone who didn't use the social network to say they would voice a contrary opinion to friends face-to-face.

Facebook, it seems, is only going to become a stronger dictator of what information we read. That makes tools that give readers some measure of control, like Facebook's recent News Feed update, all the more important.

It’s still Facebook’s world, but from now on, we’re all gonna have to live in it.
