Is Facebook Controlling the News You See?

Why do you see what you see when you log onto Facebook?

You probably log onto Facebook once a day... or 20 times a day. Right? #notashamed

But explosive allegations of bias in how Facebook manages their trending topics have opened the floodgates of questions about Facebook's power over what people see and how they consume news.

Let's take a look at everything behind it:

The stats

Facebook is more influential than you might think.

About 1.6 billion people use Facebook at least once a month. There are roughly three billion internet users today, so Facebook can reach more than half of them.

The world's population is just over seven billion -- so more than a fifth of the world is on Facebook.

And people spend nearly an hour a day on Facebook. That's a lot of time.

And more and more people get their news on Facebook, so many don't actively seek it out elsewhere: 63 percent of American Facebook users say they get news from the site, and 31 percent use it to follow breaking news.

With so many people on Facebook every day, for so much time, and most of them looking out for news, how news appears on Facebook clearly matters, a lot.

The algorithm magic

Facebook has a lot of algorithms (basically, sets of rules written in code that make decisions automatically).

The home News Feed

It's mostly a mystery. (No, seriously, FB won't tell anyone how it really works.) It sifts through hundreds or thousands of posts from friends and family, and from Facebook groups, events, and liked pages, to surface what's likely to be most interesting to you.

Facebook wants to know what you want to see, so they can show it to you. (Chris Messina/Flickr)

Facebook says:

"The stories that show in your News Feed are influenced by your connections and activity on Facebook. This helps you to see more stories that interest you from friends you interact with the most."

They narrow it all down, and while the order can look random, the posts Facebook judges most "interesting" float to the top.

For example, when the word "congratulations" appears in the comments, Facebook takes it as a signal that the post marks a big life event and shows it to more of that person's friends. Posts from the friends you chat with most (or even that ex-friend you stalked the other day) and the pages you've liked posts from (your celebrity crush, hmm) tend to sit near the top the next time you log on.

Facebook also takes other factors into account, like the time you spend on posts and stories, to figure out what to show you. After all, if they showed you every single post from all of your hundreds of friends and pages you've liked, you'd be lost in a sea of posts. So they use data to figure out what to put into your feed.
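
To make that concrete, here's a toy sketch in Python of how engagement-based ranking might work. To be clear, the signal names and weights below are invented for illustration; Facebook has never published its real formula.

# Toy model of engagement-based feed ranking.
# The signals and weights are made up for illustration --
# Facebook has never published its actual formula.

WEIGHTS = {
    "friend_interactions": 3.0,  # you chat with or visit this poster a lot
    "congrats_comments": 2.0,    # "congratulations" in comments = life event
    "time_spent_similar": 1.5,   # you linger on posts like this one
    "recency": 1.0,              # newer posts score a bit higher
}

def score_post(post):
    # Combine each signal into a single "interestingness" score.
    return sum(WEIGHTS[name] * value for name, value in post["signals"].items())

def build_feed(posts, limit=20):
    # Rank all candidate posts and keep only the top few, so you
    # aren't lost in a sea of hundreds of posts.
    return sorted(posts, key=score_post, reverse=True)[:limit]

posts = [
    {"id": "friend-baby-news", "signals": {"friend_interactions": 0.9,
                                           "congrats_comments": 1.0,
                                           "time_spent_similar": 0.7,
                                           "recency": 0.4}},
    {"id": "acquaintance-link", "signals": {"friend_interactions": 0.1,
                                            "congrats_comments": 0.0,
                                            "time_spent_similar": 0.2,
                                            "recency": 0.9}},
]
print([p["id"] for p in build_feed(posts)])
# -> ['friend-baby-news', 'acquaintance-link']: the life event wins
# even though the link is newer.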

Facebook's trending topics

This is the thing people are worked up about: trending topics. In case you've never noticed it, there's a box on the right side of the feed that highlights ten trending news stories across four general subjects: politics, science and technology, sports, and entertainment.

Facebook says:

"Trending shows you topics that have recently become popular on Facebook. The topics you see are based on a number of factors including engagement, timeliness, Pages you've liked and your location."

No one outside the company knew exactly how it worked or how it decided what was "trending" for more than a billion users. Until now: Facebook workers and leaked documents have revealed that people, not just an algorithm, do much of the work.

So, what happens is:
- An algorithm finds what is up and coming in terms of what Facebook users are talking about and sharing in the news.
- Then, human editors choose how to present them and which stories from news organizations to put at the top.

But this week, reports suggested that editors would "inject" certain stories into the Trending section even if they weren't actually popular, or "blacklist" a story and keep it from users entirely.

So, yes, there are mathematical algorithms, but also a lot of involvement by actual human curators.
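
Based on what the reports describe, the pipeline might look roughly like this Python sketch. The function names, the threshold, and the explicit blacklist/injection steps are assumptions drawn from the leaked accounts, not anything Facebook has confirmed.

# Rough sketch of the two-step Trending pipeline the reports describe.
# All names and numbers here are assumptions, not confirmed by Facebook.

def find_candidates(mentions, baseline, threshold=3.0):
    # Step 1 (automated): flag topics being talked about and shared
    # far more than usual.
    return [topic for topic, count in mentions.items()
            if count / max(baseline.get(topic, 1), 1) >= threshold]

def editors_curate(candidates, blacklist=(), injected=()):
    # Step 2 (human): drop blacklisted topics, add injected ones
    # (even if they weren't popular), and fill the ten slots users see.
    topics = [t for t in candidates if t not in blacklist]
    topics += [t for t in injected if t not in topics]
    return topics[:10]

mentions = {"election": 900, "royal-baby": 400, "county-fair": 30}
baseline = {"election": 100, "royal-baby": 50, "county-fair": 25}
print(editors_curate(find_candidates(mentions, baseline),
                     blacklist={"royal-baby"},
                     injected=["breaking-story"]))
# -> ['election', 'breaking-story']

The point of the sketch: the math only nominates topics; humans decide what actually shows up.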

Does Facebook have bias?

Critics suggest several kinds of bias may be at work in what Facebook shows you. And that isn't good.

Bias towards their own priorities

Facebook is not just providing free services -- they're a company. They're always coming up with new ways to keep you using the site and new ways to make money. So as they roll out more features, they want users to see and use them, and they have ways of nudging you to do just that.

One way is by privileging certain types of content. Like, they recently introduced Facebook Live -- live video. They're really keen on making it popular, so they make sure you see live videos. In fact, they're into videos overall, so they're more likely to show you video in your feed than a link to an article.

They're also into Instant Articles -- publishers' articles that open quickly inside of Facebook, without taking you off to the publisher's website. You're probably seeing more and more of these too.

Bias towards big news organizations

Think: BuzzFeed and the New York Times. These are two of the news orgs that have worked closely with FB to publish Instant Articles.

Plus: Facebook pays BuzzFeed and the New York Times to produce content for them, like live streams.

Here's the short list of which news organizations Facebook recognizes as national outlets. https://t.co/5K7hGji0CX pic.twitter.com/d8KEvsnNvH

— Margarita Noriega (@margarita) May 12, 2016

They also lean towards big, established news organizations. There is allegedly a list of 10 news orgs the trending team tends to rely on, including BuzzFeed and The New York Times, but also The Guardian, NBC News, CNN, and The Washington Post, among a few others.

Bias towards friends' activity

The way the algorithm works helps news organizations that are already popular on Facebook, which means you're more likely to see their stuff. For example, you may not have liked a news org's Facebook page, but if your friends like and share a post or video from that page, and then your friends or friends of friends comment on it, it gets pushed up into your news feed.
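
As a toy illustration (again, the weights are made up), friend activity might act as a multiplier on a post's score, so a page you've never liked can still outrank one you follow:

# Toy illustration: friend activity as a score multiplier.
# The weights are invented for the example.

def amplified_score(base, friend_likes, friend_shares, friend_comments):
    # Each friend interaction nudges the multiplier up.
    boost = 1 + 0.2 * friend_likes + 0.5 * friend_shares + 0.3 * friend_comments
    return base * boost

print(amplified_score(1.0, friend_likes=3, friend_shares=2, friend_comments=4))  # 3.8
print(amplified_score(2.0, 0, 0, 0))  # a page you like, but no friend buzz: 2.0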

Bias against clickbait

And if the content is from a mainstream publisher and not a junky clickbait site, you're even more likely to be shown it.

In fact, they tweak their algorithm all the time to ensure they're showing you the best content so that you'll be happy with what you see and keep coming back for more.

Bias towards content that keeps you engaged

Takes a good amount of time to read? Gets you to interact? Then Facebook wants to show it to you. That's because this kind of content keeps you on Facebook longer. They like that.

(Alleged) bias against conservative news

This week, people who used to work on Facebook's trending topics claimed the social network was biased against conservative news, meaning curators were discouraged from highlighting conservative stories in the Trending section.

If true, this is alarming, considering how influential Facebook is.

political flap aside, the facebook trend people really seem to know what i like pic.twitter.com/AkvdxQIqJ7

— Adam Weinstein (@AdamWeinstein) May 13, 2016

So alarming that the Senate is investigating.

Facebook's looking into it themselves too.

What you can do

Facebook says they're working on addressing political bias, and they continue to tweak their practices and algorithms.

But you do have some control -- with your behavior. What you spend time on and engage with will affect what Facebook puts in front of you.

So: Engage with the Facebook pages and news organizations you prefer to follow and see in your feed. Like their page, share a post, comment, click through. Facebook will "learn" that you like that content and show you more of it. (For example, you can click on the Facebook button below to share this article!)

And for good measure, to counteract your own biases, include pages whose content you may not naturally prefer. If Facebook feels like an echo chamber, it might not be Facebook's fault.

This article was written by Patrick deHahn and originally appeared on Kicker. Kicker explains the most important, compelling things going on in the world and empowers you to get in the know, make up your own mind, and take action. For more, check out the Kicker site, like their Facebook page, or subscribe to their email newsletter.
