Trying to Have It All
In a post on Facebook, Mark Zuckerberg addressed criticism that his company was responsible for Donald Trump’s election victory over Hillary Clinton due to the prevalence of fake news on the platform.
I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.
This passage highlights the core of Facebook’s identity crisis. The company wants to avoid censoring or editing content while still delivering the “most meaningful” content to users. They don’t want to be a media organization, but they want to be your favorite news source. Quite simply, they can’t have it both ways.
The Newsfeed of the Past
Facebook at one time had a true Newsfeed. Like Twitter, it was a simple chronological listing of what the people and pages you followed were posting. This is a solution that works for people and content creators because the rules are simple. You build up friends or followers and they will see what you post in their feed.
In this scenario, you would only get fake news if you followed pages publishing fake news. This is the key difference from curated content in “Top Stories”, and why Facebook is deserving of the criticism they are currently receiving.
Top News shows popular stories from your favorite friends and Pages, many of which have gained lots of attention since the last time you checked. In this view, you might find out about an old friend becoming engaged or see a hilarious video that your sister posted and that tons of your friends liked.
Since Top News is based on an algorithm, it uses factors such as how many friends are commenting on a post to aggregate content that you’ll find interesting. It displays stories based on their relevance, rather than in chronological order.
Facebook shifted from the Newsfeed to “Top Stories” as the default sorting method because it was more profitable. Over the years, Facebook has used a combination of human editors and artificial intelligence algorithms in an attempt to give people what they want to see. Happy people, who are getting lots of content that interests them, spend more time on Facebook.
Now, instead of you being responsible for everything you see, Facebook is making choices on what content to deliver to you. By leveraging your engagement history, Likes, and network data, Facebook’s algorithms attempt to predict which posts you will want to see.
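To make this concrete, here is a toy sketch of relevance-based ranking. The weights, field names, and scoring factors below are invented for illustration; Facebook’s actual algorithm is proprietary and far more complex. The sketch only shows the general idea: score each post from engagement and affinity signals, decay by age, and sort by score instead of by time.

```python
from datetime import datetime, timezone

def relevance_score(post, viewer):
    """Toy relevance score. Weights and signals are invented for
    illustration; this is not Facebook's actual ranking formula."""
    score = 0.0
    # Engagement from people the viewer knows counts more than raw totals
    score += 2.0 * len(post["likes"] & viewer["friends"])
    score += 3.0 * len(post["comments"] & viewer["friends"])
    # Affinity: posts from people the viewer interacts with often rank higher
    if post["author"] in viewer["frequent_interactions"]:
        score += 5.0
    # Time decay: older stories lose relevance, but never reach exactly zero
    age_hours = (datetime.now(timezone.utc) - post["time"]).total_seconds() / 3600
    return score / (1 + age_hours)

def rank_feed(posts, viewer):
    # "Top Stories" behavior: sort by predicted relevance, not chronology
    return sorted(posts, key=lambda p: relevance_score(p, viewer), reverse=True)
```

Even this simplified version shows the trade-off: once the platform ranks by predicted relevance rather than time, it is making an editorial judgment about what you see.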
The issue of “echo chambers” has been widely highlighted during this election cycle. People now live in their own news bubbles unlike at any previous time in modern history. Many people follow only sources that validate their existing worldview, and now Facebook attempts to do the same on their behalf. By automating those choices, Facebook has put itself in the position of being evaluated on how well it makes them. When its algorithms surface fake news that you would not have seen in the Newsfeed of old, Facebook is responsible.
Mark Zuckerberg seems to feel that individuals are still ultimately responsible for what they see. However, given the way Facebook works today, that really isn’t fair.
What Comes Next for Facebook?
Facebook has to decide whether to become the news media giant that most people already think it is, or to take a step back and stop using algorithms to determine what content is relevant.
I don’t expect Facebook to go back. I think they will double down on making the algorithms more intelligent. Facebook may bring in some experienced people from traditional media backgrounds, but that will be a stopgap to ease the election backlash. Those people will be tasked with converting their expertise into better algorithms. Human editors can’t possibly be expected to work in real time at the scale Facebook operates.
If Facebook wants to curate content, then they are going to need AI that can distinguish not only real from fake information, but also handle the complexities of irony, sarcasm, parody, and the many other valid forms of content. Their AI needs to understand not just the reaction a post is receiving, but why. Only then can they begin to build a system where users fully understand what they are being shown.