4 Suggestions for Facebook’s “Fake News” Program

Today, Facebook announced that it will begin flagging “fake news”. While it’s not entirely clear where Facebook is going with this, I do have some concerns.

In its announcement, Facebook says:

We believe providing more context can help people decide for themselves what to trust and what to share. We’ve started a program to work with third-party fact checking organizations that are signatories of Poynter’s International Fact Checking Code of Principles. We’ll use the reports from our community, along with other signals, to send stories to these organizations. If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed.

What most concerns me about this is the power that it gives to news organizations. Facebook is going to be giving a select number of organizations the ability to unilaterally discredit and suppress content.

Given this, I have a few suggestions that can turn this potentially dangerous system into one that could actually be quite beneficial (in order of importance):

  1. Don’t suppress disputed stories. Disputes will not always be correct. We can inform people about dissenting viewpoints, but we should not prevent them from sharing these stories if they choose to do so. If disputed stories appear lower in the News Feed, this becomes a subtle form of censorship.
  2. Don’t give a popup warning about disputed content. This does too much to discredit an article in the eyes of a layperson. Instead, there should be a small icon by the link indicating that the content is disputed. An interested user can click it to read the rebuttals if they so choose. If there must be a pop-up, it should appear when an article is about to be shared, not when it is about to be read.
  3. Allow anyone to dispute stories. Any organization should be able to submit a dispute. Given points 1 and 2, an individual dispute carries less weight, which eliminates the need to reserve dispute creation for a handful of highly vetted organizations. Debate is important, and it is easy to get caught in an echo chamber. If everyone had the ability to see both sides of any story, this could be a genuinely positive feature.
  4. Allow users to subscribe to disputers. If a user trusts a certain organization, they may want to be alerted, when they click on an article, that the organization has disputed it. The difference between this and point 2 is that this system is opt-in and carries only user-selected bias. It would, however, need to be implemented in a way that does not promote subscriptions to certain disputers over others. (A rough sketch of how this could fit together appears after this list.)
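
To make these suggestions concrete, here is a minimal sketch, in Python, of how the flag-and-subscribe mechanics could work. Everything in it (the Dispute and Story structures, the function names, the example organizations) is a hypothetical illustration on my part, not anything Facebook has described.

```python
from dataclasses import dataclass, field

@dataclass
class Dispute:
    """A rebuttal filed against a story by some organization (suggestion 3)."""
    disputer: str       # name of the disputing organization
    rebuttal_url: str   # link to the article explaining why the story is disputed

@dataclass
class Story:
    url: str
    disputes: list[Dispute] = field(default_factory=list)

def feed_score(story: Story, base_score: float) -> float:
    """Suggestion 1: being disputed does not lower a story's News Feed ranking."""
    return base_score

def show_dispute_icon(story: Story) -> bool:
    """Suggestion 2: a small icon, not a popup, whenever at least one dispute exists."""
    return len(story.disputes) > 0

def alerts_for_user(story: Story, subscribed_disputers: set[str]) -> list[Dispute]:
    """Suggestion 4: alert the user only about disputes from organizations
    they have chosen to follow."""
    return [d for d in story.disputes if d.disputer in subscribed_disputers]

# Example usage (all data made up):
story = Story("https://example.com/some-article")
story.disputes.append(Dispute("Example Fact Check", "https://example.com/rebuttal"))
print(show_dispute_icon(story))                    # True  -> render the icon
print(alerts_for_user(story, {"Some Other Org"}))  # []    -> no opt-in alert fires
```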

With these changes, I believe Facebook could make great strides in giving users multiple perspectives while avoiding these issues of censorship and bias. Instead of following Poynter’s principles and claiming to offer objective, unbiased fact checking, Facebook should embrace bias and view it as a necessary lens for getting a holistic picture of reality.

Thank you for reading! Let me know what you think in the comments. Also, please share this article to help encourage Facebook to make these changes. They are currently on the fence about point 1, and we need to nudge them in the right direction.
