
Fake news. It has been making headlines since the conclusion of the 2016 Presidential Election, in which many pundits said a deluge of fake news stories shared on Facebook ultimately helped Donald Trump win.

However, many experts, including Facebook CEO Mark Zuckerberg, dismissed the impact fake news stories had on the election outcome.


Today, though, Zuckerberg and Facebook announced significant changes to the Facebook website to deal with fake news stories and hoaxes.

Here is the post Zuckerberg made on his public Facebook profile:

A few weeks ago, I outlined some projects we’re working on to build a more informed community and fight misinformation. Today, I want to share an update on work we’re starting to roll out.

We have a responsibility to make sure Facebook has the greatest positive impact on the world. This update is just one of many steps forward, and there will be more work beyond this.

Facebook is a new kind of platform different from anything before it. I think of Facebook as a technology company, but I recognize we have a greater responsibility than just building technology that information flows through. While we don’t write the news stories you read and share, we also recognize we’re more than just a distributor of news. We’re a new kind of platform for public discourse — and that means we have a new kind of responsibility to enable people to have the most meaningful conversations, and to build a space where people can be informed.

With any changes we make, we must fight to give all people a voice and resist the path of becoming arbiters of truth ourselves. I believe we can build a more informed community and uphold these principles.

Here’s what we’re doing:

Today we’re making it easier to report hoaxes, and if many people report a story, then we’ll send it to third-party fact checking organizations. If the fact checkers agree a story is a hoax, you’ll see a flag on the story saying it has been disputed, and that story may be less likely to show up in News Feed. You’ll still be able to read and share the story, but you’ll now have more information about whether fact checkers believe it’s accurate. No one will be able to make a disputed story into an ad or promote it on our platform.

We’ve also found that if people who read an article are significantly less likely to share it than people who just read the headline, that may be a sign it’s misleading. We’re going to start incorporating this signal into News Feed ranking.

These steps will help make spreading misinformation less profitable for spammers who make money by getting more people to visit their sites. And we’re also going to crack down on spammers who masquerade as well-known news organizations.

You can read more about all of these updates here: http://newsroom.fb.com/?p=7014

This is just one of many steps we’ll make to keep improving the quality of our service. Thanks to everyone for your feedback on this, and check back here for more updates to come.
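Facebook has not published the ranking code behind these changes, but the "people read it yet don't share it" signal Zuckerberg describes can be illustrated with a rough, purely hypothetical sketch. Every name below (StoryStats, misleading_signal, rank_score) and the penalty weight are assumptions for illustration, not anything Facebook has disclosed.

from dataclasses import dataclass

@dataclass
class StoryStats:
    headline_views: int            # people who saw only the headline in News Feed
    article_reads: int             # people who clicked through and read the article
    shares_after_read: int         # shares by people who read the article
    shares_after_headline_only: int  # shares by people who saw only the headline

def misleading_signal(stats: StoryStats) -> float:
    """Return a value in [0, 1]; higher means readers share far less than headline-only viewers."""
    if stats.article_reads == 0 or stats.headline_views == 0:
        return 0.0
    read_share_rate = stats.shares_after_read / stats.article_reads
    headline_share_rate = stats.shares_after_headline_only / stats.headline_views
    if headline_share_rate == 0:
        return 0.0
    # A large gap between headline-only sharing and post-read sharing is treated as suspicious.
    gap = 1.0 - (read_share_rate / headline_share_rate)
    return min(1.0, max(0.0, gap))

def rank_score(base_score: float, stats: StoryStats, penalty_weight: float = 0.5) -> float:
    """Demote a story's baseline ranking score in proportion to the misleading signal."""
    return base_score * (1.0 - penalty_weight * misleading_signal(stats))

In this toy version, a story that headline-only viewers share often but actual readers rarely share gets a lower score and therefore less distribution, which is the general shape of the signal Zuckerberg describes.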

In a post on its Newsroom site, Facebook released the following information:

We believe providing more context can help people decide for themselves what to trust and what to share. We’ve started a program to work with third-party fact checking organizations that are signatories of Poynter’s International Fact Checking Code of Principles. We’ll use the reports from our community, along with other signals, to send stories to these organizations. If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed.
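Facebook has not described this system in code either, but the workflow in the Newsroom post (community reports plus other signals, review by third-party fact-checking organizations, a disputed flag linking to an explanation, and reduced News Feed distribution) can be sketched hypothetically. All names below (Story, Verdict, REPORT_THRESHOLD, the demotion factor) are illustrative assumptions, not Facebook's actual API or thresholds.

from dataclasses import dataclass
from typing import Callable, List, Optional

REPORT_THRESHOLD = 100  # assumed cutoff; Facebook has not published a real number

@dataclass
class Verdict:
    is_fake: bool
    explanation_url: Optional[str] = None  # fact checker's article explaining the ruling

@dataclass
class Story:
    url: str
    report_count: int = 0
    disputed: bool = False
    dispute_explanation_url: Optional[str] = None

def handle_report(story: Story, request_fact_checks: Callable[[str], List[Verdict]]) -> None:
    """Record one user report; once enough accumulate, send the story to fact checkers."""
    story.report_count += 1
    if story.report_count >= REPORT_THRESHOLD and not story.disputed:
        verdicts = request_fact_checks(story.url)
        fake = [v for v in verdicts if v.is_fake]
        if fake:
            # Flag as disputed and link to the corresponding explanation article.
            story.disputed = True
            story.dispute_explanation_url = fake[0].explanation_url

def feed_weight(story: Story, base_weight: float) -> float:
    """Disputed stories may appear lower in News Feed (0.5 is an arbitrary illustrative factor)."""
    return base_weight * 0.5 if story.disputed else base_weight

def can_promote_as_ad(story: Story) -> bool:
    """Disputed stories cannot be made into ads or otherwise promoted."""
    return not story.disputed

The key property, as described in both announcements, is that disputed stories remain readable and shareable; they are simply labeled, demoted, and barred from paid promotion.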

Image: reporting a story as fake.

Here is an excerpt of Poynter's International Fact Checking Code of Principles mentioned by Facebook:

A COMMITMENT TO NONPARTISANSHIP AND FAIRNESS

We fact-check claims using the same standard for every fact check. We do not concentrate our fact-checking on any one side. We follow the same process for every fact check and let the evidence dictate our conclusions. We do not advocate or take policy positions on the issues we fact-check.

A COMMITMENT TO TRANSPARENCY OF SOURCES

We want our readers to be able to verify our findings themselves. We provide all sources in enough detail that readers can replicate our work, except in cases where a source’s personal security could be compromised. In such cases, we provide as much detail as possible.

A COMMITMENT TO TRANSPARENCY OF FUNDING & ORGANIZATION

We are transparent about our funding sources. If we accept funding from other organizations, we ensure that funders have no influence over the conclusions we reach in our reports. We detail the professional background of all key figures in our organization and explain our organizational structure and legal status. We clearly indicate a way for readers to communicate with us.

A COMMITMENT TO TRANSPARENCY OF METHODOLOGY

We explain the methodology we use to select, research, write, edit, publish and correct our fact checks. We encourage readers to send us claims to fact-check and are transparent on why and how we fact-check.

A COMMITMENT TO OPEN AND HONEST CORRECTIONS

We publish our corrections policy and follow it scrupulously. We correct clearly and transparently in line with our corrections policy, seeking so far as possible to ensure that readers see the corrected version.

Some content creators have expressed concern that groups of people who oppose their point of view or content could target them by maliciously reporting it as “fake”.

Others have expressed concerns that the fact checkers used by Facebook could be partisan in regards to political reporting.

Facebook also released a video documenting the changes. See below:

Addressing Hoaxes and Fake News from Facebook on Vimeo.