Facebook’s Latest Experiment to Fight Fake News Backfires

Social media giants have been forced to make changes to help combat the spread of fake news on their sites. Facebook’s most recent experiment, which has since concluded, restructured the comments section to prioritize “comments that indicate disbelief” as a way of flagging misinformation, but the results weren’t what the company hoped for.

As reported by the BBC, Facebook ran a limited trial of the experimental feature. Instead of listing comments in the order they were posted, those suggesting the piece might contain inaccurate information were given priority and appeared near the top of the list.

Only some users saw the new format, and those who did were often left frustrated. Specific keywords appeared to be used to identify “comments that indicate disbelief,” treating them as a potential sign that the post was inaccurate. As a result, comments containing phrases such as “fake news” were pushed to the top even on accurate stories from reputable news organizations.
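
Facebook has not published how the test actually worked, but a naive keyword match along these lines would reproduce the behavior users described: any comment containing a disbelief phrase gets boosted, even under an accurate story. The following is a minimal Python sketch; the keyword list and function names are illustrative assumptions, not Facebook’s code.

```python
# Hypothetical sketch of keyword-based comment prioritization, modeled on
# the behavior described in the article; Facebook's real system is not public.

DISBELIEF_KEYWORDS = {"fake news", "hoax", "lie", "not true"}

def indicates_disbelief(comment: str) -> bool:
    """Naive substring match: flags any comment containing a disbelief
    phrase, regardless of whether the story itself is accurate."""
    text = comment.lower()
    return any(keyword in text for keyword in DISBELIEF_KEYWORDS)

def prioritize_comments(comments: list[str]) -> list[str]:
    """Sort comments so those expressing disbelief come first. Python's
    sort is stable, so the original posting order is otherwise preserved."""
    return sorted(comments, key=lambda c: not indicates_disbelief(c))

comments = [
    "Great reporting, thanks for sharing.",
    "This is fake news.",
    "Interesting read.",
]
print(prioritize_comments(comments))
# ['This is fake news.', 'Great reporting, thanks for sharing.', 'Interesting read.']
```

Note that “This is fake news.” is boosted even though nothing in the logic distinguishes an accurate story from a hoax, which is precisely the failure mode users complained about.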

“Clearly Facebook is under enormous pressure to tackle the problem of fake news, but to question the veracity of every single story is preposterous,” said freelance PR consultant Jen Roberts.

“Quite the reverse of combating misinformation online, it is compounding the issue by blurring the lines between what is real and what isn’t. My Facebook feed has become like some awful Orwellian doublethink experiment.”

Users on Twitter also expressed their discontent with the Facebook experiment.

One post asked, “Every top comment showing on a political post is ‘lie’ or ‘fake.’ Is this your new get-everyone-riled algorithm or bot inundation?”

Another asked, “Why is EVERY single comment preview on my feed about fake stuff?”

Facebook provided a statement to the BBC, which read, “We’re always working on ways to curb the spread of misinformation on our platform, and sometimes run tests to find new ways to do this. This was a small test which has now concluded.”

The statement continued, “We wanted to see if prioritizing comments that indicate disbelief would help. We’re going to keep working to find new ways to help our community make more informed decisions about what they read and share.”

Facebook came under fire after hoax stories published on the site were suggested to have influenced the 2016 US presidential election, and it has faced pressure ever since to build mechanisms for fighting fake news.