Facebook Is Dropping Its Fake News Red Flag Warning


After finding it had the opposite effect, Facebook is dropping its red flag warning. The social media giant had been using the feature to check the spread of fake news on its platform, which critics say influenced the outcome of the US presidential election. The company now says the red flag sometimes backfired. Facebook introduced the “DISPUTED” flag to help users quickly identify articles from third-party websites that failed to pass a fact-checking standard. But the indicator, which was announced in December 2016, wasn’t effective at curbing misinformation, Facebook said, and sometimes even spurred readers to share dubious links more often.

“Putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs – the opposite effect to what we intended,” Facebook product manager Tessa Lyons wrote in a blog post. Instead of a red warning icon, the company will offer users additional related and fact-checked content in the hope of drawing readers toward more reliable news sources, Lyons said.

Now, before reading an article shared on Facebook, users will be offered a menu of fact-checked “Related Articles” from reliable sources in order to “give more context, which our research has shown is a more effective way to help people get to the facts,” Lyons wrote.

Last month, Facebook joined fellow tech giants Google and Twitter in testifying before Congress about their efforts to stop the spread of misinformation and “extremist content” on their services.