Facebook is experimenting with new features to allow users and third-party fact checkers to flag posts that appear to be fake news.
The new features, announced on Thursday, essentially try to solve the fake-news problem through crowdsourcing. “We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully,” the company said in a blog post.
The measures are meant to crack down on “clear hoaxes” spread by spammers. Users who find false information appearing in their news feed will be able to flag it by clicking on the upper right-hand corner of a post and reporting it as a fake news story.
News that has been flagged as false will then be reviewed by a group of third-party organizations that have agreed to a fact-checking code developed by the Poynter Institute, a non-profit training center for journalism.
Facebook didn’t name the third-party organizations in its blog post, but they include Poynter, PolitiFact, and ABC News, according to The New York Times.
If the news is found to be fake, the posting will be labeled with a “Disputed by 3rd Party Fact-Checkers” tag and will come with a link explaining why, Facebook said. “Stories that have been disputed may also appear lower in News Feed,” the company added.
Users who attempt to share the fake news will also receive an alert warning them that the article’s accuracy has been disputed by third-party fact checkers. However, users can still choose to ignore the warning and share the post.
To stop the producers of fake news, Facebook is eliminating the ability to spoof domains on the site, preventing spammers from masquerading as real news publications.
Spammers have made money from fake news by posting hoaxes to Facebook as a way to generate ad revenue. This proved to be especially lucrative during this year’s U.S. election. Among those reportedly cashing in were teenagers from Macedonia who published fake news related to the presidential race.
Facebook has vowed to fight the problem, following criticism that the site became a hotbed for biased and false news reporting during the election.
The new features, which will start to roll out on Thursday, are a sign of progress, Facebook said. But “we’re going to keep working on this problem for as long as it takes to get it right,” the company added.