Until I read this in FiveThirtyEight’s Significant Digits, I didn’t know Facebook was trying to label fake news stories by tagging them as “disputed.” So how is that going?
Facebook’s new attempt to have third-party fact-checking groups vet the content the company spreads on news feeds hasn’t been going well, according to a new report from Bloomberg. When two fact-checking groups flag an item as false, Facebook tags the link as “disputed,” which cuts the number of people who see the hoax or misinformation by about 80 percent. However, this has happened for only around 2,000 links, a small fraction of the hundreds of potentially false stories screeners see every day. Factor in that the process typically takes over three days, and you’re not exactly mitigating the spread of false information. [Bloomberg]
Meanwhile, back at the Facebook reality ranch:
126 million people
That’s Facebook’s estimate of the number of people who may have seen content spread by a Kremlin-linked troll farm between June 2015 and August 2017. The content was intended to spread divisive political messaging on topics including LGBT rights, race issues, and gun rights. [CNN Money]