This approach might be effective. Researchers at the University of Cambridge found that "inoculation" is the best way to prevent people from believing fake news. (Studies show that when fake news is corrected after the fact, people continue to remember and believe the original falsehood.) But when study participants were first given true news, then warned that specific groups were circulating fake news, and only then exposed to the fake news, they continued to believe the true news. In other words, identifying fake news works only if it's identified before exposure.
A 19-year-old Stanford University student, Karan Singhal, created a "fake news detector" using artificial intelligence. It analyzes 55 metrics, including writing style, layout and domain name. You can try it for free.
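Singhal's actual 55 metrics aren't public, but the general idea -- scoring an article on hand-picked signals drawn from its style, layout and domain -- can be sketched in a few lines. The three signals below (shouty headlines, exclamation marks, dubious-looking domains) are illustrative stand-ins, not the detector's real features.

```python
import re
from urllib.parse import urlparse

def fake_news_score(headline: str, body: str, url: str) -> float:
    """Return a 0-1 suspicion score from a few hand-picked signals.

    These signals are hypothetical stand-ins for the kinds of
    style/layout/domain metrics a real detector might use.
    """
    score = 0.0

    # Style signal: all-caps words in the headline suggest clickbait.
    caps_words = sum(1 for w in headline.split() if w.isupper() and len(w) > 2)
    score += min(caps_words * 0.15, 0.4)

    # Style signal: excessive exclamation marks in the body.
    score += min(body.count("!") * 0.05, 0.3)

    # Domain signal: bare-IP hosts or heavily hyphenated domains.
    host = urlparse(url).netloc
    if re.fullmatch(r"[\d.]+", host) or host.count("-") >= 2:
        score += 0.3

    return min(score, 1.0)
```

A production system would replace these hand-tuned weights with a model trained on labeled examples, but the pipeline -- extract features, combine into a score, threshold -- is the same.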
Climate scientists and technologists are also collaborating to fight fake news with a new site called climatefeedback.org. The site recruits climate scientists to review news stories about the climate. The scientist-reviewers annotate the articles with notes and links, and assign each one a credibility score.
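The mechanics of combining several reviewers' ratings into one published score are simple to sketch. The scale and labels below (ratings from -2 for "very low" to +2 for "very high" credibility) are assumptions for illustration, not necessarily the site's exact scheme.

```python
from statistics import mean

def credibility(ratings: list[float]) -> tuple[float, str]:
    """Average individual reviewer ratings (-2..+2, assumed scale)
    into an overall score and a coarse credibility label."""
    avg = mean(ratings)
    if avg >= 0.5:
        label = "high"
    elif avg <= -0.5:
        label = "low"
    else:
        label = "neutral"
    return avg, label
```

Averaging per-article ratings is what lets each story be judged on its own, independent of the publication it appeared in.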
The beauty of this approach is that each article is judged independently (instead of branding an entire publication as "bad"). For example, the site slams one Guardian article while praising another. Better still, the site essentially teaches media criticism and skepticism from a scientific point of view.
One of the most aggressive actions comes from Snap. The company updated its content guidelines for Snapchat publishers to battle fake news, according to a report in The New York Times this week.
In advance of an expected IPO, Snap wants to cut down on clickbait. Snapchat's "discovery" section now bans profanity, sexual or violent content and misleading or fraudulent headlines.
But Snap also targets fake news. The company says all content must be fact-checked and accurate, and that publishers can't impersonate entities or people. It's not clear how Snap will police the guidelines. But to the extent that they are policed, they're among the most stringent of any social site.
Snap is doing the right thing by saying: We're a publisher. We're responsible for content. And that's the only responsible policy.
What not to do
The attitudes of Twitter and Facebook, meanwhile, are irrational and toxic. They reserve the right to ban, censor or delete any content they want -- Facebook based on its "Community Standards" and Twitter based on "The Twitter Rules." Both Twitter and Facebook tend to take action only when shamed in the court of public opinion. But when it comes to fake news, they throw their hands up and say they're not publishers and that they support free speech -- or they make minor tweaks to filtering to placate critics.