False news used to be easy to spot: It was in the tabloids by the supermarket checkout line, illustrated with hilariously fabricated images of the Yeti or Martian babies.

Now misinformation can be shared worldwide in an instant. It’s packaged in a way that makes it look like credible content. And the recent election has made clear that a dismayingly large number of people are eager to share misinformation to further their candidate or cause.

Nobody wants the kind of authoritarian crackdown that would silence social media. Instead, it’s up to us as responsible citizens to stop using the platforms to spread fabricated claims.

Six in 10 Americans now get at least some news from the likes of Twitter and Facebook, the Pew Research Center recently reported, noting that half of the public turned to social media for 2016 presidential election coverage.

The sites at their best enable a lively, informative and wide-ranging conversation. But a BuzzFeed News analysis has revealed how social media can be misused: In the last three months of the campaign, made-up, largely pro-Donald Trump articles had more Facebook shares, reactions and comments than the top election stories from 19 major news outlets combined.

Social scientists, pundits and technology experts have been debating what role fake news played in the election ever since the BuzzFeed analysis was published. However, we already know a few reasons why misinformation spreads so easily online.

For one thing, says Northwestern University psychologist David Rapp, it’s difficult to take in information and critically evaluate it at the same time. And a statement that we’ve just heard is relatively easy to retrieve – meaning that it quickly springs to mind during a fast-moving online discussion, whether it’s valid or not.

Users are baited into staying on a particular site with headlines written to appeal to strong feelings, like anger and frustration. And a lot of readers won’t bother to read past the headline, as long as it lines up with their political leanings. As Emerson College communications professor Paul Mihailidis recently told The Washington Post: “The more they could spread rumors, or could advocate for their value system or candidate, that took precedent over them not knowing” whether the story was accurate.

Partisanship also explains why misinformation has such a long half-life on Facebook. Fact checkers have become the focus of bias accusations. Some readers, for example, have castigated Snopes.com for its “absolute and obvious bias towards the left, against the right and towards atheism and against faith”; others have slammed Snopes’ staff as “dumbass Republicans that want to control everything.”

Facebook founder Mark Zuckerberg recently announced the company’s plan of attack, including putting warning labels on fake stories and making it easier to report misleading content. But reining in the spread of misinformation is a massive challenge – and one that will only grow until we as users put our critical intelligence to work and stop allowing catchy headlines to substitute for the truth.
