Information technology has changed the way politics works, and not always for the better.

Technology got us in this mess – can it also get us out?

Some people are trying. The winners of a student competition at Yale University came up with a plug-in that would work with search engines to help people navigate the new media landscape. It sounds like a good idea, but their creation could make a bad situation worse.

We are all familiar with the ways – both good and bad – that the internet and social media have changed how we learn what’s going on around us.

Anyone can be a publisher now, putting out information that shows up alongside stories produced by major news organizations. Instead of everyone getting their information from a few trusted sources, people can now curate their news feeds, often receiving only information that corresponds with their ideological predispositions. Algorithms see what they like and deliver more of the same, filtering out opposing points of view.

That makes everyone vulnerable to intentionally fake news – made-up stories from made-up news organizations that are either propaganda meant to disinform, or commercial products designed to make money by drawing clicks. The end result is a world of ideological bubbles, where there is deep distrust of others who have not only different opinions, but often different facts as well.

The “fake news” program the students developed would flag stories that come from sites that have distributed fake news in the past, which sounds like a good idea, until you wonder: Who gets to decide what’s fake? Can the algorithm distinguish between intentional falsehoods and honest mistakes? Would the Chicago Tribune get on the blacklist for its “Dewey Defeats Truman” headline on early editions in 1948?
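To see why the "who decides?" question matters, the flagging mechanism described above can be sketched, hypothetically, as a simple domain blacklist lookup. Everything below is invented for illustration; the point is how much rides on whoever compiles the list:

```python
# Hypothetical sketch of a blacklist-based "fake news" flagger.
# The blacklist's contents are the crux: someone must decide what goes in it.
from urllib.parse import urlparse

# Invented placeholder entries -- in practice, who curates this set
# is the hard question the students' approach leaves open.
BLACKLIST = {"example-fake-news.test", "made-up-stories.test"}

def flag_story(url: str) -> bool:
    """Return True if the story's domain has distributed fake news before."""
    domain = urlparse(url).netloc.lower()
    # Note: this flags every story from a listed domain, including honest
    # mistakes and later corrections -- the lookup cannot tell them apart.
    return domain in BLACKLIST

print(flag_story("https://example-fake-news.test/story"))  # True
print(flag_story("https://chicagotribune.com/dewey-defeats-truman"))  # False
```

A lookup like this would have no way to distinguish an intentional falsehood from a wrong early-edition headline; both come from the same domain.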

More troubling, however, is the students’ approach to breaking the information bubble. In an attempt to expose people to opposing points of view, it would automatically feed readers what the algorithm considers to be the other side of the story.

So if a reader picks a story deemed negative about President Trump, the search engine would provide a positive one. That sounds good, but it feeds the notion that everyone is equally biased, no one is trying to find the truth and everyone is on a side that is just trying to win. That is the worldview that leads people to reject opposing views in the first place.

Whether news is positive or negative depends on your perspective.

Is hurricane coverage anti-weather propaganda or useful information? Do you need to balance a story about an oncoming storm with one that says heavy rain will cure drought?

The solution to algorithms that feed us fake news is not different algorithms. Machines aren’t going to save us from thinking.

We all need to do a better job of using our critical faculties when we build our theories of how the world works, making sure that we take into account all the facts, not just the ones that support our biases.

Technology may have changed, but the responsibilities of being a citizen in a democracy are still the same. It's not easy now, and it never was.
