In 2011, Eli Pariser delivered a TED Talk titled “Beware Online ‘Filter Bubbles.’” A few months later, he published his book, “The Filter Bubble,” which addresses the same topic. In both, Pariser identifies one of the creation myths of the Internet: while newspaper editors and TV producers once controlled what audiences saw and, to a certain extent, how they perceived current events, the Internet broke down this system by allowing complete, unfettered access to information.
Pariser goes on to debunk this myth, explaining how Internet users and invisible algorithms function together to create a curated media experience. By choosing to “like” or “follow” specific stories, pages and users, we paint a picture for sites like Google and Facebook of our interests, preferences and worldview. These sites’ algorithms then set to work ensuring we are shown more of the same. The more we “like” a certain type of content, the more similar content becomes visible in our searches and news feeds.
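The feedback loop described above can be illustrated with a toy ranking rule. This is a minimal sketch for illustration only, not the actual algorithm used by Facebook or Google; the stories, tags and scoring function are all hypothetical.

```python
from collections import Counter

def recommend(liked_items, candidates, k=3):
    """Toy model of like-driven filtering: rank candidate stories by
    how many topic tags they share with stories the user has liked."""
    # Build a profile by tallying tags across everything the user engaged with.
    profile = Counter(tag for item in liked_items for tag in item["tags"])
    # Score each candidate by its tag overlap with that profile.
    scored = sorted(
        candidates,
        key=lambda item: sum(profile[tag] for tag in item["tags"]),
        reverse=True,
    )
    return [item["title"] for item in scored[:k]]

liked = [
    {"title": "Cat memes of the week", "tags": ["cats", "humor"]},
    {"title": "Top 10 celebrity moments", "tags": ["celebrity", "humor"]},
]
feed = [
    {"title": "More cat videos", "tags": ["cats", "humor"]},
    {"title": "Election policy analysis", "tags": ["politics", "policy"]},
    {"title": "Celebrity gossip roundup", "tags": ["celebrity"]},
]
print(recommend(liked, feed, k=2))
```

Because the policy story shares no tags with anything the user has liked, it scores zero and falls to the bottom of the feed; each new “like” for similar content only widens that gap, which is the self-reinforcing loop Pariser describes.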
According to Pariser, the problems that arise from this are two-fold. First, it becomes possible for users to unwittingly opt-out of meaningful news coverage. By repeatedly clicking on cute cat memes and pop culture listicles, we encourage sites to show us more and more of the content Pariser describes as “information junk food,” until “information vegetables” — thoughtful articles and serious journalism — are completely eliminated from our media diet.
Those who continue to eat their “vegetables” are in danger of having their feeds filtered to cater to their biases. Gilad Lotan recently published his findings on this topic on Medium.com. He explains:
“As we construct our online profiles based on what we already know, what we’re interested in, and what we’re recommended, social networks are perfectly designed to reinforce our existing beliefs … Content that makes us uncomfortable, is filtered out.”
Users who don’t bypass news altogether run the risk of bypassing stories that contradict their worldview. Lotan goes on to explore how this type of filtering has increased polarization on both sides of the Israel-Hamas conflict. Have the Internet, social media and algorithmic filters contributed to political polarization in the United States? Has the ability to subsist on a diet of “information junk food” led to increased apathy towards foreign and even domestic events?
We invited you to share your thoughts in a Twitter chat. Gilad Lotan (@gilgul) joined the conversation to discuss his work on this topic. Read a transcript of the chat below.