The “echo chamber” inflates certain phenomena far beyond their true size

Enjoy Your Stay

To our private tendency to seek out opinions that strengthen our own and earn us social credentials, we must add the Facebook algorithm: a computerized brain that filters the information that reaches us. The result: a handful of people responsible for this algorithm determine what information reaches us and what does not.

The idea behind this is, of course, economic: Facebook’s goal as a commercial company is, after all, to make money. Its economic model depends on people entering the social network regularly and being exposed to the advertisers who pay Facebook. Facebook is therefore interested in us using its website as often as possible, for as long as possible. How does it do that? By creating a space more comfortable and pleasant than life in the physical world, where, whether we like it or not, we must be aware of the presence of the other, even if we dislike them.

On Facebook, unlike in the real world, the other person simply disappears. We do not have to know that he and his different opinions exist. And if we do, we are not exposed to his views in all their complexity, but only to a stereotypical caricature as perceived by us and by the virtual group we belong to.

Facebook’s algorithm identifies which content we like – based, among other things, on the likes, comments, and shares we engage in – and exposes us to similar content. In other words, Facebook exposes us to more and more of the same thing. Of course, Facebook wants us to stay online for as long as possible and to engage as much as possible, so that it can earn as much as possible. Presenting information that contradicts our own perceptions would not serve this purpose.
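The engagement-driven filtering described above can be sketched in a few lines. Everything here is hypothetical – the topic labels, weights, and function names are illustrative stand-ins, since Facebook’s actual ranking system is proprietary and far more complex:

```python
# Illustrative sketch only: a toy engagement-based feed ranker.
# Topic labels, weights, and function names are invented for this example.

from collections import Counter

def engagement_profile(interactions):
    """Tally how strongly the user has engaged with each topic."""
    profile = Counter()
    for topic, weight in interactions:
        profile[topic] += weight
    return profile

def rank_feed(posts, profile):
    """Order candidate posts by how well they match past engagement,
    so the user sees more of what they already interacted with."""
    return sorted(posts, key=lambda p: profile[p["topic"]], reverse=True)

# Hypothetical interaction history: (topic, engagement weight)
history = [("cats", 2), ("politics_left", 5)]
profile = engagement_profile(history)

candidates = [
    {"id": 1, "topic": "politics_right"},
    {"id": 2, "topic": "cats"},
    {"id": 3, "topic": "politics_left"},
]

feed = rank_feed(candidates, profile)
print([p["id"] for p in feed])  # → [3, 2, 1]
```

Note how the post the user never engaged with (`politics_right`) sinks to the bottom of the feed: the loop the article describes, in which past engagement determines future exposure, falls out of even this tiny model.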

“It’s not happening only on Facebook; it’s also happening on Google, for that matter – we get a personalized flow, and there’s a complex algorithm behind it,” explains John. “We don’t know exactly how it works, but we should remember that Facebook’s loyalty belongs to its shareholders, and that it’s a business in every sense. As a business, it must earn money by selling advertising space and by promising advertisers that as many eyes as possible will be exposed to their ads. And in order for us to stay on Facebook as long as possible, it shows us only what we want to see – which may be cats on a skateboard, but also news content that we like to consume.

“That’s why Facebook insists on not defining itself as a media company – because if it did, it would have to adopt journalistic ethics. Facebook also knows what we read outside of its website, what our political opinions are, and what we want to read. That’s how it knows what to feed us in an unprecedentedly personal form.”

The Element of Surprise

A similar situation occurs in traditional media, where editors and news anchors not only present reality as it is but are also able to construct and create it. But while we once believed that, unlike the press, television, and radio, the Internet was an open, democratic space allowing us all unlimited access to unlimited information, automatic algorithms are now the ones that decide for us what we see and, perhaps worse, what we don’t see. All of this encourages the “spiral of silence”, a theory holding that because people want to be like the majority, they will not make their voices heard if they feel their opinion differs from the popular public discourse.

As early as 2014, the Pew Research Center conducted a survey on Americans’ willingness to discuss, online and offline, the NSA (the United States National Security Agency) and the agency’s broad surveillance of civilians, exposed by former employee Edward Snowden. About 86% of the respondents said they were willing to discuss the matter with family, with friends, or at public meetings, but only 43% expressed willingness to discuss it on Facebook. “This type of self-censorship may mean that important information is not being transmitted,” the research center said. “Many hoped that social networks would create new paths encouraging a broader discourse and a broader spectrum of opinions, but we see the opposite – a spiral of silence that exists on the Internet as well.”

As a result, we tend to imagine a certain reality created by Facebook’s personalized newsfeed, even if the actual reality is completely different. Most of us consume our news through the newsfeed, but the newsfeed is more like a newspaper of close friends and much less a reliable news source. The content that appears there often consists of posts and articles we enjoy reading because they match our own opinions – and this is also why an article shared by a few of our Facebook friends can seem to us like a national consensus.

This resonance inflates certain phenomena far beyond their true size: a few years ago, for example, the videos of the “New Land” movement received hundreds of thousands of Facebook views, but in practice, in the 2013 elections, the movement did not pass the electoral threshold. A similar situation occurred in the last elections in Israel: users on the left-wing side of the political map looked at their Facebook newsfeeds and concluded that the “Machane Hazioni” party was closer than ever to winning the elections – so much so that when the right-wing party won, Internet users with left-wing views were stunned. “I don’t think people today are surrounded by more people similar to themselves than they used to be,” Dr. John said. “But because it’s on the Internet, we assume this to be true.”

“If even those who hold key roles in the media and prominent Facebook personalities share quite similar views, it creates the illusion of a very homogeneous environment, one that obscures other people’s voices. This can also happen on the right side of the political map, but those people are less represented in traditional media.”