No more disagreements with filter bubbles.
If you interact only with people who agree with you, you will fall into the illusion that your way of thinking is prevalent.


When we interact only with those who share our opinions, we not only lose the chance to engage with divergent ideas; we also fall into the illusion that our way of thinking is prevalent.

And this is just one of the side effects of “filter bubbles”, a concept coined by internet activist Eli Pariser to describe the system of algorithms that builds a profile of each user from their online behavior and personalizes the content offered to them according to that profile. As a result, the information we receive tends to carry the same bias as the sources we have habitually visited.
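To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of the kind of profile-based personalization Pariser describes. The function names and data are purely illustrative assumptions and do not correspond to any real platform's code.

```python
from collections import Counter

def build_profile(clicked_topics):
    """Build a crude 'user profile': how often each topic was clicked in the past."""
    return Counter(clicked_topics)

def personalize_feed(candidates, profile):
    """Rank candidate articles so topics the user already favors come first."""
    return sorted(candidates, key=lambda a: profile.get(a["topic"], 0), reverse=True)

# A reader who has mostly clicked sports keeps getting sports on top,
# while topics they never clicked sink out of sight.
profile = build_profile(["sports", "sports", "politics", "sports"])
feed = personalize_feed(
    [
        {"title": "Election analysis", "topic": "politics"},
        {"title": "Match highlights", "topic": "sports"},
        {"title": "Climate report", "topic": "science"},
    ],
    profile,
)
print([a["title"] for a in feed])
# ['Match highlights', 'Election analysis', 'Climate report']
```

Note that the ranking is driven entirely by past behavior: nothing the reader has never clicked can rise to the top, which is exactly the “intellectual status quo” problem discussed next.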

In such dynamic times, it is an anachronism to have the profile that shapes the new information we receive molded by our past behavior. Until recently, the quality of the information we obtained from traditional media depended on the competence and ethics of human editors, who were subject to public criticism and their own consciences; on the internet, it depends on cold algorithms that tend to preserve each user's intellectual status quo, without question or self-criticism.

Thus, internet users are unaware of the information left off the menu offered to them, which limits what they can see of the world. Pariser goes further, saying: “A world built from what is familiar to us is a world where there is nothing to learn, since it is invisible self-advertising, indoctrinating us with our own ideas.”

He also warns that the invisible algorithms editing the web can limit our exposure to new information and narrow our views, making people more vulnerable to advertising and manipulation. We run the risk of becoming a society that hears and sees only what pleases it, fragmented into worlds that each look more and more like our own, where we end up isolated with the echo of our own voice.

Internet search filters are becoming a mechanism for reproducing our own ideas, undermining one of the greatest qualities of the network: finding what you were not looking for. In addition, personalized filters show us content similar to what we already access most. Often these links are trivia we click out of curiosity or for fun, yet the algorithms push them to the top, leaving denser, less-accessed content behind.

In Pariser's words, instead of a balanced information diet, we are left with only fat and sugar. The old tension between immediate and long-term rewards comes into play.
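As a rough illustration of that “fat and sugar” dynamic, the sketch below ranks links purely by accumulated clicks, so light entertainment rises and denser pieces sink regardless of substance. The data and function are hypothetical, meant only to show the shape of the problem.

```python
def rank_by_clicks(items):
    """Sort links by accumulated clicks, highest first; substance plays no role."""
    return sorted(items, key=lambda item: item["clicks"], reverse=True)

links = [
    {"title": "Cat video compilation", "clicks": 9800},
    {"title": "In-depth policy analysis", "clicks": 120},
    {"title": "Celebrity gossip quiz", "clicks": 7400},
]

for link in rank_by_clicks(links):
    print(link["clicks"], link["title"])
# 9800 Cat video compilation
# 7400 Celebrity gossip quiz
# 120 In-depth policy analysis
```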

Pariser warns that it is not easy to escape this trap, since the monitoring covers everything from the clicks we make to the type of computer we use and where we are located. But one possibility is to diversify the focus of our interests on the web as much as possible, avoiding always visiting the same sites or always searching for the same subjects. It is valid advice on both a personal and a professional level, because engaging with diversity is what drives growth.

Leandro Correa

Beat Communication
