It is easy to get the impression that you are seeing the same information as everyone else on the internet. In reality, every time you go online, algorithms record what you search for, which sites you visit, and what you choose to click on. They then show you material they predict you will enjoy, and they keep refining that prediction until they are only showing you things you are likely to consume. That sounds pretty good, right? Who would not like a little personalisation in their everyday life?
The problem is that this process cuts you off from important sources of information and from viewpoints you might otherwise have encountered. As a result, you start missing vital details and become isolated in a filter bubble. The Cambridge Dictionary defines a filter bubble as “a situation in which someone only hears or sees news and information that supports what they already believe and like, especially a situation created on the internet as a result of algorithms that choose the outcome of someone’s searches”. In short, a filter bubble is a state of intellectual isolation.
Most of your online activity is captured through data trackers and cookies. Cookies are a necessary part of web browsing: they let websites remember your preferences, logins, shopping carts, and more. But they also store a wealth of data, potentially enough to identify you without your consent. EU law on cookie consent states that web users must be offered the choice to accept or reject such tracking, yet most websites present a skewed choice: typically a one-click opt-in for all cookies versus a tedious, multi-step opt-out.
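To make this concrete, here is a minimal sketch of how a persistent tracking cookie is issued, using Python's standard library. The cookie name visitor_id and its attributes are illustrative choices for this example, not taken from any particular site.

```python
# Minimal sketch of a persistent tracking cookie (illustrative only).
import uuid
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["visitor_id"] = uuid.uuid4().hex               # random ID unique to this browser
cookie["visitor_id"]["path"] = "/"                    # sent with every request to the site
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # persists for a year

# The server sends this header once; the browser then attaches the same ID
# to every future request, letting the site link your visits together.
print(cookie.output())
# e.g. Set-Cookie: visitor_id=9f1c...; Path=/; Max-Age=31536000
```

A cookie like this carries no name or email, but a stable identifier attached to every page view is all a tracker needs to build a profile of you over time.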
Beyond cookies, many big tech companies build data-tracking algorithms directly into their products. Netflix, YouTube, Spotify, and Apple Music all recommend content they think you will enjoy after learning how you use their apps and which series, songs, and creators you like and dislike. The more you interact with specific content, the more your feed fills with that content. The goal is to match each user with the right content and keep them on the platform for as long as possible. While these algorithms give us personalized recommendations for movies, songs, and other entertainment, they become dangerous when applied to news and information. Facebook, Twitter, and Instagram all use similar algorithms to tailor each individual’s experience to their interests and views.
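The feedback loop described above can be illustrated with a deliberately simplified sketch. This is not any platform's actual algorithm; the users, items, and numbers are made up, and real systems are vastly larger and proprietary, but the reinforcement dynamic is the same.

```python
# Toy item-to-item recommender: a simplified sketch of the feedback loop,
# not any platform's real system.
import numpy as np

# Rows = users, columns = items; 1 means the user engaged with the item.
interactions = np.array([
    [1, 1, 0, 0],   # user 0 engaged with items 0 and 1
    [1, 0, 0, 1],
    [0, 1, 1, 0],
])

def recommend(user: int, k: int = 2) -> list:
    """Rank unseen items by similarity to the user's engagement history."""
    # Item-item cosine similarity computed from co-engagement counts.
    co = interactions.T @ interactions
    norms = np.sqrt(np.diag(co))
    sim = co / np.outer(norms, norms)
    # Score each item by how much it resembles what the user already
    # consumed, then hide items the user has seen.
    scores = sim @ interactions[user]
    scores[interactions[user] == 1] = -np.inf
    return list(np.argsort(scores)[::-1][:k])

print(recommend(0))  # the items most like what user 0 already clicked on
```

Even in this toy version the narrowing is visible: every click raises the score of similar items, so what you are shown next drifts ever closer to what you already consumed.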
These user-generated content (UGC) platforms try to reduce your exposure to content that conflicts with your beliefs and to show you more information that reaffirms your existing views. Because these algorithms are opaque and never ask for permission, you may not even be aware that you live in a filter bubble. It is a tactic designed to make you spend more time on those platforms. Facebook has been scrutinized for allowing disinformation to spread, most infamously during the United States’ 2016 presidential election. Trump’s win came as a complete surprise to everyone following the traditional media and the election polls. His campaign, however, chose social media over established outlets to spread its message, and there he outperformed his opponent in total followers and user engagement. That gave him an edge with people who had grown distant from the media over the years and had come to trust individuals they know personally over journalists. In the U.K., a similar pattern emerged during the EU Referendum, when the “Leave” campaign was more successful at spreading its hashtags and messages through social media.
The situation only worsens when every single one of us is trapped in our own little bubble. It is hard to have an informed conversation about the facts if everyone believes they have the whole story on a current event when in reality they only know half of it, and that makes it impossible for anybody to reach an informed judgment. Filter bubbles lead to a lack of knowledge and a reluctance to consider opposing perspectives and unpleasant facts, and they can even provoke radical reactions when people are confronted with opposing viewpoints. As Kendra Cherry puts it, “during an election season, for example, people tend to seek positive information that paints their favored candidates in a good light. They will also look for information that casts the opposing candidate in a negative light.” This is a clear example of confirmation bias, which can lead us to poor decisions because it distorts the reality from which we draw evidence.
These effects of filter bubbles set off a chain reaction that produces echo chambers, where people assume that everyone thinks like them and forget that other perspectives exist, which in turn encourages a biased search for and interpretation of information. We become victims of our own biases and, collectively and unintentionally, undermine our democracies. Filter bubbles are something we should be aware of and fight against, both by opening ourselves to new perspectives and by holding the big tech companies accountable for their algorithms and tactics. Hopefully, we can then regain some of the power we once had over our online lives and, at the same time, gain a better and broader idea of what others in our societies think.