Digital Services Act and Media Literacy
The Digital Services Act promises a new era of digital regulation in Europe, aiming to create a safe digital environment for users while holding platforms accountable.
DSA: A new era
As of 25 August 2023, designated “very large online platforms” (VLOPs) such as Google, Meta, X and YouTube must comply with the EU’s Digital Services Act (DSA) to continue operating in Europe, with other digital services following on 17 February 2024. With the DSA, the EU aims to protect consumers and their fundamental rights, create a more transparent online environment, and empower users in their digital experiences.
The DSA sets rules for online platforms, including VLOPs and search engines, to create a safer and more trusted online environment. These rules oblige platforms to act against the dissemination of illegal content and false information; on the users’ side, the DSA provides a means to report such content.
Media Literacy for Citizen Empowerment
But what if users don’t know precisely how or where to report? People are usually not aware of these obligations. For the DSA to fully function, users need to be empowered in their use of digital media. Empowerment gives them the ability to understand platform infrastructures and the power to think critically about their digital experiences: instead of passively consuming online content, they can exercise their agency. Therein lies the crucial role of media literacy within the DSA.
In a world where digital media is embedded in every part of our daily lives, we need to be conscious of what we consume and how we consume it. Appropriate media literacy tools help us achieve this. Media literacy, however, is not only about the what and the how: it also involves accessibility, critical analysis and the ability to convey a message purposefully, and it seeks to understand how media content shapes personal behaviour and societal values.
Some concepts, such as platform transparency and disinformation, gain particular relevance in this context. Let’s look at each in terms of the DSA.
Platform transparency is crucial to understanding the algorithms behind content moderation
Every digital platform runs on algorithms that perform tasks and make decisions on its behalf, such as matching users with content: they decide which content is put forward for each user. To be more critical about their digital experiences, users need to be informed about these algorithms. Because a recommendation algorithm is typically optimised for engagement, it carries biases and motivations rooted in maximising content consumption. Understanding the algorithm allows people to recognise the hidden bias and motivation behind what they see on the platform. Platform transparency, in turn, gives users insight into how and why they see certain content.
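To make this concrete, below is a minimal, purely illustrative Python sketch of an engagement-optimised feed ranker. The two signals, the scoring weights and the field names are all invented for the example; no real platform’s ranking is this simple, but the incentive structure is the point.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_click_rate: float   # model's estimate that the user will click
    predicted_watch_time: float   # seconds the user is expected to spend

def engagement_score(post: Post) -> float:
    # An engagement-optimised ranker scores content by expected consumption,
    # not by accuracy or diversity of viewpoints. Weights are arbitrary here.
    return 0.6 * post.predicted_click_rate + 0.4 * (post.predicted_watch_time / 60)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest expected engagement first: this is the hidden motivation
    # behind why each user sees what they see.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("calm news summary", predicted_click_rate=0.05, predicted_watch_time=40),
    Post("outrage-bait claim", predicted_click_rate=0.30, predicted_watch_time=90),
])
print([p.text for p in feed])  # the outrage-bait ranks first
```

Even this toy version shows why transparency matters: nothing in the score rewards truthfulness, so users who understand the objective can read their feeds more critically.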
On platform transparency, the DSA requires in Articles 15 and 24 that platforms publish reports about their content moderation. These reports are expected to include the amount of illegal content removed in a given period, the accounts terminated and the reasons for termination, and other transparency matters such as data collection and flagged content.
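What might the substance of such a report look like? Below is a hypothetical, heavily simplified sketch; the field names and figures are invented for illustration, since the regulation prescribes what must be reported, not a data schema.

```python
# A made-up, simplified shape for a DSA-style transparency report summary.
transparency_report = {
    "platform": "ExamplePlatform",
    "period": "2023-H2",
    "illegal_content_removed": 12_430,   # items taken down in the period
    "accounts_terminated": {
        "total": 1_052,
        "reasons": {"repeated_violations": 870, "impersonation": 182},
    },
    "user_reports_received": 98_115,     # content flagged by users
    "median_handling_time_hours": 26,
}

def headline(report: dict) -> str:
    # A plain-language summary is closer to what most users would actually
    # read than the long, technical reports published today.
    return (f"{report['platform']} removed {report['illegal_content_removed']:,} "
            f"illegal items and handled {report['user_reports_received']:,} "
            f"user reports in {report['period']}.")

print(headline(transparency_report))
```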
It is ambiguous who these reports are for. Do users even know they exist? They would probably not want to read such long and technical documents, though they might if they knew how crucial transparency is. To be more critical about their digital experience, users should understand why they see specific content and what lies behind it. Citizens also deserve to know what these platforms are doing with their data: specifically, when they interact with the platform and when the algorithm uses their data. How can they learn more about all this? You guessed it: by developing media literacy skills.
User empowerment is needed to tackle disinformation
On disinformation, the DSA establishes a “crisis response mechanism” in Article 36, under which platforms shall, in times of crisis, follow the Commission’s directions to curb disinformation. Crises are precisely the times when disinformation spreads most easily, and when citizens are most vulnerable to the information they see online. Very large online platforms in particular have become primary sources of information at such moments, since most people already use them as news sources, so it is essential that the content circulating on them is checked.
Should we believe everything we see online, assuming platforms are checking the validity of the content? The reality is more complex. As people interact with content that aligns with their beliefs, they increasingly find themselves in “echo chambers”—digital spaces created by the platform algorithms, where users primarily see information that reflects their own views. Over time, this creates a cycle where people hear “the same thing but in different words,” reinforcing their perspectives without encountering alternative viewpoints.
When this “same thing” is dis- or misinformation, people in these echo chambers can start to believe it as fact. As they repeatedly encounter slightly different versions of the same misinformation, it becomes harder to separate what is true from what is false. The cycle is only broken if platforms actively intervene to label or flag the content as inaccurate; if they don’t, users will continue to see and believe the misinformation, reinforcing their beliefs and spreading it further. So it is risky to rely entirely on platforms to manage what is true and what is false. If platforms choose not to act, especially in times of crisis, they could ignore calls from EU institutions, given the lack of clearly defined enforcement mechanisms in the DSA.
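The feedback loop described above can be made tangible with a toy simulation. Everything here is made up for illustration: “belief” is reduced to a single probability and the daily nudges are arbitrary numbers, but the dynamic of repeated exposure versus intervention comes through.

```python
import random

def simulate_echo_chamber(days: int = 30, platform_intervenes: bool = False,
                          seed: int = 42) -> float:
    # Toy model: 'belief' is the probability that a user accepts a false claim.
    # Each day the feed shows a reworded version of the same claim;
    # unchallenged exposure nudges belief up, a fact-check label nudges it down.
    random.seed(seed)
    belief = 0.3
    for _ in range(days):
        flagged = platform_intervenes and random.random() < 0.5  # label shown ~half the time
        belief = max(0.0, belief - 0.05) if flagged else min(1.0, belief + 0.04)
    return belief

print(f"no intervention:   belief ≈ {simulate_echo_chamber():.2f}")
print(f"with intervention: belief ≈ {simulate_echo_chamber(platform_intervenes=True):.2f}")
```

The asymmetry is the lesson: without intervention the loop only runs one way, which is exactly why relying solely on platforms to act is fragile.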
Without media-literate citizens, the DSA is not enough to create a safer online environment
While the DSA aims to hold platforms accountable for disinformation and illegal content, this alone isn’t enough to create a safer digital environment. We also need people to understand how these platforms operate and why they see specific types of content. For the DSA to be truly effective, users must be aware of their own power to report and flag harmful content. This is where media literacy becomes essential: it equips people to recognise the impact of algorithms and empowers them to take action, fostering a more informed and responsible digital community.
–
References and For More:
Namle.org – Why is Media Literacy Important?
European Commission – Media literacy
European Commission – The Digital Services Act
Official Journal of the European Union – DSA Regulation
Zach Meyers – Will the DSA Save Europe From Disinformation?
Emma Roth – The EU’s Digital Services Act goes into effect today: here’s what that means