
09/01/2026
In recent years, digital technology has become increasingly essential to children’s daily lives. While Internet access gives children the opportunity to connect with their peers, learn new things, and express themselves creatively, it also exposes them to a variety of risks and threats.
The most alarming threat they are exposed to is sexual abuse and exploitation. Child sexual abuse (CSA) is “the involvement of a child in sexual activity that he or she does not fully comprehend, is unable to give informed consent to, or for which the child is not developmentally prepared and cannot give consent” (World Health Organization, 1999). Child sexual exploitation (CSE) is a form of sexual abuse involving the “abuse of a position of vulnerability, differential power, or trust, for sexual purposes, including, but not limited to, profiting monetarily, socially, or politically from the sexual exploitation […] It includes but is not limited to exchanging money, employment, goods or services for sex […] It also includes any situation where sex is coerced or demanded by withholding or threatening to withhold goods or services or by blackmailing” (UNHCR).
The use of the Internet to perpetrate or facilitate the sexual abuse and exploitation of children is a widespread issue. The increased use of information and communication technologies (ICTs) by both children and perpetrators has led to the ever-increasing production and distribution of child sexual abuse material (CSAM), which refers to “the material depicting acts of sexual abuse and/or focusing on the genitalia of the child” (ECPAT International, 2016). The rise of AI-driven technology has also led to new forms of CSAM, such as digitally generated child sexual abuse material, which depicts artificially created children involved in sexual activities and/or presented in a sexualised manner. Other forms of online child sexual exploitation include “online grooming”, the practice by which adults establish a relationship with children in order to initiate either online or offline sexual contact, and “sexting”, the intentional sharing of sexually explicit content or self-generated sexualised images of oneself. “Sextortion” is a form of blackmail in which a child is coerced into providing sexual favours, money or other benefits under the threat that intimate or sexually explicit material involving them will be shared without their consent. Another complex form of exploitation is “live online child sexual abuse”, the real-time broadcasting on live-streaming platforms, such as Skype, of children’s sexual acts to offenders in remote locations.
Online child sexual abuse and exploitation is a critical global issue, with data highlighting its scale and severity. The 2025 edition of the Into the Light Index on Global Child Sexual Exploitation and Abuse from the Childlight Global Child Safety Institute, hosted by the University of Edinburgh in Scotland, found that one in five children in Western Europe (19.6%, nearly 15 million children) report experiencing online solicitation or grooming before turning 18, and that one in seven children (13.5%) reported facing child sexual abuse and exploitation (CSAE) in the past year alone. The Index also shows that Western Europe was the primary host of CSAM on the Internet in both 2023 and 2024, with the Netherlands accounting for the highest volume of CSAM (over 60% of all reported CSAM from Western Europe) and the highest CSAM availability rate. Moreover, Childlight revealed a 1,325% annual rise in AI-generated CSAM, with 67,000 reports in 2024, compared to 4,700 logged by the National Center for Missing & Exploited Children in 2023. In the digital era, the international and regional conventions that aim to prevent and combat the sexual exploitation and abuse of children, such as the 1989 Convention on the Rights of the Child, its 2000 Optional Protocol on the sale of children, child prostitution and child pornography, and the Council of Europe’s 2007 Convention on the Protection of Children against Sexual Exploitation and Abuse, have proved insufficient.
As children across Europe face unprecedented levels of violence, the European Union has taken various actions to address the misuse of online services for child sexual abuse. As part of the EU Strategy for a More Effective Fight Against Child Sexual Abuse, presented on 24 July 2020, the European Commission submitted four legislative proposals between 2020 and 2024. The first was an interim regulation (Regulation (EU) 2021/1232) that allowed service providers, such as Facebook Messenger and WhatsApp, to derogate from the privacy provisions of the ePrivacy Directive, enabling them to continue voluntarily detecting, reporting, and removing child sexual abuse content in their systems. As the interim regulation allowing voluntary detection was set to expire in August 2024, the European Commission adopted a proposal for permanent rules, the ‘EU’s Regulation to Prevent and Combat Child Sexual Abuse’, on 11 May 2022. Due to ongoing disagreement over the proposal, the interim derogation was extended in 2024 until 3 April 2026.
EU CSA regulation and its critics
The ‘EU’s Regulation to Prevent and Combat Child Sexual Abuse’, also referred to as the ‘CSA Regulation’, ‘CSAM proposal’, ‘CSAR’, or, by its critics, ‘Chat Control’, is a lex specialis to the EU’s Digital Services Act and was put forward alongside the ‘Better Internet for Kids’ strategy (BIK+). The Regulation aims to establish a legal framework that requires online service providers to detect, report, and remove CSAM from their services. This detection would also apply to end-to-end encrypted (E2EE) interpersonal communications services, such as WhatsApp, as well as to web-based email services, social media platforms, and app stores. The proposal also foresees the creation of an independent hub of expertise, the EU Centre on Child Sexual Abuse (EUCSA), which would review and assess the reports received from online providers, identifying erroneous reports and forwarding relevant ones to national authorities, while also supporting victims and ensuring the removal of CSAM in which they are depicted.
At the European Parliament, the file was allocated to the Committee on Civil Liberties, Justice and Home Affairs (LIBE), where the rapporteur presented his draft report on 19 April 2023. The draft report introduces new amendments, such as restricting CSAM detection in end-to-end encrypted communications, and emphasises the importance of providers integrating preventive measures into their online services. While supporting the creation of the EUCSA, the report also calls for the establishment of a Victims’ Consultative Forum within the Centre, and for the continuation of voluntary detection alongside mandatory measures. After the report was adopted by the LIBE committee and approved in plenary, Parliament decided on 22 November 2023 to enter interinstitutional negotiations. At the Council, a common position was reached on 26 December 2025 under the current Danish Presidency. The Council position rejects mandatory detection orders in favour of extending the voluntary scanning of companies’ services, thereby making the current temporary derogation from the ePrivacy Directive permanent. Moreover, the Council introduces new risk mitigation measures, such as reporting tools, to be implemented by service providers on the basis of a new tripartite risk categorisation of services (low, medium, and high risk). Technical trilogues are expected in January 2026, following the first trilogue held on 9 December 2025.
Over the years, law enforcement, civil society groups, non-profit organisations, data protection authorities, and private citizens have expressed concerns about the mass screening of private communications, particularly encrypted messages. Since its proposal, many initiatives, such as the German party Young Liberals’ StopChatControl.eu and the citizen-led initiative Fight Chat Control, have actively and continuously opposed the CSA Regulation, while also raising awareness of its implications. The European Digital Rights association (EDRi) created the Stop Scanning Me movement and signed, alongside 134 civil society and professional organisations working across human rights, media freedom, technology and democracy – such as the European Center for Democracy and Technology, the Committee to Protect Journalists, the European Federation of Journalists (EFJ) and the Global Forum for Media Development – an open letter urging the European Commissioners to withdraw the Regulation. The letter, “European Commission: uphold privacy, security and free expression by withdrawing new law”, states that the Regulation’s application would be incompatible with the EU Charter of Fundamental Rights, and highlights that mass scanning of private communications would be detrimental to everyone’s privacy, security, and freedom of expression. The letter also argues that it would undermine media freedom and weaken journalists’ digital security. Experts and child protection services, such as the Dutch child protection hotline EOKM and the German child protection association Kinderschutzbund, have also pointed out that the proposed measures would be harmful and likely ineffective.
The decision of the current Danish Presidency to give the CSA Regulation high priority has revived debates and protests across Europe and beyond. An open letter signed by more than 470 scientists and researchers from 34 countries across Europe states that the Regulation’s powers of detection, scanning and access to private communication “will create unprecedented capabilities for surveillance, control, and censorship and have an inherent risk for function creep and abuse by less democratic regimes”. The letter also considers the high potential of AI-based content scanning to produce false positives, which would lead to the incrimination of innocent individuals and conversations. In a statement, the European Centre for Press and Media Freedom (ECPMF) shared the content of the letter and condemned the proposal as an infringement of the right to privacy that ultimately undermines press freedom and democracy. Its contribution also highlights the potential misuse of mass surveillance by authoritarian regimes, which could redirect it toward monitoring political speech or journalistic investigations, a concern many critics of the proposal have raised. Tech companies have also expressed concerns about the impact the legislation would have on the EU’s digital economy, stating that it would undermine the EU’s competitiveness compared to regions that protect digital rights and innovation. Another open letter, signed in October 2025 by over 40 VPN and encrypted messaging services, including Proton, NordVPN, Tuta, and Element, considers how the legislation would undermine European cybersecurity and digital sovereignty, while eroding the trust that European businesses have established internationally. Platforms such as Meta and Signal have also stated that the proposal would endanger privacy and freedom, with the latter threatening to leave the EU market.
The crucial role of media literacy and platform safety-by-design features
As the EU holds a central legislative role concerning children’s rights in the digital environment, educating children remains one of the key preventive measures against online child sexual abuse and exploitation. As studies show, digital and media literacy is fundamental for children to engage safely and effectively in digital environments (Ey, 2024). The ability to analyse digital content and communication provides users with the interpreting and decoding skills necessary to question who created the content they are engaging with, how it was made, what meaning it conveys, who the intended audience is, and what it represents. These skills are critical to preventing online abuse, as predators take advantage of children and gain their attention and trust by using fake photos, videos, and profiles. Moreover, the ability to critically evaluate content equips children with the capacity to assess its credibility, while creating and sharing audio-visual and written content online gives them the skills needed to understand the logic of media. This enables them both to recognise predators’ deceptive practices, such as photo editing, filters, and other forms of manipulation, and to distinguish between private and public forms of information sharing, while reflecting on the potential consequences of what they choose to share.
However, digital and media literacy is insufficient without legal frameworks that require online platforms to design safe and age-appropriate services (Eurochild, 2025). Vulnerable children, such as those with disabilities or living in poverty, are often left behind. It is important to remember that the Internet and many digital technologies were created by adults and for adults; they were not designed with children’s rights in mind. Beyond the preventive measures already in force, it is crucial that technology companies, particularly social media companies, create online services and products with child rights in mind (‘Child Rights by Design’). “Safety-by-design” technologies and features are “practices that put user safety at the core of designing online platforms and technologies” (Eurochild, 2025). These include measures such as child-friendly reporting mechanisms and effective content moderation that can prevent harm before it happens.
It is only when technological corporations begin to prioritise the safety of their vulnerable users over profit and join efforts with other stakeholders that it will be possible to pave the way for the realisation of children’s rights online.
References
Childlight Global Child Safety Institute. 2025. Into the Light Index on Global Child Sexual Exploitation and Abuse.
European Digital Rights (EDRi). 2025. CSA Regulation Document Pool.
Eurochild. 2025. The Rights of Children in the Digital Environment.
European Parliament. 2025. New Legislation to Fight Child Sexual Abuse Online.
United Nations Children’s Fund (UNICEF). n.d. Keeping Children Safe Online.
World Health Organization. 1999. Report of the Consultation on Child Abuse Prevention.

09/01/2026
In recent years, digital technology has become increasingly essential to children’s daily lives. As Internet access provides them with the opportunity to connect with their peers, learn new things, and creatively express themselves, it also makes them vulnerable to a variety of risks and threats.
The most alarming threat they are exposed to is sexual abuse and exploitation. Child sexual abuse (CSA) is “the involvement of a child in sexual activity that he or she does not fully comprehend, is unable to give informed consent to, or for which the child is not developmentally prepared and cannot give consent” (World Health Organization, 1999). Child sexual exploitation (CSE) is a form of sexual abuse that attempts to “abuse of a position of vulnerability, differential power, or trust, for sexual purposes, including, but not limited to, profiting monetarily, socially, or politically from the sexual exploitation […] It includes but is not limited to exchanging money, employment, goods or services for sex […] It also includes any situation where sex is coerced or demanded by withholding or threatening to withhold goods or services or by blackmailing” (UNHCR).
The use of the Internet to perpetrate or facilitate the sexual abuse and exploitation of children is a widespread issue. The increased use of information and communication technologies (ICTs) by both children and perpetrators has led to the ever-increasing production and distribution of child sexual abuse material (CSAM), which refers to “the material depicting acts of sexual abuse and/or focusing on the genitalia of the child” (ECPAT International, 2016). The rise of AI-driven technology has also led to the production of new forms of CSAM, such as digitally generated child sexual abuse material, which refers to the material depicting artificially created children involved in sexual activities and/or in a sexualised manner. Other forms of online child sexual exploitation include “online grooming”, the practice by which adults establish a relationship with children to initiate either online or offline sexual contact, and “sexting”, the process by which someone intentionally shares sexually explicit content or self-generated sexualised images of themselves. “Sextortion” is a form of blackmail in which a child is coerced into providing sexual favours, money or other benefits under the threat that intimate or sexually explicit material involving them will be shared without their consent. Another form of complex exploitation is “live online child sexual abuse”, the real-time broadcasting on live-streaming platforms, such as Skype, of children’s sexual acts to offenders in remote locations.
Online child sexual abuse and exploitation is a critical global issue, with data highlighting its scale and severity. The 2025 edition of the Into the Light Index on Global Child Sexual Exploitation and Abuse from the Childlight Global Child Safety Institute, hosted by the University of Edinburgh in Scotland, found that one in five children in Western Europe (19.6%, nearly 15 million children) report experiencing online solicitation or grooming before turning 18, and that one in seven children (13.5%) reported facing CSAE in the past year alone. The Index also shows that Western Europe was the primary host of CSAM on the Internet in both 2023 and 2024, with the Netherlands accounting for the highest volume of CSAM (over 60% of all reported CSAM from Western Europe) and the highest CSAM availability rate. Moreover, Childlight revealed a 1,325% annual rise in AI-generated CSAM, with 67,000 reports in 2024, compared to 4,700 logged by the National Centre for Missing and Exploited Children in 2023. In the digital era, both international conventions, such as the Convention on the Rights of the Child of 1989, and the Optional Protocol to the Convention on the Rights of the Child on the sale of Children, child prostitution and child pornography of 2000, and regional conventions, such as the Council of Europe’s Convention on the Protection of Children against Sexual Exploitation and Abuse of 2007, that aim to prevent and combat the sexual exploitation and abuse of children, have proved to be insufficient.
As children across Europe are facing unprecedented levels of violence, the European Union has taken various actions to effectively address the misuse of online services for child sexual abuse. As part of the EU Strategy for a More Effective Fight Against Child Sexual Abuse, presented on 24 July 2020, the European Commission submitted four legislative proposals between 2020 and 2024. The first legislative proposal was an interim regulation (Regulation EU 2021/1232) that allowed service providers, such as Facebook Messenger and WhatsApp, to derogate from the privacy provisions of the ePrivacy Directive, enabling them to continue voluntarily detecting, reporting, and removing child sexual abuse content in their systems. As the interim allowing voluntary detection was set to expire in August 2024, the European Commission adopted a proposal on permanent rules, the ‘EU’s Regulation to Prevent and Combat Child Sexual Abuse’, on 11 May 2022. Due to ongoing disagreement over the proposal, the ‘Interim Derogation’ was extended until 3 April 2026 in 2024.
EU CSA regulation and its critics
The ‘EU’s Regulation to Prevent and Combat Child Sexual Abuse’, also referred to as the ‘CSA Regulation’, ‘CSAM proposal’, ‘CSAR’, or ‘Chat Control’ by its critics, is a lex specialis to the EU’s Digital Services Act, that was put forward alongside the ‘Better Internet for Kids’ strategy (Bik+). The Regulation aims to establish a legal framework that requires online service providers to detect, report, and remove CSAM from their services. This detection would also apply to end-to-end encrypted (E2EE) interpersonal communications services, such as WhatsApp, as well as to web-based email services, social media platforms, and app stores. The proposal also foresees the creation of an independent hub of expertise, the EU Centre on Child Sexual Abuse (EUCSA), which would review and assess the reports received from online providers, identifying erroneous reports and forwarding relevant reports to national authorities, while also supporting victims and ensuring the removal of CSAM in which they are depicted.
At the European Parliament, the file was allocated to the Committee on Civil Liberties, Justice and Home Affairs (LIBE), where the rapporteur presented his draft report on 19 April 2023. The draft report includes new amendments, such as restricting CSAM detection in end-to-end encrypted communications, and it emphasises the importance of integrating preventive measures into online services by providers. While supporting the creation of the EUCSA, the report also calls for the establishment of a Victims’ Consultative Forum within the Centre, and for the continuation of voluntary detection alongside mandatory measures. After it was adopted by the LIBE committee and approved in the Plenary session, it was decided to enter interinstitutional negotiations on 22 November 2023. At the Council, a common position was reached on 26 December 2025 under the current Danish Presidency. The Council position rejects mandatory detection orders to extend the voluntary scanning of companies’ services, therefore making the current temporary derogation measure from the ePrivacy directive permanent. Moreover, the Council introduces new risk mitigation measures, such as reporting tools, to be implemented by service providers based on a new tripartite service’s risk categorisation (low, medium, and high risk). Technical trilogues are expected in January 2026, following the first trilogue held on 9 December 2025.
Over the years, law enforcement, civil society groups, non-profit organisations, data protection authorities, and private citizens have expressed concerns about the mass screening of private communications, particularly encrypted messages. Since its proposal, many initiatives, such as the German party Young Liberals’ StopChatControl.eu and the citizen-led initiative Fight Chat Control, have actively and continuously opposed the CSA Regulation, while also raising awareness of its implications. The European Digital Rights association (EDRi) created the Stop Scanning Me movement, and signed, along one hundred thirty-four civil societies and professional organisations working across human rights, media freedom, technology and democracy – such as the European Center for Democracy and Technology, The Committee to Protect Journalists, the European Federation of Journalists (EFJ) and the Global Forum for Media Development – an open letter urging the European Commissioners to withdraw the Regulation. The letter “European Commission: uphold privacy, security and free expression by withdrawing new law” states that its application would be incompatible with the EU Charter of Fundamental Rights, and highlights that mass scanning of private communications would be detrimental to everyone’s privacy, security, and freedom of expression. The letter also shows that it would undermine media freedom and weaken journalists’ digital security. Experts and child protection services, such as the Dutch child protection hotline EOKM and the German child protection association Kinder-schutzbund, have also pointed out that the proposed measures would be harmful and likely ineffective.
The decision of the current Danish Presidency to give the CSA Regulation high priority has revived debates and protests across Europe and beyond. An open letter signed by more than 470 scientists and researchers from 34 countries across Europe states that the powers of the Regulation for detection, scanning and access to private communication “will create unprecedented capabilities for surveillance, control, and censorship and have an inherent risk for function creep and abuse by less democratic regimes”. The letter also considers the high potential of AI-based content scanning to produce false positives, which would lead to the incrimination of innocent individuals and conversations. In a statement coordinated by the European Centre for Press and Media Freedom (ECPMF), the ECPMF has shared the content of the letter and has condemned the proposal as an infringement of the right to privacy, ultimately undermining press freedom and democracy. Their contribution also highlights the potential misuse of mass surveillance by authoritarian regimes, which could redirect it toward monitoring political speech or journalistic investigations — a concern many critics of the proposal have raised. Tech companies have also expressed concerns about the impact that the legislation would have on the EU’s digital economy, stating that it would undermine the EU’s competitiveness compared to regions that protect digital rights and innovation. Another open letter, signed in October 2025 by over 40 VPN and encrypted messaging services, including Proton, NordVPN, Tuta, and Element, considers how the legislation would undermine European cybersecurity and digital sovereignty, while eroding the trust that European businesses have established internationally. Platforms, such as Meta and Signal, have also stated that the proposal would endanger privacy and freedom, with the latter threatening to leave the EU market.
The crucial role of media literacy and platform safety-by-design features
As the EU holds a central legislative role concerning children’s rights in the digital environment, educating children remains one of the key preventive measures to combat online child sexual abuse and exploitation. As studies show, digital and media literacy is fundamental for children to engage safely and effectively in digital environments (Ey, 2024). The ability to analyse digital content and communication provides users with the interpreting and decoding skills necessary to question who created the content they are engaging with, how it was made, what meaning it conveys, who the intended audience is, and what it represents. These skills are critical to prevent online abuse, as predators take advantage of children and gain their attention and trust by using fake photos, videos, and profiles. Moreover, the ability to critically evaluate content equips children with the capacity to assess its credibility, while creating and sharing online audio-visual and written content equips them with the skills necessary to understand the logic of media. This enables them both to recognise predators’ deceptive practices, such as photo editing, filters, and other forms of manipulation, and to discern between private and public forms of information sharing, while reflecting on the potential consequences of what they choose to share.
However, digital and media literacy is insufficient without legal frameworks that mandate that online platforms design safe and age-appropriate services (Eurochild, 2025). Vulnerable children, such as those with disabilities or living in poverty, are often left behind. It is important to remember that the Internet and many digital technologies were created by adults and for adults — they were not designed with children’s rights in mind. Despite all preventive measures in force, it is crucial that technology companies, particularly social media companies, create online services and products with child rights in mind (‘Child Rights by Design’). “Safety-by-design” technologies and features are “practices that put user safety at the core of designing online platforms and technologies” (Eurochild, 2025). These include measures such as child-friendly reporting mechanisms and effective content moderation, that can prevent harm before it happens.
It is only when technological corporations begin to prioritise the safety of their vulnerable users over profit and join efforts with other stakeholders that it will be possible to pave the way for the realisation of children’s rights online.
References
Childlight. 2025. Into the Light Index.
European Digital Rights (EDRi). 2025. CSA Regulation Document Pool.
Eurochild. 2025. The rights of Children in the Digital Environment.
European Parliament. 2025. New Legislation to Fight Child Sexual Abuse Online.
United Nations Children’s Fund (UNICEF). (n.d). Keeping Children Safe Online.
World Health Organisation. 1999. Report of the Consultation on Child Abuse Prevention.

09/01/2026
In recent years, digital technology has become increasingly essential to children’s daily lives. As Internet access provides them with the opportunity to connect with their peers, learn new things, and creatively express themselves, it also makes them vulnerable to a variety of risks and threats.
The most alarming threat they are exposed to is sexual abuse and exploitation. Child sexual abuse (CSA) is “the involvement of a child in sexual activity that he or she does not fully comprehend, is unable to give informed consent to, or for which the child is not developmentally prepared and cannot give consent” (World Health Organization, 1999). Child sexual exploitation (CSE) is a form of sexual abuse that attempts to “abuse of a position of vulnerability, differential power, or trust, for sexual purposes, including, but not limited to, profiting monetarily, socially, or politically from the sexual exploitation […] It includes but is not limited to exchanging money, employment, goods or services for sex […] It also includes any situation where sex is coerced or demanded by withholding or threatening to withhold goods or services or by blackmailing” (UNHCR).
The use of the Internet to perpetrate or facilitate the sexual abuse and exploitation of children is a widespread issue. The increased use of information and communication technologies (ICTs) by both children and perpetrators has led to the ever-increasing production and distribution of child sexual abuse material (CSAM), which refers to “the material depicting acts of sexual abuse and/or focusing on the genitalia of the child” (ECPAT International, 2016). The rise of AI-driven technology has also led to the production of new forms of CSAM, such as digitally generated child sexual abuse material, which refers to the material depicting artificially created children involved in sexual activities and/or in a sexualised manner. Other forms of online child sexual exploitation include “online grooming”, the practice by which adults establish a relationship with children to initiate either online or offline sexual contact, and “sexting”, the process by which someone intentionally shares sexually explicit content or self-generated sexualised images of themselves. “Sextortion” is a form of blackmail in which a child is coerced into providing sexual favours, money or other benefits under the threat that intimate or sexually explicit material involving them will be shared without their consent. Another form of complex exploitation is “live online child sexual abuse”, the real-time broadcasting on live-streaming platforms, such as Skype, of children’s sexual acts to offenders in remote locations.
Online child sexual abuse and exploitation is a critical global issue, with data highlighting its scale and severity. The 2025 edition of the Into the Light Index on Global Child Sexual Exploitation and Abuse from the Childlight Global Child Safety Institute, hosted by the University of Edinburgh in Scotland, found that one in five children in Western Europe (19.6%, nearly 15 million children) report experiencing online solicitation or grooming before turning 18, and that one in seven children (13.5%) reported facing CSAE in the past year alone. The Index also shows that Western Europe was the primary host of CSAM on the Internet in both 2023 and 2024, with the Netherlands accounting for the highest volume of CSAM (over 60% of all reported CSAM from Western Europe) and the highest CSAM availability rate. Moreover, Childlight revealed a 1,325% annual rise in AI-generated CSAM, with 67,000 reports in 2024, compared to 4,700 logged by the National Centre for Missing and Exploited Children in 2023. In the digital era, both international conventions, such as the Convention on the Rights of the Child of 1989, and the Optional Protocol to the Convention on the Rights of the Child on the sale of Children, child prostitution and child pornography of 2000, and regional conventions, such as the Council of Europe’s Convention on the Protection of Children against Sexual Exploitation and Abuse of 2007, that aim to prevent and combat the sexual exploitation and abuse of children, have proved to be insufficient.
As children across Europe face unprecedented levels of violence, the European Union has taken various actions to address the misuse of online services for child sexual abuse. As part of the EU Strategy for a More Effective Fight Against Child Sexual Abuse, presented on 24 July 2020, the European Commission submitted four legislative proposals between 2020 and 2024. The first was an interim regulation (Regulation (EU) 2021/1232) allowing service providers, such as Facebook Messenger and WhatsApp, to derogate from the privacy provisions of the ePrivacy Directive so that they could continue voluntarily detecting, reporting, and removing child sexual abuse content in their systems. As this interim derogation permitting voluntary detection was set to expire in August 2024, the European Commission adopted a proposal for permanent rules, the ‘EU’s Regulation to Prevent and Combat Child Sexual Abuse’, on 11 May 2022. Due to ongoing disagreement over that proposal, the ‘Interim Derogation’ was extended in 2024 until 3 April 2026.
EU CSA regulation and its critics
The ‘EU’s Regulation to Prevent and Combat Child Sexual Abuse’, also referred to as the ‘CSA Regulation’, the ‘CSAM proposal’, ‘CSAR’, or, by its critics, ‘Chat Control’, is a lex specialis to the EU’s Digital Services Act and was put forward alongside the ‘Better Internet for Kids’ strategy (BIK+). The Regulation aims to establish a legal framework requiring online service providers to detect, report, and remove CSAM from their services. This detection would also apply to end-to-end encrypted (E2EE) interpersonal communications services, such as WhatsApp, as well as to web-based email services, social media platforms, and app stores. The proposal also foresees the creation of an independent hub of expertise, the EU Centre on Child Sexual Abuse (EUCSA), which would review and assess the reports received from online providers, filtering out erroneous reports and forwarding relevant ones to national authorities, while also supporting victims and ensuring the removal of CSAM in which they are depicted.
At the European Parliament, the file was allocated to the Committee on Civil Liberties, Justice and Home Affairs (LIBE), whose rapporteur presented his draft report on 19 April 2023. The draft report introduces new amendments, such as restricting CSAM detection in end-to-end encrypted communications, and emphasises the importance of providers integrating preventive measures into their online services. While supporting the creation of the EUCSA, the report also calls for the establishment of a Victims’ Consultative Forum within the Centre and for the continuation of voluntary detection alongside mandatory measures. After the report was adopted by the LIBE committee and endorsed in plenary, the Parliament decided on 22 November 2023 to enter interinstitutional negotiations. At the Council, a common position was reached on 26 November 2025 under the current Danish Presidency. The Council position rejects mandatory detection orders in favour of extending the voluntary scanning of companies’ services, thereby making the current temporary derogation from the ePrivacy Directive permanent. Moreover, the Council introduces new risk mitigation measures, such as reporting tools, to be implemented by service providers according to a new three-tier risk categorisation of services (low, medium, and high risk). Following the first trilogue held on 9 December 2025, technical trilogues are expected in January 2026.
Over the years, law enforcement, civil society groups, non-profit organisations, data protection authorities, and private citizens have expressed concerns about the mass screening of private communications, particularly encrypted messages. Since the proposal was introduced, many initiatives, such as the German Young Liberals’ StopChatControl.eu and the citizen-led initiative Fight Chat Control, have actively and continuously opposed the CSA Regulation while raising awareness of its implications. The European Digital Rights association (EDRi) created the Stop Scanning Me movement and, together with 134 civil society and professional organisations working across human rights, media freedom, technology and democracy – including the European Center for Democracy and Technology, the Committee to Protect Journalists, the European Federation of Journalists (EFJ) and the Global Forum for Media Development – signed an open letter urging the European Commissioners to withdraw the Regulation. The letter, “European Commission: uphold privacy, security and free expression by withdrawing new law”, states that the Regulation’s application would be incompatible with the EU Charter of Fundamental Rights, and highlights that mass scanning of private communications would be detrimental to everyone’s privacy, security, and freedom of expression. It also argues that the Regulation would undermine media freedom and weaken journalists’ digital security. Experts and child protection services, such as the Dutch child protection hotline EOKM and the German child protection association Kinderschutzbund, have also pointed out that the proposed measures would be harmful and likely ineffective.
The decision of the current Danish Presidency to give the CSA Regulation high priority has revived debates and protests across Europe and beyond. An open letter signed by more than 470 scientists and researchers from 34 countries across Europe states that the Regulation’s powers of detection, scanning and access to private communication “will create unprecedented capabilities for surveillance, control, and censorship and have an inherent risk for function creep and abuse by less democratic regimes”. The letter also points to the high potential of AI-based content scanning to produce false positives, which would lead to innocent individuals and conversations being flagged and incriminated. In a coordinated statement, the European Centre for Press and Media Freedom (ECPMF) shared the content of the letter and condemned the proposal as an infringement of the right to privacy that ultimately undermines press freedom and democracy. Its contribution also highlights the potential misuse of mass surveillance by authoritarian regimes, which could redirect it toward monitoring political speech or journalistic investigations, a concern many critics of the proposal have raised. Tech companies have likewise expressed concerns about the impact the legislation would have on the EU’s digital economy, stating that it would undermine the EU’s competitiveness compared to regions that protect digital rights and innovation. Another open letter, signed in October 2025 by over 40 VPN and encrypted messaging services, including Proton, NordVPN, Tuta, and Element, argues that the legislation would undermine European cybersecurity and digital sovereignty while eroding the trust that European businesses have established internationally. Platforms such as Meta and Signal have also stated that the proposal would endanger privacy and freedom, with the latter threatening to leave the EU market.
The crucial role of media literacy and platform safety-by-design features
As the EU holds a central legislative role concerning children’s rights in the digital environment, educating children remains one of the key preventive measures to combat online child sexual abuse and exploitation. As studies show, digital and media literacy is fundamental for children to engage safely and effectively in digital environments (Ey, 2024). The ability to analyse digital content and communication provides users with the interpreting and decoding skills necessary to question who created the content they are engaging with, how it was made, what meaning it conveys, who the intended audience is, and what it represents. These skills are critical to prevent online abuse, as predators take advantage of children and gain their attention and trust by using fake photos, videos, and profiles. Moreover, the ability to critically evaluate content equips children with the capacity to assess its credibility, while creating and sharing online audio-visual and written content equips them with the skills necessary to understand the logic of media. This enables them both to recognise predators’ deceptive practices, such as photo editing, filters, and other forms of manipulation, and to discern between private and public forms of information sharing, while reflecting on the potential consequences of what they choose to share.
However, digital and media literacy is insufficient without legal frameworks that require online platforms to design safe and age-appropriate services (Eurochild, 2025). Vulnerable children, such as those with disabilities or those living in poverty, are often left behind. It is important to remember that the Internet and many digital technologies were created by adults and for adults: they were not designed with children’s rights in mind. Beyond the preventive measures already in force, it is crucial that technology companies, particularly social media companies, create online services and products with child rights in mind (‘Child Rights by Design’). “Safety-by-design” technologies and features are “practices that put user safety at the core of designing online platforms and technologies” (Eurochild, 2025). These include measures such as child-friendly reporting mechanisms and effective content moderation, which can prevent harm before it happens.
It is only when technology companies begin to prioritise the safety of their vulnerable users over profit, and join efforts with other stakeholders, that it will be possible to pave the way for the realisation of children’s rights online.
References
Childlight Global Child Safety Institute. 2025. Into the Light Index on Global Child Sexual Exploitation and Abuse.
European Digital Rights (EDRi). 2025. CSA Regulation Document Pool.
Eurochild. 2025. The Rights of Children in the Digital Environment.
European Parliament. 2025. New Legislation to Fight Child Sexual Abuse Online.
United Nations Children’s Fund (UNICEF). (n.d.). Keeping Children Safe Online.
World Health Organization. 1999. Report of the Consultation on Child Abuse Prevention.