An unprecedentedly broad range of stakeholders has raised concerns that, despite its important aims, the measures proposed in the draft EU Child Sexual Abuse Regulation are fundamentally incompatible with human rights. This article was originally published at edri.org.

The proposed EU ‘Regulation laying down rules to prevent and combat child sexual abuse’ (2022) (CSA Regulation, or CSAR) has raised concerns that it is incompatible with EU fundamental rights and case law – perhaps more so than any other EU law in recent memory.

Whilst all stakeholders agree on the importance of the aim to protect children, all formal legal and technical assessments have concluded that the proposed measures could amount to disproportionate violations of everyone’s privacy, personal data and free expression online, and that they rely on technically infeasible or dangerous measures.

Read on to see how a wide range of stakeholders, including child protection experts, survivors of CSA, police, national governments, UN officials, companies, NGOs and others have warned that the proposed measures are misguided and could do more harm than good.

“[The CSA Regulation] could become the basis for de facto generalised and indiscriminate scanning of the content of virtually all types of electronic communications of all users in the EU/EEA”
– The European Data Protection Board (EDPB) and Supervisor (EDPS)

EU institutions and case law

The Court of Justice of the EU enforces strong protections for privacy, data protection, free expression, non-discrimination and other fundamental rights both online and off, for children and adults alike, on the basis of the Charter of Fundamental Rights. The serious and widespread threat posed by the draft CSAR to fundamental rights is reflected in all formal legal analyses from EU institutions about the proposal.

  • The European Commission’s own impact assessment (SWD(2022)209) to the CSAR recognises that there are no scanning methods that have good levels of privacy, security and feasibility. Available via local download here.
  • The European Commission’s regulatory scrutiny board (RSB) warned before the law was officially proposed that it might amount to unlawful general monitoring.
  • The EU Council’s official Legal Service Opinion – in a rarely-seen major critique of a legislative proposal – warned of a “serious risk” of generalised monitoring, undermining encryption and violating the very essence of the right to privacy. Description available here and full document published here.
  • The European Data Protection Supervisor (EDPS) and Board (EDPB) warned that the proposal will severely harm innocent people with little to no evidence that it will protect children.
  • The independent impact assessment undertaken on behalf of the European Parliament’s civil liberties (LIBE) committee assessed the balance between safeguarding the privacy of users in general, and children’s rights. It found that the CSAR lacks evidence of effectiveness and could not justify the serious extent to which it violates Articles 7 and 8 (privacy and personal data rights) of the EU Charter of Fundamental Rights.
  • Furthermore, the European Commission has admitted that the proposal is based on claims of technical accuracy directly from suppliers, which have never been independently verified.

Young people

International child rights law requires that children’s views are incorporated into laws which relate to their rights and safety, in order to respect their autonomy.

  • According to a large representative survey, 80% of children in the EU say that they would not feel comfortable and safe being politically active or exploring their sexuality if authorities were able to monitor their digital communications in order to search for child abuse material.
  • The same survey shows that around two-thirds of young people in the EU rely on encrypted message services for communication, and around the same number disagree or totally disagree with the premise that providers should be allowed to scan their private chats. Instead, improving media literacy and reporting mechanisms are overwhelmingly favoured by the children surveyed.
  • Young activists warn that the CSAR could be used to suppress their voices.

Child sexual abuse survivors

  • CSA survivor Alexander Hanff warns that the CSAR will discourage survivors from seeking support.
  • The survivors’ representative group MOGiS e.V. warns that the proposal will harm survivors and other young people.
  • The Deputy Chairperson of MOGiS e.V. spoke to lead MEPs working on the CSAR as a representative of CSA survivors, in a speech to alert them to how the law will do more harm than good for survivors of CSA, and how the Commission’s proposed approach is “fundamentally flawed”.
  • Marcel Schneider, a survivor of sexual violence, is currently suing Facebook over its automated scanning of private messages, which he argues removes confidentiality for victims of abuse and for criminal defendants.
  • As pointed out by MEP Patrick Breyer, several survivors of CSA have also provided feedback to the Commission’s CSAR proposal, criticising it for harming survivors.
  • Weisser Ring, a German CSA victim support group, and the Portuguese Association for Victim Support (APAV) collectively warn that, despite its important aims, the proposal could undermine confidential communications, putting people at risk of blackmail and fraud; would lead to false positives; and risks mass surveillance, all whilst lacking evidence of effectiveness.

Child rights & child protection experts

Police and prosecutors

  • Public prosecutors in Germany have warned that the CSAR will not help them to find perpetrators, as it does not tackle the actual issues that they face.
  • Police officers specialised in child protection in Germany have warned that the proposed measures will not help find more perpetrators, only more false alarms.
  • A senior police officer in the Netherlands warned that the police will be unable to deal with the volume of reports predicted as a result of the proposal.
  • The FBI warned Members of the European Parliament from the LIBE committee in a mission to Washington in May 2023 that “already now they [the FBI] do not have sufficient resources to deal with all CSAM cases they detect and they need to prioritize some of them. From this perspective, mass scanning of communication would not result in an increase in law enforcement”.

Technology experts

Independent legal analyses

  • A legal assessment from the University of Amsterdam’s Institute for Information Law (IViR) confirmed the threat to encryption.
  • Director of CDT, Iverna McGowan, a former Senior Advisor to the UN Office of the High Commissioner for Human Rights, warns that the proposed Detection Orders are based on a flawed logic which would undermine procedural rights and the presumption of innocence.
  • Assistant Professors at Leiden University, Center for Law and Digital Technology (eLaw), Dr Witting and Dr Leiser, warn that the proposal likely violates the EU prohibition of general monitoring, and its content moderation measures could put children at risk of harm.
  • Former EU Court of Justice judge, Professor Dr. Ninon Colneric, published an assessment that filtering for CSAM in a generalised and indiscriminate manner would be incompatible with EU case law.
  • Legal researcher at KU Leuven Center for IT & IP law, Charlotte Somers, calls the proposal “an attack on Europeans’ privacy and data protection” and notes the high likelihood of CJEU annulment.

National governments and parliaments

The European Parliament

  • 14 MEPs from the lead Parliamentary committee (Civil Liberties, or LIBE), coming from 4 political groups, demand rejection of the CSAR.
  • They are joined by many MEPs in the supporting IMCO (Internal Markets) committee who criticised the proposal’s scope, the threat its measures pose to encryption, and the serious risks posed by Detection Orders, with Renew group MEPs Hahn and Körner proposing full deletion of these parts.
  • MEP David Lega (EPP), co-chair of the European Parliament’s Intergroup on Children’s Rights, speaks out against the CSAR proposal.
  • The aforementioned complementary impact assessment to the CSAR on behalf of the LIBE committee further confirmed the lack of proportionality and fundamental rights compliance of the proposal.

Companies

  • Microsoft warned that the claimed 88% accuracy figure for grooming detection that the European Commission put forward is not reliable (see especially annex 9, pages 285 – 315).
  • An article from PhotoDNA user Cloudflare shows the ease of generating false alerts, despite claims from PhotoDNA developer Dr Hany Farid (which the European Commission has relied on in the draft CSAR) that this is as rare as 1 in 50 billion.
  • 9 European industry associations representing members like Google, Meta, Microsoft and Mozilla have voiced their concerns.
  • LinkedIn reports a false-positive rate of 59% for its current voluntary use of PhotoDNA, which strongly contradicts the accuracy of PhotoDNA often cited by the European Commission and other proponents of the CSAR.

Privacy-tech companies

The United Nations

Professional secrecy associations (journalists, lawyers, etc.)

Civil society
