EU lawyers say plan to scan private messages for child abuse may be unlawful

According to the internal EU legal advice, the regulation ‘risks falling foul of a judicial review on multiple grounds’. Photograph: DCPhoto/Alamy

An EU plan under which all WhatsApp, iMessage and Snapchat accounts could be screened for child abuse content has hit a significant obstacle after internal legal advice said it would probably be annulled by the courts for breaching users’ rights.

Under the proposed “chat controls” regulation, any encrypted service provider could be forced to survey billions of messages, videos and photos for “identifiers” of certain types of content where it was suspected a service was being used to disseminate harmful material.

The providers issued with a so-called “detection order” by national bodies would have to alert police if they found evidence of suspected harmful content being shared or the grooming of children.
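Neither the draft regulation nor this article specifies a scanning technology, but matching content against a database of “identifiers” is typically done by hashing. The sketch below is a rough, hypothetical illustration only: real deployments such as Microsoft’s PhotoDNA use perceptual hashes so that resized or re-encoded copies still match, whereas an ordinary cryptographic hash is used here purely to keep the example self-contained.

```python
import hashlib

# Hypothetical database of "identifiers" - in practice, hashes of known
# illegal material supplied by an authority or clearing house.
KNOWN_IDENTIFIERS: set[str] = {
    "0123abcd...",  # placeholder entries, not real values
}

def matches_known_identifier(attachment: bytes) -> bool:
    """Hash an attachment and check it against the identifier database.

    A cryptographic hash only catches exact copies; production systems
    use perceptual hashing so that near-duplicates also match.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_IDENTIFIERS

def scan_message(attachments: list[bytes]) -> list[int]:
    """Return the indices of attachments that match a known identifier.

    Under a "detection order", matches like these are what a provider
    would be obliged to report to the police.
    """
    return [i for i, blob in enumerate(attachments)
            if matches_known_identifier(blob)]
```

The privacy objection described below follows directly from this design: every attachment from every user is hashed and checked, regardless of whether that user is under any suspicion.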

Privacy campaigners and the service providers have already warned that the proposed EU regulation and a similar online safety bill in the UK risk end-to-end encrypted services such as WhatsApp disappearing from Europe.

Now leaked internal EU legal advice, which was presented to diplomats from the bloc’s member states on 27 April and has been seen by the Guardian, raises significant doubts about the lawfulness of the regulation unveiled by the European Commission in May last year.

The legal service of the council of the EU, the decision-making body led by national ministers, has advised that the proposed regulation poses a “particularly serious limitation to the rights to privacy and personal data” and that there is a “serious risk” of it falling foul of a judicial review on multiple grounds.

The EU lawyers write that the draft regulation “would require the general and indiscriminate screening of the data processed by a specific service provider, and apply without distinction to all the persons using that specific service, without those persons being, even indirectly, in a situation liable to give rise to criminal prosecution”.

The legal service goes on to warn that the European court of justice has previously judged that the screening of communications metadata is “proportionate only for the purpose of safeguarding national security”, and that therefore “it is rather unlikely that similar screening of content of communications for the purpose of combating crime of child sexual abuse would be found proportionate, let alone with regard to the conduct not constituting criminal offences”.

The lawyers conclude the proposed regulation is at “serious risk of exceeding the limits of what is appropriate and necessary in order to meet the legitimate objectives pursued, and therefore of failing to comply with the principle of proportionality”.

The legal service also raises concerns that, in seeking to clamp down on the solicitation of children, the regulation would force providers to introduce age verification technology and processes to popular encrypted services.

The lawyers write that this would necessarily involve the mass profiling of users, the biometric analysis of the user’s face or voice, or alternatively the use of a digital certification system which, they note, “would necessarily add another layer of interference with the rights and freedoms of the users”.

Despite the advice, it is understood that 10 EU member states – Belgium, Bulgaria, Cyprus, Hungary, Ireland, Italy, Latvia, Lithuania, Romania and Spain – back continuing with the regulation without amendment.

Patrick Breyer, a German MEP who sits on the European parliament’s civil liberties committee, said the EU should drop the plans, which he labelled as “dystopian” and claimed would tie up law enforcement and tech companies in chasing down false reports of harmful content.

He said: “The council of the EU’s services now confirm in crystal clear words what other legal experts, human rights defenders, law enforcement officials, abuse victims and child protection organisations have been warning about for a long time: obliging email, messaging and chat providers to search all private messages for allegedly illegal material and report to the police destroys and violates the right to confidentiality of correspondence.”
