The Legal Service of the Council of the EU has asked Member States in a note dated 26 April to rework the ‘detection orders’ elements of the Commission’s draft Regulation on the removal of child sexual abuse material from the Internet (see EUROPE 12950/5). As it stands, the draft would, in its view, constitute a flagrant violation of fundamental rights.
The Regulation provides for the possibility for competent national authorities to issue injunctions requiring content providers and hosting services to detect child sexual abuse material and “grooming” in interpersonal communications, by way of derogation from the 2002 Directive on confidentiality of communications. In general, platforms would be invited to set up their own monitoring of these private communications, choosing the technology they consider most appropriate.
“As it stands, the provisions on interpersonal communications constitute a particularly serious limitation of the rights to privacy and personal data protection”, the legal service argued.
It appears, for example, that “widespread screening of communications content to detect such material would de facto require prohibiting, weakening or circumventing cybersecurity measures (in particular end-to-end encryption), to make such screening possible”, the legal service observes.
Such a regime “carries a serious risk of not being sufficiently clear, precise and complete, especially in view of the expected intensity of judicial review of a measure interfering with fundamental rights, and therefore of not being in line with the Court’s case law. It could also compromise the essence of the above-mentioned fundamental rights insofar as it would allow widespread access to the content of interpersonal communications”.
Nor would the regulation be proportionate “in that it would require a general and undifferentiated screening of data processed by a specific service provider and would apply without distinction to all persons using that specific service, without those persons being in a situation, even indirectly, that could give rise to criminal proceedings”.
“If the EU Council were to decide to retain interpersonal communications within the scope of the detection order regime”, the order should therefore “be targeted so that it applies to persons in respect of whom there are reasonable grounds to believe that they are in some way involved in, are committing or have committed, or are at least indirectly connected with the commission of child sexual abuse offences”.
“The draft regulation should provide more detailed and substantive elements regarding the technology to be used and the extent of the limitations to fundamental rights it may entail”, the legal service concludes.
This note was discussed in the EU Council working group on 27 April.
According to one source, some countries were not yet able to take a position on the opinion and asked for more time. A group of countries, however, reportedly maintained their commitment to these detection orders and even asked for encrypted content to be included.
Others would have liked further clarification on service providers’ obligations to detect known and unknown child sexual abuse material, with these delegations questioning the distinction between the two.
The detection orders could be discussed in mid-May at the level of the Member States’ representatives.
“The EU Council services today confirm in no uncertain terms what other legal experts, human rights defenders, law enforcement officials, victims of abuse and child protection organisations have long been saying: obliging email, messaging and chat service providers to search all private messages for allegedly illegal content and report it to the police destroys and violates the right to privacy of correspondence”, was the reaction of Patrick Breyer MEP (Greens/EFA) on 27 April.
Link to the opinion: https://aeur.eu/f/6ql (Original version in French by Solenn Paulic)