Agence Europe
Europe Daily Bulletin No. 13148
SECTORAL POLICIES / Home affairs

Withdrawal of online child sexual abuse material, Germany reiterates red lines on detection technologies

Member States have expressed a number of reservations about the Swedish Presidency of the EU Council’s recent compromise proposals on the proposed Regulation laying down rules to prevent and combat child sexual abuse.

Unsurprisingly, Germany, which gave a very cautious welcome to the Commission’s regulation presented in May (see EUROPE 12950/5, 13125/12), reiterated its rejection of intrusive telecoms surveillance technologies and the need to maintain end-to-end encryption of communications, according to a note from the Swedish Presidency of the EU Council on 15 March.

Estonia, on the other hand, insisted on data retention and the Netherlands warned that 24-hour removal rules would not be applicable to all companies, especially those without the necessary staff.

In this note compiling the positions of 15 Member States, Germany first states that combatting the sexual abuse of children and young people is a top priority for the German Federal Government.

But the provisions foreseen “must uphold fundamental rights, in particular when it comes to protecting the confidentiality and privacy of communication. The Federal Government has serious concerns about the provisions on detection orders in the proposed Regulation. For the Federal Government, a high level of data protection and cyber security, including complete and secure end-to-end encryption in electronic communications, is essential”. 

It is therefore necessary, among other things, to state in the draft text that “no technologies will be used which disrupt, weaken, circumvent or modify encryption”, Berlin stresses.

For Estonia, the most fundamental issue is whether the data is retained early enough. “The current version focuses on data retention which applies only after discovering the criminal content”.

In order to investigate a crime, it is necessary to obtain data “which has been created long before the criminal content has been discovered at all, including metadata”. “For example, with the child sexual abuse material, if the service provider starts retaining the data let’s say 24 hours or even a week after the material has been posted (at the moment when it’s discovered) then there is no data to pinpoint the person who published it. It is already too late”.

“This is a much broader problem that would require a solution which is not necessarily field- or sector-specific, but would then apply to all concerned regulations as an umbrella act”, Estonia adds, pointing to the High-Level Group on access to data set up in mid-February (see related article).

The Netherlands also has a problem with the age of sexual consent in the latest compromise, which was raised from 17 to 18 years.

“In the Netherlands, the age of sexual consent is 16. The Dutch criminalisation of grooming is also based on this age limit. A solution would be to include ‘the age of sexual consent’ instead of an age in the definition of ‘child user’”.

The Netherlands also questions the definition of ‘content data’, which would include voice and text. “There is nothing in the regulation or the impact assessment about the mandatory detection of voice communication, but it refers to ‘images’, ‘videos’ and ‘photographs’. The Netherlands is highly critical of the voice communication detection, because it has concerns about proportionality [...]. The Netherlands believes that voice communication and text should remain outside the scope of the regulation”.

Link to the document: https://aeur.eu/f/5zw (Original version in French by Solenn Paulic)
