Agence Europe
Europe Daily Bulletin No. 13280
SECTORAL POLICIES / Home affairs

With their political agreement on online child sexual abuse material, MEPs hope to encourage Member States to reach an agreement of their own

On Thursday 26 October, the European Parliament’s negotiators on the draft regulation on the removal of child sexual abuse material from the Internet called on the Council of the EU to move forward on this issue, after reaching a political agreement on 24 October. They reminded the Council of the urgency of adopting this instrument, as the derogation to the e-Privacy Directive that allows platforms to detect such content on a voluntary basis expires in 2024.

The EPP’s Spanish rapporteur, Javier Zarzalejos, welcomed the European Parliament’s united front on this highly controversial text, which in its initial version would have required platforms and service providers to scan private communications to detect this material.

“We have done our best to produce a report and an effective, legally sound and enforceable legal framework. And in my opinion, there is a positive balance between the protection of children, the digital sphere and respect for fundamental rights”, commented the rapporteur.

“There will be no massive scanning or general monitoring of the web. There is no indiscriminate scanning of private communications or backdoors to weaken encryption”, he explained.

Under the political agreement reached, which will be confirmed by the European Parliament Committee on Civil Liberties on 13 November, there is already a major difference with regard to detection orders, which will only be decided by a judicial authority as a last resort.

The detection order must “be targeted and specified and limited to an identifiable part or component of the service, such as a specific channel of communication, or to individual users or a specific group of users, either as such or as subscribers to a specific channel of communication, in respect of whom there is objective evidence and reasonable grounds of suspicion that they reveal a link, even an indirect or remote one, with online child sexual abuse material”.

And the content of interpersonal communications to which end-to-end encryption is applied will not be subject to these measures.

The agreement also stipulates that all providers must assess the risk of abuse of the service for the dissemination of child sexual abuse material or for soliciting children, and put in place measures to mitigate these risks and, where appropriate, to detect, report and remove such abuse.

These mitigation measures must be reasonable, proportionate, targeted and effective, tailored to their specific services and to the risks identified.

These measures may consist of testing and adapting content moderation systems or service functionalities, such as the speed, quality and efficiency of processing notifications and reports of online child sexual abuse, with, where appropriate, the rapid removal of child sexual abuse material. They may also involve adapting the service’s default design, features and functions.

Specific mitigation measures will be mandatory for services directly targeting children, including:
- limiting by default the possibility for users to establish unsolicited direct contact with other users, in particular via private messages, by requesting confirmation from the user before authorising a stranger to communicate;
- providing meaningful and proportionate parental control tools, adapted to the age of the user, which enable parents or guardians to exercise control while respecting the fundamental rights of the child.

“We have also provided for exceptions to the obligation to carry out a risk assessment for providers who are not substantially exposed to online child sexual abuse material, and established a simplified procedure for SMEs”, added the Spaniard.

Only if providers fail to comply with the obligations set out in the regulation will specific mitigation measures become mandatory.

Another difference with the Commission’s initial text is that, as a last resort, a judicial authority - and only a judicial authority - will be able to issue a detection order, which means that the provider will have to deploy certain technologies to detect known and new child sexual abuse material.

The rapporteurs also decided to exclude grooming (sexual solicitation) from the scope of the regulation.

However, there will be a revision clause: within three years of the entry into force of the Regulation, the Commission will submit a report and, if appropriate, a new legislative proposal to the European Parliament and the EU Council on the need and feasibility of including child grooming within the scope of detection orders.

The detection order will therefore be targeted at specific individual users and groups of users who are deemed reasonably likely to be linked to child sexual abuse material. It will be time-limited, and providers will decide which technologies to use to comply with the detection obligations.

The role of the EU Centre has also been strengthened: renamed the ‘European Centre for Child Protection’, it will become the reporting channel for the whole of the European Union, receiving, filtering, assessing and forwarding reports to the competent national authorities and Europol.

The EU Centre will also be able to search hosting service providers’ publicly accessible content. The role of the European Data Protection Supervisor and the national data protection authorities has been clarified, and a post of Fundamental Rights Officer has been created within the European Centre. (Original version in French by Solenn Paulic)
