On Wednesday 26 July, the Member States’ experts meeting in the Law Enforcement Working Party will discuss a new compromise on the regulation on the removal of child sexual abuse material (CSAM) from the Internet.
While the issue remains controversial, with the Legal Service of the EU Council questioning in April the legality of detection orders for child sexual abuse material in interpersonal communications (see EUROPE 13176/3), the Member States have nevertheless decided to go ahead and maintain these detection orders in the regulation proposed by the Commission in May 2022.
The new text should, however, strengthen risk assessment, mitigation and reporting, to “ensure that detection orders are a measure of last resort”.
The text also reinstates the possibility for “other independent administrative authorities” to issue detection orders.
The text also specifies the technologies that companies may use to detect this material, and stresses that nothing in the regulation should be interpreted as prohibiting or limiting communication encryption techniques.
A new article therefore sets out specific powers for the coordinating authorities to require providers of hosting services or providers of interpersonal communications services to adapt their risk assessment or mitigation measures to ensure compliance with the regulation.
To increase transparency, another new article provides that providers of hosting services and providers of interpersonal communications services should be able to inform their users, in an easily recognisable and officially authorised manner, of their compliance with the relevant parts of the regulation, for example by displaying “a sign of compliance where the coordinating authority is satisfied that they have carried out the risk assessment and taken all reasonable and necessary steps” to mitigate the risks, and that there is no need to initiate the detection order procedure. However, the sign of compliance should not be interpreted as indicating that the risk of online child sexual abuse has been completely eliminated.
Exclusion of group calls
The Presidency also proposes to exclude group calls from detection orders. Such orders may only be issued after “a diligent and objective assessment leading to the finding of a significant risk of misuse of the specific service concerned for a given type of online child sexual abuse”.
One of the factors to be taken into account in this respect is the likelihood that the service will be used to an appreciable extent, i.e. beyond isolated and relatively rare cases, for this type of abuse. The criteria should vary to take account of the different characteristics and types of abuse.
“For these reasons, it is appropriate to set additional limits on detection orders concerning the solicitation of children, such as the exclusion of calls, i.e. connections established by means of a publicly available interpersonal communications service allowing two-way voice communications, including such communications between a group of more than two persons”.
But other interpersonal communications services, such as recorded and transmitted voice communications, must remain covered.
Eurobarometer
According to a Eurobarometer survey published on 20 July, over 80% of respondents support the detection of abuse in messages and consider that parental control tools are not enough.
A large majority of Europeans (73%) consider online child sexual abuse to be a widespread or very widespread problem. In addition, 92% of citizens consulted agree that children are increasingly threatened online.
When it comes to the responsibility to protect them, 96% of respondents said that the ability to detect child abuse was more important or just as important as the right to privacy online.
Link to the compromise: https://aeur.eu/f/86d
Link to Eurobarometer: https://aeur.eu/f/86f (Original version in French by Solenn Paulic)