European Parliament and EU Council negotiators are set to resume their trilogue discussions on the regulation on the removal of terrorist online content on 29 October, with the meeting still on the agenda at this stage despite the difficulties linked to the pandemic.
The issue has in any case regained urgency after the attack in Conflans-Sainte-Honorine, France, on 16 October, which was fuelled by video sharing and the dissemination of hateful and radical statements.
The Commission reiterated this week that it would like to see this dossier, first presented in 2018, completed as soon as possible. As a reminder, the flagship measure of this regulation is the withdrawal of extremist material within one hour, and Paris has in recent days repeatedly called for this rule to be consolidated.
The regulation also proposed automated measures (automatic filters) to control content, a provision quickly perceived as Internet censorship by political groups, including the Greens/EFA, and by digital rights associations.
The office of the Polish rapporteur, Patryk Jaki (ECR), says he wants to move quickly, as the dossier has languished for almost 19 months between Parliament’s renewal and the pandemic, and that he believes the two sides, Parliament and the EU Council, are reasonably close to agreement.
Negotiations resumed on 24 September, but sensitive issues remain. The rapporteur’s office sees two in particular. The first is the directly binding effect of a cross-border withdrawal order issued by a national authority against a content or hosting provider located in another Member State.
France has reiterated in recent days that without this immediate measure the effectiveness of the Regulation would be compromised.
In Parliament, and particularly for the ECR rapporteur, there is nevertheless a strong attachment to the sovereignty of the national authority, which must have the means to challenge a withdrawal order with which it disagrees. The authority in the country where the content is hosted should be able to refuse a request and should always be the only one to decide whether or not the content is removed.
The second point of divergence is the use of specific or proactive measures, including automated filters. Parliament wants to keep an open list of measures to be put in place, ranging from filters to human intervention, while avoiding any mandatory measures that would rely solely on automated filters.
For other groups, including the Greens/EFA, further problems remain, such as the definition of terrorist content, which is based on the European directive on combating terrorism, and the fact that journalistic or educational content may not be sufficiently protected from the effects of the regulation.
Questions also arise over the independence of the authority issuing such withdrawal orders, but the rapporteur’s office believes these issues have already been more or less settled.
In his view, one or two more trilogue meetings might still be necessary in November to reach an agreement.
The 16 October attack should in any case have an impact “and help move the discussions forward”, says one source. The S&D, however, believes that this should not come at any price.
“It is clear that we urgently need to combat terrorist content online and to agree on the regulation as soon as possible. However, if we agree on a purely political document, which in the end will not only be impossible to implement but also legally invalid, it will not help us much”, commented shadow rapporteur Marina Kaljurand (S&D, Estonia).
She rejects requiring service providers to use automatic filters to detect terrorist content, “as this would result in excessive blocking of legal content. Artificial intelligence is not capable of evaluating the context of content, so such filters may capture news articles and other legal content in addition to terrorist content. That’s why there must always be a human check”, she says. (Original version in French by Solenn Paulic)