On Wednesday 2 October, the European Commission announced that it had sent several requests for further information to three major social networks under the Digital Services Act (DSA).
The Commission says it is concerned about potentially dangerous content disseminated on YouTube, Snapchat and TikTok, and has demanded information about the recommendation algorithms used by these platforms.
The Commission is especially concerned with protecting minors, who represent a significant portion of users on these platforms. The platforms "must adequately assess and mitigate the risks", including "potential harm to the mental health of users" and the dissemination of "dangerous content linked to the design of these algorithms", the Commission says in its press release.
YouTube and Snapchat must also communicate the measures taken to mitigate the influence of their algorithms on the promotion of hate speech and illegal drugs.
TikTok, for its part, has been asked to “provide more information on the measures adopted to prevent manipulation of the service by malicious actors and to mitigate the risks associated with elections, media pluralism and civic discourse”.
The algorithms and recommendation systems used by platforms subject to the DSA are at the heart of several formal non-compliance proceedings opened by the Commission in recent months against AliExpress, Facebook and Instagram (see EUROPE 13371/24, 13411/1).
YouTube, Snapchat and TikTok must provide the requested information by 15 November. (Original version in French by Isalia Stieffatre)