Several digital platforms have pledged to continue tracking online child sexual abuse content, despite the end, on April 3, of the European regime that legally regulated the detection and reporting of such content.
In a joint statement dated April 4, technology companies Google, Meta, Microsoft and Snapchat reaffirmed their commitment to detecting, removing and reporting child sexual abuse material, stressing that the protection of children “is not just a legal issue, but a fundamental responsibility”.
This position arises after the end of the derogation from the ‘ePrivacy’ directive, which allowed platforms to use specific technologies to identify this type of content, such as image matching systems (‘hash-matching’).
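To illustrate the hash-matching approach mentioned above: a minimal, simplified sketch in which a file's hash is compared against a database of hashes of known illegal images. This is an assumption-laden illustration, not any platform's actual system; real deployments use perceptual hashes (such as Microsoft's PhotoDNA) that remain stable under resizing and re-encoding, rather than the plain cryptographic hash used here.

```python
import hashlib

def file_hash(data: bytes) -> str:
    # Cryptographic hash of the raw bytes. Real systems use perceptual
    # hashing so that visually identical images still match after
    # compression or resizing; SHA-256 is used here only for illustration.
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes, known_hashes: set[str]) -> bool:
    # A match against the database of known hashes would trigger
    # removal and a report to the competent authorities.
    return file_hash(data) in known_hashes

# Hypothetical database seeded with one flagged file's hash.
known = {file_hash(b"flagged-image-bytes")}
print(is_known(b"flagged-image-bytes", known))  # matches the database
print(is_known(b"benign-image-bytes", known))   # does not match
```

The design point is that platforms never need to interpret the content itself: only a fingerprint is compared against a curated database, which is why the companies argue the technique is compatible with user privacy.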
According to the companies, the end of this regime creates legal uncertainty and may compromise efforts which, until now, have made it possible to identify abuse, support victims and limit the spread of illegal content.
“The European institutions failed to reach an agreement to maintain already established mechanisms to protect children online”, they criticized, expressing concern about the impact of the lack of a clear legal framework.
Although the European institutions have not reached an agreement, the platforms guarantee that they will maintain voluntary actions in their interpersonal communication services, continuing to use technological tools to detect and report abusive content to the competent authorities.
The statement highlights that these practices have been used “for years” and are considered essential to stopping ongoing abuse and preventing its spread.
The companies also stress that these measures are compatible with protecting user privacy, defending a balance between fundamental rights and online security.
They say they are aligned with the roughly 250 children’s rights organizations that have expressed concern about the end of the European regime and the risk of reduced protection for victims.
Companies call on European institutions to conclude negotiations “urgently”
Faced with this scenario, the companies appeal to the European institutions to “urgently” conclude negotiations on a new regulatory framework, whether transitional or permanent in nature.
The objective, they indicate, must be to ensure legal certainty for platforms and the continuity of mechanisms for detecting and combating the sexual abuse of children online.
Ongoing negotiations in the European Union aim to define a new legal framework to combat this type of crime, including rules on the use of detection technologies and the protection of personal data.
Until then, the platforms commit to maintaining current efforts, warning that the absence of a solution could weaken the global response to a phenomenon they consider “one of the most serious” in the digital space.
The provisional rules for detecting and reporting images of child sexual abuse have existed since 2021 and ended on April 3, a day the project founder, Tito Morais, classified as “a black page in children’s rights at European level”.