The National Data Protection Authority (ANPD), the National Consumer Secretariat (Senacon) and the Federal Prosecution Service (MPF) have issued a series of recommendations to the company that controls the digital platform X.
According to the document, released this Tuesday (20), the recommendations include the creation, within a maximum of 30 days, of technical and operational procedures to identify, review and remove content of this type that is still available on X, when it was generated by Grok in response to user commands.
The institutions also call for the immediate suspension of accounts involved in using Grok to produce sexual or eroticized images of children, adolescents or adults without their authorization.
The document also recommends implementing a “transparent, accessible and effective mechanism so that data subjects can exercise their rights”, including the submission of reports about irregular, abusive or illegal use of personal data, especially in cases involving the creation of synthetic sexualized or eroticized content without consent, with an adequate response ensured within a reasonable time.
The recommendations were drawn up in response to complaints from users who reported the generation of synthetic sexualized content based on images of real people.
They also draw on reports published in the national and international press and on tests carried out by the institutions themselves, which indicate that users are illegally using the tool to produce deepfakes, false images of real people created with artificial intelligence. The content in question is sexualized, erotic and pornographic in nature, involving real women, children and adolescents.
The institutions warn that this type of content can affect the protection of personal data, consumer relations, human dignity and other inalienable diffuse, collective and individual rights, especially those of women, children and adolescents.
“Among the joint recommendations is that measures be implemented immediately to prevent Grok from generating new images, new videos or new audio files that represent children and adolescents in sexualized or eroticized contexts”, argue the institutions.
The recommendations also seek to prevent the generation of content that represents identified or identifiable natural persons of legal age, in sexualized or eroticized contexts, without their authorization.
According to the document, although Article 19 of the Marco Civil da Internet provides that internet application providers are not responsible for harmful content published by third parties except after a court decision, the institutions maintain that removal must nonetheless be carried out in this case.
This is because sexualized deepfakes are not produced exclusively by third parties, but through an interaction between users and the artificial intelligence tool created and made widely available by X, “which therefore makes it co-author of such content, and not its mere intermediary”, says the text.
The document also highlights that, recently, the Federal Supreme Court (STF) declared the partial unconstitutionality of article 19 of the Marco Civil da Internet, precisely because it understands that it “does not offer sufficient protection to relevant constitutional rights”.
In that ruling, the Court determined that internet application providers have a special duty of care, requiring them to adopt all necessary measures to prevent the mass circulation of content related to especially serious crimes, such as those committed against women because they are women, including content that spreads hatred against them.
The institutions also note that X’s own self-regulation policy on non-consensual nudity prohibits the publication and sharing of explicit images or videos that were produced or shared without the consent of the people involved.
The platform also prohibits the production, manipulation and dissemination of third-party images with sexual or erotic content in contexts involving Grok, which makes it “unsustainable, also from this perspective, to make available an artificial intelligence tool, without strict filters, that facilitates conduct of this kind”.
If the recommendations are not followed, or are implemented insufficiently to mitigate the identified risks, the three institutions may consider and adopt further measures, at both the administrative and judicial levels, to ensure adequate protection for the country’s citizens, especially women, children and adolescents.
*Agência Brasil
