A report based on a debate among 28 experts from different sectors lists some of the main challenges in the field and, at the same time, points out limitations in the scope and implementation of a possible decision by the STF (Supreme Federal Court) on the topic.
Part of a research project carried out in partnership between the University of Sussex (United Kingdom) and Insper, the document, obtained exclusively by Folha, provides an overview of the conversation among representatives of public authorities, academia, the private sector and civil society at an in-person meeting held last November.
Held shortly after the end of the municipal elections and before the start of the STF trial, the meeting was organized under the Chatham House Rule, according to which the identity and positions of participants cannot be revealed, with the aim of creating an environment for freer discussion.
One of the main aspects explored was whether or not it would be advisable to adopt the concept of “duty of care” in regulating platforms in the country.
Behind this concept is the attempt to create obligations for social networks to act more proactively in combating harmful content on their services, an approach that works “wholesale” and complements the “retail” perspective of legal actions over individual pieces of content.
In recent years, the concept of “duty of care” has already appeared in the Chamber of Deputies and in rules on electoral propaganda, and has been cited in the STF trial.
According to the report, several participants argued that the discussion about the individual removal of content, as currently provided for by law, is insufficient to deal with the risks arising from social networks, and that, as a result, preventive and more “structural” measures need to be adopted.
For Beatriz Kira, law professor at the University of Sussex and one of the report’s authors, a common point in the debate was that the STF trial would not solve the problem.
“Both in terms of the nature of the decision (on the constitutionality of an article) and in terms of the institutional framework that is necessary for the implementation of a robust duty of care.”
Some of the participants, the document points out, said they were concerned about a scenario of legal uncertainty if there is no clarity about what actually constitutes the duty of care and what measures companies should adopt.
Among the points discussed in the conversation, and one that indicates the limitations of the STF if it seeks to adopt obligations of this type, is the still-open debate about which body would be responsible for overseeing the rules.
“It doesn’t matter what the rules are, what the design is, if whoever is going to supervise this is a bad body that has no expertise”, says Ivar Hartmann, law professor at Insper and also an author of the study, adding that adopting a duty of care was not a consensus in the debate.
“It’s a matter of deciding on institutional design. In this decision, the Supreme Court cannot define who will oversee proactive filtering obligations, for example, or propose a structure for auditing content moderation algorithms.”
Just as in the debate held in Congress in 2023, in the mediated conversation the scope and format of this body were the subject of major disagreements, though with emphasis on the need for specialized technical knowledge and real supervisory capacity.
There were also those who questioned the political feasibility of creating a new regulatory body in Brazil and those who advocated strengthening existing bodies, for example.
One of those who has already voted in the STF judgment, the minister who presides over the court, argued that, in the absence of a regulatory body to monitor the measures, reports released by the networks would remain public and could be used by the Public Prosecutor’s Office (Ministério Público) to support possible lawsuits for collective moral damages.
Given these factors, the researchers who followed and summarized the discussion highlight the importance of debate on the topic in the Legislature.
Among the concerns cited is that an overly generic formulation could lead to excessive moderation on the networks, affecting freedom of expression. Participants also mentioned the need to calibrate possible “duties of care” according to the type of platform, its size and the specific risks it generates.
According to the report, participants from different sectors pointed to the adoption in the Brazilian context of a “duty of care” for platforms together with an obligation to “mitigate systemic risks”, which would be a mix of the British online safety law adopted in 2023 (the Online Safety Act) and the European regulation (the DSA, or Digital Services Act).
On the other hand, there were also those who argued that, based on the interpretation of existing legislation such as the Civil Code and the Consumer Protection Code, as well as the Child and Adolescent Statute and electoral resolutions, platforms are already subject to new duties.
The document states that, as it is the result of exchanges between actors with leadership roles, it ends up being a portrait of the Brazilian debate on platform regulation. The project is funded by the British Academy, with support from the British government’s International Science Partnerships Fund.