STF: Ministers Diverge Over Degree of Big Tech Liability – 06/11/2025 – Power

by Andrea

Six ministers of the Federal Supreme Court (STF) formed a majority on Wednesday (11) to change Article 19 of the Marco Civil da Internet, the main rule that has governed platform liability in Brazil since 2014. Only one, André Mendonça, believes Article 19 is fully constitutional.

But the ministers are divided over the degree of liability that Big Tech should bear.

Ministers Dias Toffoli and Luiz Fux are the most radical: they believe that, for a broad list of content deemed threatening, Big Tech can be punished for not quickly removing a post, even without a court order or a user complaint. Minister Gilmar Mendes creates a special removal regime without notification, but with caveats.

The president of the court, Luís Roberto Barroso, and ministers Flávio Dino and Cristiano Zanin adopted an intermediate approach: neither the stricter regime provided for in the votes of Toffoli and Fux, nor the maintenance of the status quo, defended in Mendonça’s thesis.

For Barroso, Dino and Zanin, companies also have a proactive obligation, but they will not be punished if one or a few posts slip through; they will be judged by their effort to combat that type of content.

After seven ministers’ votes, some points of consensus are taking shape: liability after extrajudicial notification (a private complaint) for most content; offenses and crimes against honor remaining under the court-order regime of Article 19; liability for all sponsored content; a duty of care regarding systemic risks; and an obligation to maintain a legal representative in the country.

Today, under the Marco Civil, companies can be punished for damage resulting from content only if they fail to remove it after a court order. There are just two exceptions: non-consensual nudity, covered by Article 21 of the Marco Civil, and intellectual property violations. In those cases, an extrajudicial notification, such as a user complaint, is enough.

Barroso, Dino, Zanin and Gilmar propose that, for most content, an extrajudicial notification would be enough for platforms to be held liable for damages resulting from non-removal. This is the regime in force in the European Union: “notice and action”.

Offenses and crimes against honor, in the theses of Barroso, Dino and Zanin, and also journalistic content, in Gilmar’s, would remain within Article 19; in those cases, there would be liability only after a court order. Toffoli and Fux consider that offenses and crimes against honor can generate liability as soon as a private complaint is made.

This worries experts and members of civil society: providing for liability upon private notification in cases of offenses against honor would inevitably lead to a war of complaints, for example between candidates in an election campaign.

Stricter still, Gilmar, Toffoli and Fux envision a category of content that can generate immediate liability for platforms, even without any prior notification. Fux speaks of a “duty of active monitoring” for hate speech, racism, pedophilia, incitement to violence, apology for the violent abolition of the democratic rule of law and apology for a coup d’état. Toffoli speaks of civil liability regardless of notification for these and several other types of content.

Gilmar speaks of a “special regime” of liability for content with the potential for “serious damage to the democratic social fabric”, which must be made unavailable immediately.

But he makes an exception: it is necessary to “consider hypotheses in which the content is not a patent representation of these crimes”; in those cases, there would be no liability before notification.

Big Tech companies argue that an obligation to remove certain content even without notification could push them into active monitoring and excessive removal to avoid liability.

All seven ministers provide that internet platforms will be liable for sponsored or boosted content even without a court order for removal or an extrajudicial notification. The assumption is that companies have prior knowledge of the content from which they profit and must therefore answer for it.

Dino, Zanin, Barroso and Gilmar include in their theses the so-called duty of care, along the lines of the European Union’s Digital Services Act, in relation to certain systemic risks. Companies would have an obligation to proactively remove posts involving crimes against children, incitement to suicide, terrorism, incitement to violence and crimes against the democratic rule of law, as well as human trafficking (each minister’s list varies).

Following the EU Digital Services Act model, platforms would need to show that they did their best to mitigate systemic risks and to deal with illicit content.

But they would not be punished for isolated or scattered posts; the measures they adopt to combat illicit content would be taken into account.

There is consensus, therefore, on some points, such as boosted content, but not on the degree of liability Big Tech should bear under the new reading of the Marco Civil.

And there is no agreement on which body will act as regulator. Who, for example, will decide whether only a few isolated pieces of illegal content slipped past a platform’s filters and whether it did what it could to mitigate systemic risk? Or whether it failed in its duty of care and can therefore be held liable?
