The Federal Supreme Court (STF) adopted an intermediate position in the decision that changed the liability regime for platforms in Brazil. Neither the more radical theses, such as that of Minister Dias Toffoli, prevailed, nor the maintenance of the status quo defended by Edson Fachin, André Mendonça and Kássio Nunes Marques.
But even though the court's intermediate thesis prevailed, it is still far more radical than PL 2630, the "fake news" bill presented in 2020. In other words, the Big Techs, which spent years lobbying against that legislation, ended up with a much harsher liability regime, imposed by the judiciary.
In that bill, there was no liability after extrajudicial notification, no presumed knowledge (with liability even before any notification) of paid ads and boosted posts, and no obligation to remove content deemed a systemic risk.
"It is a very robust decision, even when you weigh the votes. It changes a lot of what is required of the platforms," says Francisco Brito Cruz, a professor at IDP.
In the decision, the view of the Supreme Court's president, Luís Roberto Barroso, and of Ministers Cármen Lúcia and Gilmar Mendes prevailed. The thesis establishes a "notice and action" regime, similar to the one in force in the European Union, for third-party content in cases of crimes or unlawful acts. Under this regime, Big Techs can be held liable for harm arising from such content if they fail to act after an extrajudicial notification, that is, a notification sent by a user or another affected party.
Today, under Article 19 of the Marco Civil da Internet (Brazil's internet framework law), providers are liable for damages arising from third-party content only if they fail to act after a court order. Mendonça, Fachin and Kássio Nunes Marques, whose votes were defeated, argued that the liability regime should remain that way.
There is only one exception, provided for in Article 21 of the law, which deals with non-consensual nudity, the so-called revenge pornography; in this case, extrajudicial notification already suffices. With the Supreme Court's decision, the range of cases that fall under Article 21 is expanded.
In addition, providers now have a duty of care: they must immediately remove content that constitutes serious crimes specified in an exhaustive list. Among them are violations of the Law of the Democratic Rule of Law, violations of the Statute of the Child and Adolescent, and racism. The fact that the list is exhaustive brings some relief to the Big Techs, which did not want an open-ended, shifting list; they had pushed for a fixed, well-delineated list of content they must remove.
In such cases, liability will not attach over a single or isolated piece of content. A systemic failure must be found, that is, evidence that the company has not, in general, adopted measures to prevent and remove these types of content.
On some issues, the worst was avoided from the platforms' point of view. In the case of crimes against honor (defamation-type offenses), the thesis that providers could be held liable based on user complaints did not prevail. The fear was that a war of complaints and a compensation industry would follow. The Supreme Court decided that crimes against honor remain under Article 19.
It was also made explicit that there will be no strict liability (liability without proof of fault or intent), which was the "nuclear bomb" option proposed by Toffoli. "There will be no strict liability in the application of the thesis stated here," the STF's decision reads.
Experts see some loopholes that may let platforms escape punishment. For example, they can avoid liability if the judiciary considers that they have taken "adequate measures according to the state of the art."
For Filipe Fenon, a professor at FGV Law Rio, in practice this is an open clause that allows companies to claim they did what they could.
Still, the STF's decision significantly reduces platforms' immunity regarding third-party content. Some Big Techs must be thinking: "Come back, fake news bill."