The Federal Supreme Court (STF) is approaching a decision that could change the rules governing the internet in Brazil.
The trial discusses the constitutionality of Article 19 of the Marco Civil da Internet, sanctioned in 2014.
The provision establishes that digital platforms can be held liable for third-party content only if they fail to comply with a court order to remove it. Only Minister André Mendonça voted for the constitutionality of the article and, therefore, for maintaining the current rule.
Minister Alexandre de Moraes voted on Thursday (12), joining the majority in favor of expanding platform liability.
Other ministers had already voted for the total unconstitutionality of Article 19, arguing that big techs can be punished if they do not quickly remove illegal posts, even without receiving a court order or a user complaint. One of them defended a special removal regime that dispenses with notification, though with caveats.
The court's president, Luís Roberto Barroso, and Ministers Flávio Dino and Cristiano Zanin adopted an intermediate approach, arguing that companies have a duty to act proactively but should not be punished if one or a few posts slip through: they would be judged by their effort to combat this type of content.
The analysis takes place amid growing pressure for greater control over misinformation, hate speech and other forms of online abuse. Depending on the final decision, the interpretation of the law could change to impose new obligations on technology companies, with a direct impact on how they operate social networks, search engines and messaging applications in the country.
Below are the main points that could change on the internet, based on the votes already cast by the ministers.
Currently, under the Marco Civil da Internet, companies can be held liable in court only if they fail to comply with a court order to remove content deemed illegal. There are exceptions, such as cases of non-consensual nudity and copyright infringement.
In the ongoing judgment, however, a majority has formed for an expansion of this regime.
The proposal is that a simple extrajudicial notification, such as a complaint a user makes directly to the platform, may be enough to require the removal of illegal content. If the content stays up, the company may be held liable in court for keeping the publication online.
If the understanding of Barroso, Dino and Zanin prevails, the current rule would be maintained for offenses and crimes against honor; that is, the platform would be held liable only if it failed to comply with a court decision.
Sponsored content
On this point, there is consensus among the ministers that internet platforms will be liable for sponsored or boosted content even without a prior court order for removal or an extrajudicial notification.
Because these companies profit directly from promoting this content, the understanding is that they have prior knowledge of it and an obligation to verify its legality.
Duty of care
Inspired by European legislation, Dino, Zanin, Barroso and Gilmar Mendes proposed that platforms have a duty of care regarding systemic risks.
This means they would be obligated to proactively remove posts involving crimes against children, incitement to suicide, terrorism, incitement to violence and crimes against the Democratic Rule of Law, as well as human trafficking (each minister's list varies).
Zanin, Dino and Barroso point out that companies should not be automatically punished for each isolated piece of content, but rather judged by the effort made to mitigate risks and remove illegal content.
Legal Representative in Brazil
The requirement that platforms maintain a legal representative in Brazil won majority support. The measure aims to facilitate dialogue with the judiciary and compliance with national legislation.
With this, foreign platforms that today operate with little or no legal structure in the country could be more easily summoned and held liable.
Regulator
The trial also raised the need to create a specific regulatory body to oversee compliance with the new obligations. However, there was no consensus among the ministers about which entity should play this role.
Questions such as how to assess compliance with the duty of care, or how to define what constitutes a systemic risk, still lack a clear answer, which could create gaps in the practical application of the new rules.
Gilmar Mendes said he considered that the National Data Protection Authority (ANPD) could play this role, since it already has responsibilities related to data protection and the impacts of new technologies.