The Supreme Federal Court (STF) decided on Thursday (26) to expand the obligations of digital platforms in Brazil by declaring Article 19 of the Marco Civil da Internet partially unconstitutional.
The provision, in force since 2014, established that social networks could only be held liable for content posted by users if they failed to comply with a court order to remove it.
The trial examined whether this protection was excessive and left users unprotected against harmful content.
By 8 votes to 3, the justices decided that the current rule does not adequately protect fundamental rights and democracy, and created new obligations that take effect immediately but apply only to future cases.
Below are the main points that change how platforms operate in the country.
List of serious content subject to mandatory removal
The Supreme Court created a list of content that platforms must remove proactively, even without a court order:
- Attacks on democracy and crimes against the democratic rule of law
- Terrorism
- Inducement to suicide
- Discrimination based on race, religion or sexual orientation
- Violence against women
- Sexual crimes against children and child pornography
- Human trafficking
Companies will not be punished for isolated posts that slip through, but for “systemic failure” – failing to adopt adequate measures to prevent and remove such content.
According to the decision, a systemic failure occurs when a platform fails to act “responsibly, transparently and cautiously” in moderation. Platforms must use the “highest levels of security” technically available for their activity.
The decision does not specify which body will be responsible for assessing whether there was systemic failure.
Platforms become liable after extrajudicial notification (without a court order) for content that constitutes a crime or unlawful act, except for crimes against honor.
For crimes against honor, such as slander, defamation and insult, a court order is still required for liability, although the platform may remove the content voluntarily after extrajudicial notification.
The new rule also applies to accounts reported as fake or inauthentic, which must be reviewed after notification.
When a court decision has already recognized content as offensive, reposts of that content must be removed by all platforms after simple notification, without a new court order.
Sponsored content
Boosted posts and paid advertisements carry automatic platform liability, regardless of notification.
Because companies profit directly from this content, the STF held that they must verify its legality in advance. If the content is unlawful, the platform is liable even without being notified.
Providers will be exempt from liability if they prove that they acted diligently and within a reasonable time to make the content unavailable.
Compulsory legal representative
All platforms operating in Brazil must have a registered office and a legal representative in the country, with powers to respond in court, comply with judicial orders and pay fines.
The measure aims to make it easier to hold accountable foreign companies that currently operate without a legal structure in Brazil.
Self-regulation
Platforms must create their own self-regulation systems, with notification channels accessible to users and non-users, processes for analyzing notifications, and annual transparency reports on removals.
Application
Email services, video conferencing tools and private messaging apps (such as WhatsApp) continue to follow the current rule – they can only be held liable after a court order, because they are protected by the confidentiality of communications.
Marketplaces remain subject to liability under the Consumer Protection Code.
The new rules take effect immediately but apply only to future cases. Ongoing proceedings and decisions that have already become final (res judicata) will not be affected.
The Supreme Court also urged Congress to enact more detailed legislation on the subject.