The Lula administration (PT) is discussing a new bill to regulate digital platforms and define companies' responsibility for the content published on their networks. The proposal establishes criteria for removing posts that violate existing laws and for combating hate speech and mass disinformation.
Conversations about the text began in recent weeks, in the wake of changes announced by Meta for its platforms, such as Facebook and Instagram. In its first two years in office, the government tried to get a proposal on the subject through Congress, but failed.
The new bill under study provides that platforms would be subject to a duty of precaution, similar to the European "duty of care" model, with the obligation to remove content considered criminal without the need for a court decision. It would fall to the government to oversee companies' general compliance with the rules.
The central line of the bill stipulates that control over illicit content, from violations of consumer law to pedophilia, would be exercised by the platforms themselves. The government's intention, on this point, would be to restrict moderation to crimes already defined in Brazilian law, in an attempt to reduce resistance from groups that describe regulation as a path to censorship.
The text, however, paves the way for platforms to be required to act on posts containing "misinformation about public policies." Companies would have a duty to act upon receiving extrajudicial notifications, as well as to combat the mass distribution of such material.
The definition of misinformation is a critical point in discussions about digital platform regulation. Opponents of such proposals argue that providing for this kind of content control would give governments a tool to silence their critics.
A draft of the new proposal under discussion was presented last Friday (24) to a working group that includes the Casa Civil, the AGU (Attorney General of the Union), the CGU (Comptroller General of the Union), and Secom (Secretariat of Social Communication).
There are still disagreements over the content and the political path to follow. Once there is consensus on the merits, Lula's team will decide whether to send the new proposal to Congress or to fold its ideas into a bill by opposition lawmakers; the favorite is a text by Deputy Silas Câmara (Republicanos-AM).
The direction chosen by the government will also depend on the conclusion of the judgment under way at the STF (Federal Supreme Court). The Court is to decide whether big techs can be held responsible for third-party publications even when there is no court decision.
The new Lula administration proposal was initially named the Legal Framework for the Protection of Digital Services Users. Details of the project under discussion were obtained by Folha from officials in four ministries taking part in the working group.
According to the text under discussion, platforms would have three main obligations. In addition to the duty of precaution, covering the removal of criminal content, they would have to act more broadly to reduce systemic risks, which would encompass the distribution of misinformation and hate speech.
In addition, companies would need to be transparent about their terms of use, the operation of their content recommendation algorithms, and their moderation reports.
The proposal determines that individual content control (that is, the evaluation of publications) be carried out by the platforms themselves, based on the rules defined by law.
The government, in turn, would create a committee to oversee the general behavior of each platform and determine whether it is following the established criteria. This group would have the role of holding companies liable and punishing them in cases of omission in content control, which would give a state body some power over the operation of the platforms.
The government has not yet defined which agencies would sit on this committee. Possible participants include Anatel (National Telecommunications Agency), the ANPD (National Data Protection Authority), and Cade (Administrative Council for Economic Defense), the latter to monitor market concentration in the sector.
The text under discussion in the government provides for three levels of responsibility. Beyond self-regulation (the individual removal of illicit content) and extrajudicial notifications, platforms would be required to act only upon a court decision in cases involving journalistic content, reputation protection, and offenses to the honor of public officials.
The government will still discuss the creation of specific rules for election periods, with the goal of defining an accelerated procedure for content moderation during campaigns; the theme caused controversy when the Superior Electoral Court published a resolution establishing a two-hour window for the removal of content deemed untrue.
It also remains to be clarified what the scope of regulation will be (social networks, e-commerce, search tools, and instant messaging) and what sanctions could be applied to platforms that fail to comply with the new rules (warnings, fines, or suspension).
See the main points of the text under discussion in the Lula administration:
- Platform obligations
1) Duty of precaution and prevention: liability similar to the "duty of care" in European legislation, with control of content considered unlawful under current law;
2) Reduction of systemic risks: duty to combat mass disinformation, hate speech and content considered extremist;
3) Transparency: disclosure of terms of use, recommendation algorithms, reports on moderation and external audits.
- Three levels of responsibility
1) Self-regulation: platforms are responsible for the individual removal of illicit content;
2) Extrajudicial notifications: companies would have to act when notified of episodes of public-policy disinformation;
3) Judicial decisions: platforms would have a duty to act on journalistic content, reputation protection, and offenses to the honor of public officials only by court order.
- State supervision
A committee of government bodies would have the function of overseeing the general behavior of each platform and punishing companies in cases of omission.