
The European Commission has issued a stark warning to TikTok, Facebook and Instagram: in its preliminary findings, the European Executive says that none of these popular digital platforms adequately guarantees the ability to investigate and report potentially harmful illegal content, such as child sexual abuse or terrorist material. In the case of Meta's services, Facebook and Instagram, Brussels also criticizes how difficult it is for users to challenge a content moderation decision, including the suspension of their account.
With the Digital Services Act (DSA), the EU seeks to better protect consumers and their fundamental rights online, combating illegal content and demanding more transparency from large platforms. Since the regulation entered into force, Brussels has opened several investigations into various platforms for alleged violations of its provisions. The DSA has been sharply criticized by Donald Trump's administration, which has gone so far as to describe it as censorship, a characterization Brussels categorically denies: what it seeks, it explains, is to preserve users' freedom of expression in the face of "unilateral" moderation decisions by these platforms, which are controlled by large technology companies.
The case that has now led the Commission to issue this stern warning to TikTok, Facebook and Instagram does not concern the illegal content these platforms may distribute, but rather the mechanisms they provide for users to flag such material when they come across it: the Commission considers them too opaque to guarantee their effectiveness.
According to the preliminary investigation, the three platforms have procedures and tools so "cumbersome" that they deny researchers "adequate access" to public data, which, Brussels charges, "leaves researchers with partial or unreliable data." And this is not trivial: "Allowing researchers access to data from platforms is an essential transparency obligation under the DSA, as it facilitates public scrutiny of the potential impact of platforms on our physical and mental health," the Commission notes in a statement.
In the case of Facebook and Instagram, both owned by Meta, European investigators also believe that the company is failing to comply with its obligation to guarantee "simple mechanisms for reporting illegal content." The systems it has in place are "confusing" and "dissuasive," because Meta imposes "several unnecessary steps and additional demands on users," complicating a process it should be facilitating.
In addition, Brussels considers that on its platforms Meta is also violating the right, guaranteed to European users by the DSA, to challenge a content moderation decision, such as the reasons why an account has been suspended. Specifically, the appeal mechanisms of these platforms "do not appear to allow users to provide explanations or evidence to support their appeals," the Commission notes, a situation that, it explains, "makes it difficult for EU users to explain in more detail why they do not agree with Meta's decision on content, which limits the effectiveness of the appeal mechanism."
The findings are not yet definitive, and the platforms concerned now have the opportunity to respond and submit their own evidence. But the step taken by Brussels this Friday is much more than a wake-up call: when the European Executive publishes preliminary conclusions, it is because it considers that it has sufficient evidence of an infringement, based in this case, EU sources say, on "hundreds of complaints" received. The investigations were opened in February (TikTok) and April (Meta) of 2024.
“Our democracies depend on trust. That means that platforms must empower users, respect their rights and open their systems to scrutiny,” says Henna Virkkunen, the Commissioner for Technological Sovereignty. The objective of the investigation is to ensure that these services are answerable to users and society, as EU legislation guarantees, the statement adds.
If the platforms fail to convince the Commission and it ultimately concludes, on a date not yet set but which in any case will not come until well into 2026, that its preliminary findings were correct, it may issue a "non-compliance decision." This, Brussels notes, can carry a fine of up to 6% of the provider's total annual worldwide turnover. The European Executive can also impose periodic penalty payments to force a platform to comply with the regulation.