Autoplay and infinite feed: what ECA Digital prohibits and why

The Digital Statute of Children and Adolescents (ECA Digital) is Brazil's new standard for protecting minors on the internet. Sanctioned in 2025, it came into force in March 2026 with the aim of increasing the safety of young people on online platforms.

Among its various measures, the law requires companies to verify users' ages and to adopt moderation systems and supervision tools.

Furthermore, it establishes that digital responsibility is shared: platforms, families, society and the State must all look after young people. The aim of the new measures is to prevent practices that encourage screen addiction, eroticization and other digital risks.

ECA Digital: a brief explanation

The ECA Digital (Law nº 15,211/2025) emerged after years of debate, motivated in part by reports of exploitation of children on the internet.

Driven by denunciations of the "adultization" of minors on social networks, the law provides more comprehensive guidelines on the rights of children and adolescents in the online world.

It is important to highlight that it does not replace the 1990 Child and Adolescent Statute (ECA); rather, it extends the protections the Statute provides to the digital environment.

Among its main requirements, the law determines that platforms must adopt reliable age verification to prevent children from accessing inappropriate content.

Non-compliance can result in fines of up to 10% of the company's revenue in Brazil, capped at R$50 million, and may escalate to suspension or a ban on operating in the country.

What ECA Digital prohibits and why

ECA Digital prohibits practices that manipulate young audiences through platform features. The law explains that these features exploit children's vulnerability to keep them online longer.

In official materials about the law, infinite scrolling, notifications that create a sense of urgency and obstacles to leaving a service are listed among the practices being curbed.

As the government explains in official documents, ECA Digital also aims to curb "manipulative designs" that create a false sense of urgency or dependence.

Manipulative platform design

Known as manipulative design patterns, these strategies exploit human behavior and emotional triggers to influence decisions and prolong the use of services.

Some examples include interfaces designed to look "cute" and insistent notifications in apps. In general, these are features that exploit the digital and emotional vulnerability of children and adolescents to keep them connected longer.

In a statement to CNN Brazil, Douglas Leite, platform specialist and technology partner at Licks Attorneys, states that this type of digital architecture can exploit characteristics of human behavior, such as the tendency toward repetition and the search for constant stimuli.

“In the online environment, this gains even more scale because the interface can combine design, personalization and recommendation in real time. The point of attention, especially from a regulatory point of view, is when these mechanisms start to operate more intensely on vulnerable audiences, such as children and adolescents”, Douglas explains.

Check out some manipulative design practices that are in the sights of ECA Digital:

Autoplay or automatic video playback

  • What is it: autoplay is the automatic playback of videos, which starts the next content as soon as the current one ends. This feature is present in applications such as YouTube, Facebook and TikTok and makes browsing continuous, without interruptions for the user.
  • Negative effect: By removing the need for user action, even one as simple as clicking "play," autoplay can increase screen time and make it harder to stop watching. ECA Digital aims to limit automatic playback for children and teenagers on platforms.
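The mechanism described above can be illustrated with a minimal sketch (hypothetical class and names, not any real platform's code): when autoplay is on, the "ended" event of one video immediately starts the next, so a single user action consumes the whole queue.

```python
class Player:
    """Toy video player illustrating the autoplay pattern."""

    def __init__(self, playlist, autoplay=True):
        self.playlist = list(playlist)
        self.autoplay = autoplay
        self.index = 0
        self.watched = []

    def watch_current(self):
        """Simulate watching the current video to the end."""
        self.watched.append(self.playlist[self.index])
        self._on_ended()

    def _on_ended(self):
        # With autoplay on, the next video starts with no click required.
        if self.autoplay and self.index + 1 < len(self.playlist):
            self.index += 1
            self.watch_current()

player = Player(["clip-1", "clip-2", "clip-3"], autoplay=True)
player.watch_current()
print(player.watched)  # ['clip-1', 'clip-2', 'clip-3'] — one tap, three videos
```

With `autoplay=False`, the same single action would leave only one video watched, which is exactly the pause point the law wants to preserve for young users.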

Infinite scroll/infinite feed

  • What is it: infinite scroll (infinite feed) continually loads more content as the user swipes down the page. It is very common on social networks: new posts appear without the user having to click "see more".
  • Negative effect: This constant possibility eliminates pauses during navigation, keeping the user’s attention for longer. In regulatory materials, infinite scrolling appears as one of the main examples of manipulative practices that ECA Digital seeks to restrict.
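The "no natural stopping point" effect can be sketched in a few lines (a hypothetical feed, not any platform's actual implementation): each scroll to the bottom fetches another batch, and there is never a last page.

```python
import itertools

def infinite_feed(batch_size=5):
    """Yield endless batches of post IDs; there is no 'last page'."""
    for page in itertools.count():
        start = page * batch_size
        yield [f"post-{i}" for i in range(start, start + batch_size)]

feed = infinite_feed()
for _ in range(3):       # the user scrolls to the bottom three times...
    batch = next(feed)   # ...and three more batches appear, every time
print(batch[-1])  # 'post-14' — and the feed would keep going forever
```

Contrast this with a paginated design, where the generator would eventually be exhausted and the interface would present an explicit end, a natural pause of the kind regulators want to restore.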

Urgency notifications

  • What is it: These are alerts sent by platforms with an urgent appeal to attract the user back. For example, notifications like “your friend is waiting” are used to encourage a return to the app.
  • Negative effect: These notifications create a feeling of immediacy, pressuring children and teenagers to access the service immediately.

Screen time rewards

  • What is it: These are mechanics in which the user receives "rewards", such as virtual gifts, prizes or points, for time spent on the platform.
  • Negative effect: the user is more inclined to continue browsing in the expectation of receiving a new “prize”.
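The reward loop above can be sketched with hypothetical numbers (the thresholds and prize names are illustrative, not taken from any real service): each milestone of time online unlocks a prize, so logging off always means "losing" the next one.

```python
# Illustrative thresholds: minutes online -> prize unlocked at that point.
REWARD_THRESHOLDS = {5: "bronze badge", 15: "silver badge", 30: "gold badge"}

def rewards_earned(minutes_online):
    """Return the prizes unlocked after a session of the given length."""
    return [prize for minutes, prize in sorted(REWARD_THRESHOLDS.items())
            if minutes_online >= minutes]

print(rewards_earned(10))  # ['bronze badge'] — the silver badge is 5 minutes away
print(rewards_earned(40))  # all three prizes unlocked
```

Because the next reward is always visibly within reach, the mechanic converts the decision to stop into a perceived loss, which is the vulnerability the law targets.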

Privacy controls and parental supervision

  • What is it: Many platforms intentionally hide or complicate access to privacy settings and parental controls. For example, they require clicking on “hidden” menus to adjust time limits, block content or activate security filters.
  • Negative effect: This way, it is much more difficult to control what is appropriate for children and teenagers. Therefore, the new law requires apps to offer easy-to-use parental supervision and privacy tools for parents and guardians.

Challenges of implementing ECA Digital

Implementing ECA Digital will involve technical obstacles. Some requirements, such as age verification mechanisms, still depend on technological solutions that raise doubts about effectiveness, proportionality and privacy.

Furthermore, each online service works differently, with different resources and risk levels, which makes it difficult to adopt the same guideline for all cases.

According to Douglas, another important challenge will be to transform the obligations of the law into clear inspection and compliance criteria. For him, it is not enough to create the rule, it will also be necessary to define how it will be applied in each digital environment.

“Platforms have a relevant role, without a doubt, but the effectiveness of the rule also depends on digital education, awareness and adequate use of available protection tools. In the end, the challenge is not just to monitor the law, but to make it produce concrete protection”, adds Leite.
