The aim is to shrink the comfort zone that has formed around exposing minors in the digital environment
The internet has opened up space for creativity and given a voice to many people. But it has also created an uncomfortable phenomenon: childhood has become an engagement asset.
Today it is common to see children dancing, giving opinions, recommending products, accumulating followers and, in some cases, generating revenue. Many people treat this as “light content”, but behind it lies a very real economy. Influencer marketing was estimated to be worth around US$21.1 billion in 2023, according to Influencer Marketing Hub, which helps explain why attention has become currency, and why the exposure of minors has been absorbed into the logic of scale.
It is in this environment that the so-called ECA Digital comes into play, in force since March 17th. Its main change is not “banning children from the internet”; that would be simplistic and, in practice, unfeasible. The change is to shrink the comfort zone that has formed around exposing minors in the digital environment.
A sensitive point is age. ECA Digital pushes for more effective age verification mechanisms and moves away from the logic of self-declaration in age-restricted services. This changes how platforms operate in practice: registration, access, user experience and, of course, how the service itself grows. Platforms that grew without asking many questions will have to choose between adapting their model or assuming the risk.
But the law goes beyond registration. It shifts responsibility from a reactive mode, based on reporting and removal, to a preventive logic: reducing risk before harm occurs. This is where more functional parental controls come into play, along with limits on certain features, scrutiny of design engineered to keep children glued to the screen, and barriers to the commercial use of children’s data.
This is already happening. Authorities and regulatory bodies have signaled tighter supervision and stricter demands, both on the processing of data from children and adolescents and on the protection mechanisms built into the design of digital platforms.
So far, the discussion may seem technical. But the hardest point lies elsewhere: the problem is not just inappropriate content. It is an economic model that has learned to profit from early attention.
This is where the gray area that will demand the most work comes in: child influencers.
When participation in a social network stops being spontaneous and becomes a production routine, with metrics, contracts, advertising and performance expectations, the legal question changes: where does expression end, and where does the economic exploitation of a minor’s image begin?
There is also an element usually left out of the discussion: the role of parents. In many cases, exposure is not “an accident” of the digital environment but an active decision by the family, with a recording routine, pressure for engagement, advertising deals and revenue expectations. This does not automatically turn parents into villains, but it does impose a greater duty of care. Children lack the maturity to consent to their own economic exploitation, or to gauge the impact of permanent exposure that is recorded forever. When monetization becomes the objective, the border between family life and digital child labor can blur, and that is exactly where the responsibility of adults becomes a central part of the problem.
The institutional debate has already begun to touch on this. In certain contexts, child influencing has been treated as artistic child labor, and the topic increasingly tends to move into concrete enforcement discussions, including those involving platforms and brands.
In practice, ECA Digital does not need to answer every detail to produce an effect. It changes the environment. By demanding a more active protective stance and imposing limits on data use and engagement, the law closes the door on improvisation and pushes companies toward clearer rules and controls.
What really changes for companies
There is no room for improvisation here. Brands and platforms will need to review their assumptions, and quickly.
Platforms will have to review age verification, recommendation systems, engagement mechanics, reporting flows and the protections built into their operation. Brands will need to rethink campaigns involving children: contracts, proof of consent and criteria that avoid encouraging undue exposure. And the legal department will have to sit down with marketing, product and data teams to answer simple but uncomfortable questions: who is this audience, how is it identified, what data feeds the segmentation, and what is the real risk of profiting from child exposure?
If a company cannot map this clearly, it does not have compliance. It has hope.
The risk here is not just regulatory. It is reputational, operational and, increasingly, judicial. Issues involving children mobilize public opinion quickly, and the social backlash usually arrives before the fine.
ECA Digital, therefore, is not an end point. It is the beginning of a market reorganization: more duty of care, more demand for prevention, less tolerance for monetization disguised as “content”. Companies that have not yet examined their own exposure to this risk are not being prudent. They are betting that the problem will knock on a competitor’s door before their own.
*This text does not necessarily reflect the opinion of Jovem Pan.