The European Parliament is calling for ambitious EU-wide measures to protect minors online, including a digital minimum age of 16 and a ban on the most harmful addictive practices, in a resolution adopted on Wednesday.
MEPs adopted the non-legislative report with 483 votes in favor, 92 against and 80 abstentions, expressing deep concern about the physical and mental health risks minors face online. They call for better protection against manipulative strategies that fuel addiction and undermine minors' ability to concentrate and engage with online content in a healthy way.
To help parents manage their children's digital presence, the EP proposes an EU-wide harmonized minimum age of 16 for access to social media platforms, video-sharing services and AI companions. Under the resolution, minors aged 13 to 16 would have access only with parental consent, the EP said in a press release.
MEPs welcomed the European Commission's work on an EU age-verification app and the European digital identity (eID) wallet, and insisted that age-verification mechanisms must be accurate and protect minors' privacy. In their view, such systems do not relieve platforms of their responsibility to ensure that their products are safe and age-appropriate by design.
To encourage compliance with the EU Digital Services Act (DSA) and other applicable legislation, the EP suggests that senior managers could be held personally liable in cases of serious and persistent non-compliance, particularly regarding the protection of minors and age verification.
Parliament calls for a ban on the most harmful addictive practices and for other design features that foster addiction in minors, such as infinite scrolling, autoplay, pull-to-refresh and reward or incentive systems that push continued use, to be disabled by default.
The EU legislature also calls for non-compliant websites to be blocked and for the forthcoming Digital Fairness Act to include measures against persuasive technologies, such as targeted advertising, influencer marketing, addictive design and deceptive interfaces (dark patterns).
The resolution further proposes banning engagement-based recommendation systems for minors, extending the Digital Services Act to online video platforms, and banning "loot boxes" and other randomized in-game features, including in-app currencies, wheels of fortune and pay-to-progress mechanisms.
The resolution also seeks to protect minors from commercial exploitation, including by prohibiting platforms from offering financial incentives to child influencers.
The EP also considers it essential, according to the adopted resolution, to take urgent measures to address the ethical and legal challenges raised by generative AI tools, including deepfakes, companion chatbots, AI agents and AI-powered "nudify" apps that can produce manipulated images without consent.
"I am proud that this Parliament stands united in protecting minors online. Together with firm and consistent enforcement of the Digital Services Act, these measures will dramatically raise the level of protection for children. We are finally drawing the line. We are telling the platforms clearly: your services are not designed for children. And the experiment ends here," said the Danish rapporteur Christel Schaldemose (S&D) during the plenary debate on Tuesday evening.
The EP report cites research showing that 97% of young people go online every day and that 78% of 13- to 17-year-olds check their devices at least once an hour. At the same time, one in four minors displays problematic or dysfunctional smartphone use, with behavioral patterns resembling addiction. According to the 2025 Eurobarometer survey, more than 90% of Europeans believe urgent action is needed to protect minors online.
