A world-first measure comes into force in December, requiring platforms like TikTok and Instagram to deny access to children and teenagers
From December 10th, Australia will require blocking access to social networks for children under 16 years of age. The new legislation, considered a global pioneer, obliges digital platforms to prevent the creation of new accounts by this age group and to remove existing profiles. The responsibility for complying with the law lies with the technology companies (the so-called big techs). Platforms like TikTok, Instagram, YouTube, X (formerly Twitter), Snapchat and Threads will have to adopt strict age verification systems and apply “reasonable measures” to block access by minors.
The central objective of the measure is to protect the mental health and safety of young people. The Australian government seeks to reduce children’s exposure to harmful content and other online risks. Official data presented by the authorities indicate that 96% of Australian children between 10 and 15 years old use social media. According to the government, the vast majority of this group has already been exposed to risks in the online environment.
Challenges and criticism
Despite its protective intent, the proposal is not unanimous. Technology and digital rights experts have raised concerns about the effectiveness of the measure. The main criticisms involve:
- Technical difficulties: The complexity of creating age verification systems that are not easily circumvented.
- Privacy: The risk that the massive collection of data to confirm the age of users compromises the privacy of the entire population.
*With information from Bruno Meyer
