Almost 18 months after an Amnesty International investigation warned of the risk of being drawn into a “black hole” of toxic content on TikTok – related to depression, suicide, extreme diets and body image – the organization accuses the platform of continuing to neither acknowledge the problem nor present new measures to solve it.
The investigation simulated the behavior of 13-year-olds and revealed that, only 20 minutes after creating an account and showing interest in mental health topics, the “For You” feed began to present a high volume of content on depression and suicide – some of it romanticizing or encouraging these themes.
According to Amnesty International, in a statement, the platform has neither acknowledged the problem of “black hole” content related to depression and suicide nor shown that it has implemented new specific measures to address it.
In its 2024 risk assessment, TikTok admits that repetitive content can negatively affect mental well-being, especially among younger users, but continues to rely on existing technical solutions, such as content filters and feed reset.
Amnesty also highlights the way TikTok collects and uses user data, including signals of their emotional state, to personalize content – something that, according to the organization, violates the right to privacy.
The platform denies this, but admits to using machine learning algorithms to avoid repeated recommendations of sensitive themes – which, according to the organization, has not solved the problem identified in 2023.
“Amnesty International’s suggestion that TikTok is somehow aware of and uses users’ emotional state to recommend content is a mischaracterization of how our platform operates,” the platform said in response to the NGO’s analysis.