In a report published on Tuesday, the French affiliate of Amnesty International denounces the “spiral” effect of TikTok’s algorithm, which it accuses of amplifying users’ exposure to content related to suicide and self-harm. On that basis, the NGO has referred the matter to Arcom, France’s audiovisual and digital regulator.
“Amnesty International France has decided to contact Arcom to file a complaint under the DSA (Digital Services Act, the European regulation on digital services, editor’s note) against TikTok for failure to comply with its obligations,” Katia Roux, advocacy officer at Amnesty International France, told reporters.
New evidence…
According to the NGO’s conclusions, adolescents “who show interest in content related to sadness or psychological discomfort” are directed in less than an hour to “depressive content.”
After publishing a report in 2023 on the social network’s algorithm, Amnesty carried out new experiments in France.
This report “provides new evidence of how TikTok exposes young people on its platform to content that can be harmful, which can normalize, trivialize or even romanticize depression, self-harm or suicide,” Ms. Roux said.
Contacted by AFP, the social network argued that, “without taking into account how real people use TikTok, this ‘experiment’ was designed to achieve a predetermined result.”
Arcom confirmed to AFP that it had received the NGO’s referral. The authority “intends to communicate to the European Commission (…) any evidence in its possession that reveals, after an investigation, a violation by the platform, in France, of its obligations under the DSA,” it added.
In February 2024, the European Commission opened an investigation into TikTok over alleged deficiencies in the protection of minors.
“Suicidal thoughts”
The NGO created three fake profiles of 13-year-old teenagers on TikTok and spent several hours scrolling through the personalized feed, known as “For You”, lingering on content that evoked “sadness or mental health problems.”
“Within 15 to 20 minutes of the start of the experiment, all three feeds contained almost exclusively mental health videos, with up to half of them featuring sad and depressive content. On two of the three accounts, videos expressing suicidal thoughts appeared within 45 minutes,” the report says.
Twelve automated accounts were then created in partnership with the Algorithmic Transparency Institute, seeded with the viewing history of the first three accounts.
The NGO again observed a growing share of mental health content, although the effect was less pronounced than on the manually operated accounts.
“TikTok has not taken adequate measures to identify and prevent the risks to which the platform exposes young people,” said Katia Roux, citing a failure to comply with the obligations the DSA has imposed since August 2023.
The social network said it works to “proactively offer a safe and age-appropriate experience for teenagers.” “Nine out of ten videos that violate our rules [are] removed before even being viewed,” TikTok insisted.
Source: BFM TV
