The TikTok algorithm is one of the industry's best-kept secrets. Even with the creation of a European transparency and accountability center in April 2021, the elements that make the Chinese platform so addictive remain unclear.
At the beginning of September, a complaint was filed against the social network. The parents of Marie, a 15-year-old who took her own life in September 2021, blame TikTok for contributing to their daughter's distress. They believe the company's recommendation system worsened her condition, pushing her to end her life.
The parents’ lawyer, Laure Boutron-Marmion, described to Franceinfo an “extremely powerful algorithm.” She believes that “through the algorithm, the teenager received a large number of videos dealing with the same topic, which could only make her feel even worse.”
Identifying each user’s preferences
Interviewed by Tech&Co in March 2023, Marc Faddoul, co-director of AI Forensics (an NGO specializing in the study of recommendation algorithms), explained how TikTok works.
As with all platforms, the goal is to keep users on the app for as long as possible. To do this, social networks try to serve content that each person likes. This is where algorithms come into play: they are used to identify a user’s preferences in order to increase how long the user keeps interacting.
But at this game, TikTok appears to be the most effective, in particular thanks to the short format of its videos. “In an hour we might see perhaps fifteen videos on YouTube, but on TikTok we will have seen several hundred,” Marc Faddoul points out.
For each video watched, the app records a great deal of information: did you watch it to the end, did you like it, did you share it, did you skip it? All of this data is used to refine the tastes that the algorithm attributes to a person.
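To make the idea concrete, here is a minimal sketch of how engagement signals of this kind could be turned into per-topic interest scores. The signal names, weights and the `update_topic_scores` function are hypothetical illustrations of the general principle described above, not TikTok's actual (undisclosed) model.

```python
from collections import defaultdict

# Hypothetical engagement weights. The signal names, weights and overall
# structure are illustrative assumptions, not TikTok's actual model.
SIGNAL_WEIGHTS = {
    "watched_full": 3.0,   # the video was watched to the end
    "liked": 2.0,
    "shared": 4.0,
    "skipped": -1.0,       # the video was swiped away quickly
}

def update_topic_scores(topic_scores, video_topic, signals):
    """Accumulate a per-topic interest score from one video's engagement signals."""
    for signal, fired in signals.items():
        if fired:
            topic_scores[video_topic] += SIGNAL_WEIGHTS.get(signal, 0.0)
    return topic_scores

# Example: one fully watched and liked video raises the "fitness" score,
# while a quickly skipped video lowers the "cooking" score.
scores = defaultdict(float)
update_topic_scores(scores, "fitness", {"watched_full": True, "liked": True})
update_topic_scores(scores, "cooking", {"skipped": True})
print(dict(scores))  # {'fitness': 5.0, 'cooking': -1.0}
```

The more videos a user scrolls through, the more such signals accumulate, which is why the short-video format described above gives the system so much data to work with.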
More engaging negative content
Added to this is a factor that contributes to the social network’s addictiveness. “The algorithm also particularly favors polarizing or controversial content, because it is the most engaging,” adds Marc Faddoul. As a result, negative content is more likely to be pushed forward.
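As a toy illustration of why this matters, consider a ranking step that simply sorts candidate videos by a predicted engagement score: if polarizing or distressing clips tend to score higher, they mechanically rise to the top of the feed. The titles and numbers below are invented for illustration only; TikTok's real ranking model is not public.

```python
# Toy ranking step: sort candidate videos purely by predicted engagement.
# Titles and engagement values are invented for illustration, not real data.
candidates = [
    {"title": "calm nature clip", "predicted_engagement": 0.31},
    {"title": "heated debate clip", "predicted_engagement": 0.58},
    {"title": "distressing confession clip", "predicted_engagement": 0.64},
]

feed = sorted(candidates, key=lambda v: v["predicted_engagement"], reverse=True)
for video in feed:
    print(f'{video["title"]}: {video["predicted_engagement"]}')
```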
It is this specificity that Marie’s parents point to. Shortly before taking her own life, the teenager, a victim of weight-related bullying, posted a video in which she spoke about her distress. According to the complaint, which is currently being examined by the Toulon prosecutor’s office, it was from this post that the TikTok algorithm began recommending other content related to depression and harassment to her.
On its site, TikTok explains that it analyzes the content viewed by its users in order to recommend other videos. But the social network does not specify that content posted by a user themselves can also influence the recommendations made by the algorithm.
The complaint from Marie’s parents rests on three grounds: “incitement to suicide”, “promotion of methods for taking one’s own life” and “failure to assist a person in danger”.
Source: BFM TV

