The hearings of the parliamentary inquiry commission on the psychological effects of TikTok are drawing to a close. After more than a hundred hearings of specialists, investigators, victims, and even influencers, it is now the platform's executives whom the 30 deputies are questioning.
At the heart of the discussions is the platform's recommendation algorithm, which numerous studies accuse of locking users into an avalanche of homogeneous, repetitive, and sometimes very dangerous content.
Fewer and fewer moderators?
And the consequences of this algorithm can be dramatic. Eleven French families have sued TikTok, accusing the social network of having recommended to minors videos promoting suicide, self-harm, or eating disorders.
The other point of tension is moderation on TikTok. The short-video app is regularly accused of not taking sufficient measures to protect its community or to prevent young people from accessing the platform, even though TikTok is off-limits to children under 13.
For this, the company relies on its moderation algorithms and, above all, its moderators. "100% of the content on the platform is moderated, and less than 1% of the content violates our terms of use. Of that content, 80% is removed automatically. Human moderation then takes over," says Brie Pegum, in charge of the technical aspects of regulation at TikTok.
These moderators therefore handle the most sensitive content, which takes longer to analyze. The problem: while videos and comments keep multiplying, the number of moderators is not growing. Case in point: 509 of them handle French-language content in Europe, 125 fewer than in the first quarter.
Numerous moderation failures
"Everything depends on local variations in content volumes. (…) We are also investing heavily to prevent anyone from seeing this content," replies Brie Pegum, pointing in particular to algorithms and AI.
On another front, TikTok maintains that only a small fraction of "problematic" content slips through the net. "Content moderation has many layers in order to achieve a maximum degree of safety," Brie Pegum continues.
The reality, however, is quite different. Even if only a small amount of content is problematic, "the algorithm highlights it," observes Laure Miller, rapporteur of the inquiry commission. Worse, the system lets "a lot of content" through.
And the deputies have no shortage of examples. Last March, the hashtag SkinnyTok promoted thinness, sometimes to an extreme degree, yet much of this content was pushed into the "For You" feed. And although nearly 100,000 pieces of content, along with the hashtag itself, have been removed by the platform, the videos continue to swarm on TikTok.
One need only change the keyword and type "thin" or "Fear Food" into the search bar to see it. "Of the first ten recommended videos, half are problematic," notes Arthur Delaporte, president of the commission, who ran the test live.
For their part, TikTok's representatives push back. "In January, our team began tracking the SkinnyTok trend," says Brie Pegum. "There will always be SkinnyTok content on the platform, because some videos do not violate the rules and encourage healthy eating."
“A moderation error”
The same observation applies to explicit references to the Holocaust and to antisemitic codes. When an association reported such videos, the platform assured it that they did not violate the community guidelines. "A moderation error," Brie Pegum acknowledges.
And while some influencer accounts are known to flirt with the gray areas of the community guidelines, or even break them outright, few of them are banned or removed. And that despite reports from users and alerts raised by several members of the government, including Aurore Bergé.
This is the case, for example, of Alex Hitchens, who multiplies sexist remarks: he advises his community to "slap" women, whom he calls "whores." After a long silence, Brie Pegum observes that "it is difficult to monitor every account and verify that these users are complying with all the rules."
Regarding live streams, the commission's parliamentarians note that some minors manage to circumvent the rules and age-verification processes to watch them, or even take part, even though the format is off-limits to users under 18. They describe a pattern of addiction and lock-in, driven by the gamification of the gifts that some young people send to influencers during streams.
One girl even managed to host live streams, and the platform paid her. Although her mother informed TikTok that her daughter was a minor, the budding creator continued to be paid by the app.
"Completely out of touch with reality"
Once again, TikTok's representatives struggle to answer. "It should not have happened (…) It is not our philosophy," admits Nicky Soo after several seconds of silence. "The team will consider how to act."
"Your answers are steeped in TikTok talking points," Laure Miller protests several times. After lengthy exchanges, the platform's representatives finally concede the point. "Obviously, some content slips through the cracks of the system; there are failures," Brie Pegum acknowledges.
On age verification, too, the deputies point to several shortcomings. Although the platform is prohibited for children under 13, several studies find that 50% of children aged 8 to 11 have an account on it.
"Age verification does not stop when the account is created," says Nicky Soo. If a user's behavior looks "suspicious," several verification methods are offered, such as a selfie with an identity document. As a result, "642,000 accounts belonging to children under 13 were banned" in 2024, the executive continues.
A method the deputies consider insufficient. "One million users under 13 are on TikTok," counters Arthur Delaporte. Again, a heavy silence fills the room. "There is no definitive solution; we take proactive measures," says Nicky Soo. "We work with external partners, with regulators, and with platforms and telecom operators to find solutions for minors."
The hearings continued this afternoon with executives from TikTok France. These various testimonies should help the 30 deputies better understand the repercussions of the platform's algorithm on young people. The commission's final conclusions will be presented no later than September 12.
Source: BFM TV
