
Moderation, algorithm, protection of minors: we visited TikTok's transparency center… which is not so transparent

Under pressure from regulators in the United States, Europe and France, the short-video app is trying to restore its image. To that end, it opened the doors of its transparency and moderation center in Dublin to a handful of French journalists.

It is a last-chance charm offensive for TikTok. This time, the social network is playing the transparency card, inviting a few representatives of the French press to Dublin. It is there, in the Docklands district, that the short-video application decided to establish its transparency center, a kind of European headquarters. Or rather, its attempt to prove its good faith to regulators.

A discreet building

The “Transparency and Accountability Center,” or TAC for short, opened a year and a half ago, the fourth of its kind after Los Angeles, Washington and Singapore. The European version is quite discreet. There is no TikTok logo on the tall tower, and filming in front of the building is strictly prohibited. “Even video calls in the corridors,” a staff member slips in. The TAC’s address cannot be found on the Internet.

Despite all this mystery, TikTok is adamant: the transparency center lets visitors learn a little more about how the application works. “The TAC is an opportunity to dive into how the algorithm and our moderation system work,” says Valiant Richey, global head of outreach and partnerships, trust and safety at TikTok.

The words are chosen carefully. It must be said that in recent months, hot-button issues have piled up for the social network. Threatened with a ban in the United States, the Chinese-owned application is also under close scrutiny in France. In mid-September, the parliamentary commission of inquiry into TikTok sounded the alarm over the “devastating” effects of the social network on the mental health of young people. Its moderation was judged “insufficient and negligent”… as were the responses from TikTok’s leaders. On the back of that finding, the deputies proposed, among other things, banning social networks for children under 15.

So in Dublin, nothing is left to chance. Colorful presentation videos abound and the script is well rehearsed. “Like other large platforms, we rely on both automated and human moderation,” Valiant Richey is quick to point out.

Before it is published on TikTok, every video passes through an automated filter. The tool determines whether the images, text or audio violate the platform’s rules, based on hundreds of parameters and taking context into account. If a weapon is detected, for example, or if a person is naked, the video is removed.
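TikTok does not disclose how this filter works internally. As a rough mental model only, with every signal name and threshold below invented for illustration, such a pre-publication check can be pictured as a rules layer sitting on top of classifier scores:

```python
from dataclasses import dataclass

# Hypothetical classifier scores (0.0-1.0) that upstream image, text and
# audio models would attach to an upload; the real system reportedly uses
# hundreds of parameters.
@dataclass
class VideoSignals:
    weapon_score: float        # likelihood a weapon is visible
    nudity_score: float        # likelihood of nudity
    educational_context: bool  # e.g. flagged as news or documentary footage

def pre_publication_check(s: VideoSignals) -> str:
    """Return 'remove', 'human_review' or 'publish' for an upload."""
    # Clear-cut violations are removed automatically.
    if s.nudity_score > 0.95:
        return "remove"
    # Context matters: a weapon in apparent news footage goes to a human.
    if s.weapon_score > 0.90:
        return "human_review" if s.educational_context else "remove"
    # Borderline scores are escalated rather than silently published.
    if max(s.weapon_score, s.nudity_score) > 0.60:
        return "human_review"
    return "publish"

print(pre_publication_check(VideoSignals(0.97, 0.01, True)))   # human_review
print(pre_publication_check(VideoSignals(0.02, 0.99, False)))  # remove
```

A real system would of course involve far more signals and machine-learned thresholds rather than hand-written rules.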

600 French-speaking moderators

In the first quarter of 2025, the social network deleted 211 million videos. 87% of them were removed by automated systems; the rest were left to human eyes to assess.

In total, the company employs 4,600 moderators dedicated to European languages, including 600 for French-language content in the EU. A figure that keeps shrinking. The trend may seem surprising since, according to Adam Rows, head of outreach and partnerships, “humans are better at analyzing context or cultural specificities.” They take over from the automated systems on online hate topics, for example.

Tech & Co was able to step into the shoes of a moderator for a few minutes. If a video of two young girls turning their fingers on is undoubtedly prohibited, the choice of other videos is more difficult. Is eating frozen honey dangerous for your health? Maybe. However, the system does not indicate that the video breaks the rules.

Some users also amuse themselves by gaming moderation, tweaking certain keywords to avoid detection by the algorithm. On TikTok, “sex” becomes “seggs” and “dead” becomes “not alive.” So many signals that can trip up moderators, who do not have all day to decide.
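TikTok does not say how it handles these evasions. A common generic countermeasure, sketched here with a made-up lexicon and function names of our own, is to normalize known “algospeak” spellings before keyword filters run:

```python
import re

# A tiny, invented lexicon of evasive spellings ("algospeak") mapped back to
# the terms a keyword filter actually looks for.
ALGOSPEAK = {
    "seggs": "sex",
    "not alive": "dead",
    "unalive": "dead",
}

def normalize(text: str) -> str:
    """Rewrite known evasive spellings so downstream filters can match them."""
    out = text.lower()
    for evasive, canonical in ALGOSPEAK.items():
        # Word boundaries avoid rewriting substrings of unrelated words.
        out = re.sub(rf"\b{re.escape(evasive)}\b", canonical, out)
    return out

print(normalize("He is not alive"))  # -> "he is dead"
```

In practice such tables go stale quickly, which is one reason human reviewers remain in the loop.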

It is impossible to know how many videos the moderators examine per day, or how much time is devoted to each piece of content. “It can last a few seconds,” TikTok concedes. “It’s difficult to give a specific number of videos; you have to strike the right balance between efficiency and well-being. But they do get breaks,” says Adam Rows.

This system reportedly allows TikTok to remove 99% of offending content before it has even been viewed. The margin of error seems negligible. Yet 1% of content is still a considerable mass, more than two million of the 211 million videos deleted in the first quarter alone… especially when you know that no fewer than 270 videos are published every second on the platform.

A platform prohibited for children under 13 years of age?

Enough to let devastating trends through, like #skinnytok. Last April, the algorithm was found to be recommending content promoting thinness, sometimes extreme thinness, to several very young users.

The visit, precisely, dwells on the protection of minors, the central point of the TikTok commission’s discussions in the National Assembly. A number of restrictions therefore apply depending on age. The accounts of 13- to 15-year-olds, for example, are private by default, with private messages blocked, livestreams banned and screen time limited to 60 minutes per day; beyond that, a parental code is required. Users under 18 cannot buy or receive virtual gifts.

The application is off-limits to children under 13. At least in theory, since registration rests on a simple declaration: a minor is free to enter a false date of birth, and no proof is requested. In France alone, three quarters of children under 13 in 2025 (those born after 2012) say they regularly use social networks, including TikTok, according to the Born Social study published last September.

Questioned about this loophole, TikTok deflects. The company points out that its algorithms “scan videos posted online by accounts suspected of belonging to minors and analyze their online behavior, such as likes or videos viewed,” lists Valiant Richey. Thanks to this method, 21 million profiles have been deleted.
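TikTok has not detailed how such behavioral signals are combined. Purely as an illustration, with every signal, weight and threshold below invented by us, an age-estimation heuristic of this kind might aggregate weak clues into a score that flags accounts for human review:

```python
# Every signal, weight and threshold here is invented for illustration.
def minor_likelihood(liked_topics: set[str], avg_session_hour: float,
                     bio: str) -> float:
    """Aggregate weak behavioral clues into a 'likely under 13' score."""
    score = 0.0
    childlike = {"cartoons", "school", "toys"}          # invented topic list
    score += 0.4 * len(childlike & liked_topics) / len(childlike)
    if 16 <= avg_session_hour <= 18:                    # active after school
        score += 0.2
    if any(w in bio.lower() for w in ("grade", "collège", "yo")):
        score += 0.4
    return min(score, 1.0)

# Accounts scoring above a threshold would be queued for human verification.
print(minor_likelihood({"cartoons", "school"}, 16.5, "7th grade"))  # ~0.87
```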

Limited efforts despite the DSA

The last hot topic: the algorithm, a secret jealously guarded by the company. Considered “highly addictive,” the system serves users the videos best suited to their tastes in order to keep them on the app as long as possible. A mechanism accused of locking users into a “filter bubble,” that is, of only suggesting content similar to what they have already seen. The phenomenon can become particularly dangerous when the videos concern suicide, self-harm or hate speech.

“If our teams identify a ‘filter bubble,’ we make sure new videos are injected into the feed to diversify the content,” says Adam Rows, who does not forget to list the various options users have to customize the application. “Users can reset their recommendations or filter out certain words or hashtags.”
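The company gives no specifics on this injection mechanism. One plausible reading, sketched below with heuristics and names that are entirely our own, is to measure how homogeneous a user’s recent feed is and blend in off-topic videos once it crosses a threshold:

```python
from collections import Counter
import random

def dominant_share(recent: list[str]) -> float:
    """Fraction of the recent feed occupied by its single most common topic."""
    return Counter(recent).most_common(1)[0][1] / len(recent)

def next_batch(recent: list[str], pool: dict[str, list[str]],
               threshold: float = 0.7, inject: int = 3) -> list[str]:
    """Serve the next videos, diversifying if the feed looks like a bubble."""
    top = Counter(recent).most_common(1)[0][0]
    batch = pool[top][:10]                     # pure personalization baseline
    if dominant_share(recent) >= threshold:
        # Bubble detected: replace the last slots with off-topic videos.
        others = [v for t, vs in pool.items() if t != top for v in vs]
        batch[-inject:] = random.sample(others, inject)
    return batch

recent = ["diet"] * 8 + ["music", "sports"]    # 80% one topic -> bubble
pool = {"diet": [f"diet{i}" for i in range(10)],
        "music": ["m1", "m2"], "sports": ["s1", "s2"]}
print(next_batch(recent, pool))                # last 3 slots are off-topic
```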

Nothing very revolutionary: very large online platforms, a category that includes TikTok, have been required to actively combat filter bubbles since the European Digital Services Act (DSA) came into force.

A regulation the platform respects only more or less, it seems. The short-video app is in fact under investigation by the European Commission for suspected non-compliance with the DSA. The concerns of Europe’s digital watchdog relate in particular to “the protection of minors,” “access to data for researchers,” as well as the “risks linked to the addictive design” of the platform and “harmful content.” In the event of a violation, the company could be fined up to 6% of its annual worldwide turnover.

The platform is therefore trying to get ahead of the problem. Servers meant to host part of the recommendation algorithm are being installed at the TAC. “They will be accessible to organizations and governments that we approve,” says Valiant Richey. A real effort, touching on a strategic and particularly opaque point. But the “code” behind TikTok will obviously not be open to everyone, so the audience will, logically, remain limited.

But behind this showcase, TikTok sometimes shows a certain reluctance. At the end of the visit to the Transparency and Accountability Center, we are reminded that the National Assembly’s TikTok commission had been invited to tour it. At the time, the Chinese social network wanted to demonstrate its good faith. Except the French elected officials declined the offer: why visit the TAC after being denied access to a real moderation center in Portugal? They concluded that TikTok offers transparency with variable geometry: controlled, by half measures, and only when it suits the company. That kind of transparency has another name: opacity.

Author: Salome Ferraris
Source: BFM TV
