More than 24 hours after the June 8 stabbing attack in Annecy, videos of it continue to circulate on Twitter. Despite removal requests and content filters, the detection algorithms remain easy to bypass.
The Minister Delegate for the Digital Transition and Telecommunications, Jean-Noël Barrot, declared Thursday on Twitter that “the government is in contact with the Twitter France teams to ensure the removal of any shocking image.”
Camille Chaize, spokesperson for the Ministry of the Interior, appearing Thursday night on TMC, nevertheless acknowledged that the removal “is not done” and “is always complicated.”
In practice, governments have little leverage over the platforms. And if these videos are still online, two factors explain it, according to Olivier Ertzscheid, professor and researcher in information and communication sciences at the University of Nantes, interviewed by Tech&Co.
First, there is the lack of moderation and Twitter’s unwillingness to remove these videos. “This is explained in particular by the layoffs in the moderation teams, and although these were never very large, it is not Twitter’s priority,” he explains. “Human moderation is largely insufficient in terms of numbers.”
Second, it is difficult to delete these videos once and for all “because it is easy to fool the algorithms,” the researcher continues.
The algorithms do not detect that the same video has been republished in a slightly different version. “This fools the unique identification system associated with each video.” For example, if an original video is trimmed by a few seconds and then reposted, Twitter’s algorithm cannot recognize it as the violent video that was previously removed. The same goes for a change of soundtrack.
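The article does not say which identification system Twitter uses. A minimal Python sketch, assuming the simplest form of such a system, an exact-match blocklist of file hashes, shows why even a trivial edit defeats it; the file contents and blocklist here are illustrative placeholders:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint: the SHA-256 digest of the file bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist of fingerprints of previously removed videos.
blocklist = {fingerprint(b"bytes of the removed clip")}

# Re-uploading the identical file is caught...
print(fingerprint(b"bytes of the removed clip") in blocklist)  # True

# ...but trimming a few seconds (simulated here by dropping bytes)
# or re-encoding with different music yields an entirely new digest.
print(fingerprint(b"of the removed clip") in blocklist)        # False
```

Production systems generally rely on perceptual rather than cryptographic hashes, which tolerate some alterations, but trimming, cropping, or swapping the audio track can still push a clip outside the similarity threshold, which is exactly the evasion the researcher describes.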
“Cloudy and Opaque”
A violent video tends to draw more reactions from users and is therefore surfaced in the “For You” feed. “Social networks are first and foremost a community of emotions,” Olivier Ertzscheid recalls.
As for moderation, “the internal process is still quite cloudy and opaque.” The algorithm is fallible: it analyzes videos frame by frame and does not take context into account. Despite numerous reports about the attack videos, Twitter may consider that they do not violate the platform’s rules; the algorithm may mistake the footage for a movie scene.
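To make the “frame by frame, no context” limitation concrete, here is a minimal sketch assuming a hypothetical per-frame classifier; the scoring function and threshold are invented for illustration. Each frame is judged in isolation, so nothing about captions, the uploading account, or current events reaches the decision:

```python
from typing import Callable, Iterable

def flag_video(frames: Iterable[bytes],
               score_frame: Callable[[bytes], float],
               threshold: float = 0.8) -> bool:
    """Flag a video if any single frame scores above the threshold.

    Each frame is scored in isolation: no caption, account history,
    or news context is available, so staged fiction and real footage
    of an attack look identical to this check.
    """
    return any(score_frame(frame) > threshold for frame in frames)
```

With only pixels to go on, such a check cannot tell a film excerpt from real violence, and the error cuts both ways: real footage of an attack may be dismissed as fiction, as the researcher suggests.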
On Facebook, moderation can be even more difficult when, for example, the clip is a screen capture of the video, which complicates the detection of violent content.
The relationship between the government, which calls on Twitter to do what is necessary, and the platforms is above all a matter of “discretionary power.” Removing this content is the platforms’ responsibility, although they can be prosecuted if they fail to take down flagrantly illegal content that has been reported to them.
Olivier Ertzscheid regrets that services like Pharos, the French government’s platform for reporting illegal online content, have only 60 staff. “A totally under-resourced operation given what is at stake,” he says with irritation.
He stresses, however, that the European Digital Services Act (DSA) could strengthen the platforms’ obligations toward states.
Source: BFM TV
