
Lindsay’s suicide: what responsibility do social networks have in cases of harassment?

Lindsay’s family is targeting Facebook for failing to moderate and combat hate speech. In the past, the platform has already been the subject of complaints in cyberbullying cases.

What responsibility do social networks have in cases of cyberbullying? Among the four complaints filed by the family of Lindsay, a 13-year-old schoolgirl who took her own life on May 12, one targets Meta, the parent company of Facebook and Instagram. Its two French entities are accused of “failure to assist a person in danger”.

Interviewed by Tech & Co, the family’s lawyer denounces “enormous failures” by social networks regarding “content moderation and the fight against hate speech directed at the victim”, both before her suicide and after. Instagram is also singled out by Maître Pierre Debuisson, who cites an account on which “Lindsay is finally dead” was written.

Contacted by Tech & Co, a Meta spokesperson responded: “We send our deepest condolences to Lindsay’s family and loved ones. We do not tolerate bullying and harassment on our platforms and continue to take action against content and accounts that violate our policies when we become aware of them.”

“A space of anarchy”

“We are going to put pressure on social networks much more forcefully,” said the Minister of Education, Pap Ndiaye, a guest on BFMTV this Thursday.

He also stated that his ministry will consult “with others” in order to “act extremely firmly when it comes to social media.”

“Cyberbullying never stops,” recalls Samuel Comblez, director of operations at the E-Enfance association, interviewed by BFMTV. “Where social networks bear responsibility is that, from the moment content is reported, they are obliged to delete it or even close accounts, and it is difficult to obtain answers from them.”

He also points out that in “cyberbullying, the longer we wait, the more content will continue to be disseminated.” “There is no reason for the digital space to be a space of anarchy,” he insists.

Other complaints against Meta

This is not the first time that Meta has faced allegations related to cyberbullying. In France, a complaint was filed against Facebook in 2019 in connection with a case of cyberbullying at a school in Nîmes. Students were accused of having spread pornographic images and montages of their victims on social networks. Messenger and Snapchat were also the subject of a complaint for “dissemination of images and photographs of a pornographic nature and complicity in the corruption of minors.”

In October 2022, the British justice system recognized the responsibility of social networks in the suicide of 14-year-old Molly Russell, five years earlier. The investigators tasked with establishing the cause of death concluded that the “negative effects of online content” had “contributed” to her death. Some of that content praised acts of self-harm committed by young people.

Facebook and Pinterest were found partially responsible, but were not fined and no charges were filed.

In the United States, in January, a mother filed a complaint against Instagram after her daughter’s suicide. Her daughter had become addicted to social media, and the complaint claims that Facebook and Snapchat “knowingly and deliberately designed, manufactured, and marketed social networks that are excessively dangerous because they are intended to be addictive to underage users.”

Twitter is the subject of a complaint filed in April by Magali Berdah for complicity in aggravated harassment. The influencers’ agent is demanding, in particular, the permanent deletion of Booba’s account in a feud that has pitted the two personalities against each other for several months.

Social networks say they are taking action

In 2021, Facebook announced that it had strengthened its fight against online bullying by tackling mass harassment. The platform can now remove harassing private messages or comments based on context and additional information.

On Instagram’s side, the platform has abandoned its plan to create a version of the network reserved for younger users. For the past two years, direct message requests containing offensive language are supposed to be hidden outright and kept out of the main inbox. An artificial-intelligence tool was launched in 2019 to warn users who are about to post abusive messages.

On Twitter, it is possible to ask the social network to remove photos or videos posted without a person’s consent. “This update will allow us to take action on content that is not explicitly abusive but has been shared without the consent of the person who appears in it,” the company said in a press release.

Except that shortly after his arrival at the helm of Twitter, Elon Musk fired a large part of the moderation teams. At the end of 2021, the platform was already suffering from a lack of human resources, with fewer than 2,000 moderators worldwide for about 400 million users, that is, one moderator for every 200,000 users.

Investigators regularly ask Twitter to identify harassers by their IP address, but the platform, based in Ireland, does not always respond favorably to these requests, which is contrary to French law. Article 60-1 of the Code of Criminal Procedure requires data to be shared in response to judicial requests, under penalty of a fine of 3,750 euros.

Author: Margaux Vulliet
Source: BFM TV
