
Moderation: Meta's Oversight Board wants to end the free pass for public figures

On Facebook and Instagram, posts by celebrities slip past moderation. The culprit is a special program that Meta's Oversight Board wants to see overhauled.

Meta's Oversight Board criticized the social media giant's platforms on Tuesday (December 6) for giving preferential treatment to problematic content posted by politicians, executives, celebrities and other public figures.

In its report, it calls for a "significant review" of the so-called "cross-check" double-review program, to make it more transparent, more responsive and fairer.

A two-speed system

Currently, when posts or images that potentially violate Facebook or Instagram policies are flagged, they are removed immediately if they are deemed high-risk and come from ordinary users.

But if the author is "whitelisted", the content remains online while it is examined more closely, a process that usually takes several days and sometimes several months.

This "inequitable" two-speed system has therefore "offered additional protections to the expression of certain users, selected in part according to Meta's economic interests," the report details.

Because of "cross-check," "content identified as violating Meta's rules remains visible on Facebook and Instagram while it spreads virally and could cause harm," the Oversight Board warns.

Publicly identify the accounts concerned

It recommends speeding up the secondary review of content from public figures who may be posting important human rights messages, and taking high-risk content down while the internal verdict is pending.

It also asks the company to publish the eligibility criteria for the program and to publicly identify, on the platforms, the accounts of the users concerned.

The body is made up of 20 international members: journalists, lawyers, human rights defenders and former political leaders. It was created in 2020 at the initiative of CEO Mark Zuckerberg and is tasked with evaluating the content moderation policies of the Californian group.

The members launched a review of "cross-check" in October 2021, following revelations from whistleblower Frances Haugen, who leaked internal documents to the press and accused the company of putting "profit over the safety" of its users.

Meta's responses to the inquiry were "sometimes insufficient," and "its own understanding of the practical implications of the program leaves much to be desired," the board said.

Author: PM with AFP
Source: BFM TV

