“If I open Instagram and scroll for five seconds, there’s a 50/50 chance I’ll come across a gory video from a meme account,” said Jackson Weimer, a New York-based creator of memes (viral comedy videos). Since last year, he has noticed a proliferation of extremely violent content on Instagram. According to the Washington Post, the surge of increasingly explicit images coincides with the growing prominence of Reels on the platform.
The US outlet cites in particular videos showing a woman about to be beheaded, a man being tortured, a sexual assault, and particularly shocking scenes of animal abuse. Footage of a woman being burned alive was shared by an account with more than 560,000 followers.
The Washington Post also mentions a video, again in the Reels format (created to compete with TikTok), of a child with a gunshot wound to the head, viewed 83,000 times, with stunned users leaving testimonials in the comments.
A young and male audience
While the Meta group (Instagram’s owner) is well aware of the problem, it maintains that these videos represent only a small fraction of the content on the platform. According to its latest community standards enforcement report, such content amounts to only about three videos per 10,000 published.
However, some shocking scenes are viewed tens, even hundreds, of thousands of times on Instagram. Images of a bloody pig thrown into a meat grinder, for example, racked up more than 223,000 views.
In most cases, it is meme accounts that feed the algorithms with this new type of content. These accounts draw large audiences and can accumulate more than a million followers. The problem is that their followers are mostly young teenagers: according to a study by YPulse, 43% of 13-to-17-year-olds follow at least one meme account.
Promoting OnlyFans models
Publishing these Reels generates substantial engagement. Even a negative comment counts as an interaction, which makes the social network’s algorithm more likely to recommend the video, including to users who do not follow the posting account.
With such statistics, meme accounts can sell sponsored content and thus boost their revenue. The Washington Post points out that paid posts are mainly used to promote accounts on OnlyFans, a platform where pornographic (and paid) content is allowed. And the higher the engagement, the higher the prices.
Specialized agencies handle buying posts from these influential accounts. But “they buy so many ads from meme accounts that they don’t review everything,” said influencer marketing consultant Sam Betesh.
This is how some OnlyFans models end up appearing between a video of a woman beaten to death and burned alive and another showing people being hit by a car or a train.
The Instagram ecosystem under fire
Nick Almonte, who runs an OnlyFans account management agency, receives client complaints “every week.” These clients do not want to be associated with accounts that spread violent images.
Sarah Roberts, an assistant professor at the University of California who specializes in social networks and content moderation, is calling on Meta to act: “Of course, the meme accounts are to blame, but what is fundamentally responsible is the ecosystem that provides fertile ground for these metrics to have intrinsic economic value.”
Despite the Washington Post’s findings, a Meta spokesperson responded that this type of content is not eligible for recommendation and that videos violating the rules are removed. He also said the group is working to better prevent bad actors from using new tactics to avoid detection and circumvent the app’s enforcement systems.
Source: BFM TV
