An investigation by the Wall Street Journal, conducted in collaboration with researchers from Stanford University, reveals that, despite the company's efforts, Instagram and its recommender systems connect pedophilic users with one another and suggest the accounts of sellers of child pornography.
The researchers also discovered that Instagram allows its users to search explicit hashtags to find accounts that sell child pornography.
Instagram accounts that offer such content usually don’t post it openly, but rather show what they offer in their description. These accounts often claim to be run by the children themselves and use overtly sexual handles.
Accounts suggested by the algorithm
Alex Stamos, director of the Stanford Internet Observatory and head of security at Meta until 2018, commented on the findings to the Wall Street Journal.
The researchers set up several test accounts. Viewing a single such account was enough to start receiving recommendations for sellers and buyers of child sexual abuse content, and following just a few of those algorithmic recommendations was enough to flood a test account with this type of material.
During their tests, the researchers also found that Instagram allows users to search for terms associated with illegal content, even though the platform is supposed to block such searches. Only a warning message appears: “These results may contain images of child sexual abuse.”
Instagram acknowledges mistakes
Speaking to US media, Meta (Instagram’s parent company) acknowledged errors and said it has created an internal task force to fix the issues raised.
Meta also says it has taken down 27 pedophile rings in the last two years and claims to have blocked thousands of illegal hashtags since the investigation was published. The company says it has also stopped its systems from recommending content and accounts associated with child pornography.
By law, the company must report any child sexual abuse images shared on its platforms to the National Center for Missing & Exploited Children (NCMEC). From January to September 2022, Instagram reported 6.1 million pieces of content involving child nudity, physical abuse, and sexual exploitation.
Even when Instagram removes accounts that sell child sexual abuse content, the sellers don’t always go away. According to the platform’s internal guidelines, penalties for violating its Community Standards are generally imposed on accounts rather than on users or devices, so a banned seller can simply open a new account.
At the end of April, another investigation, by The Guardian, revealed the scope of child sex trafficking on Meta’s networks. It showed how Facebook and Instagram are failing in the fight against child sex trafficking on their platforms and are not reporting cases to the authorities as they should.
Source: BFM TV

