
“Our worst nightmares”: on the web, more and more images of child pornography generated by AI

Pedophiles are using artificial intelligence to insert real past child victims, or artificially “rejuvenated” celebrities, into scenes of sexual abuse, allowing them to generate such images endlessly, warns a British association.

“Our worst nightmares have come true.” In the darkest corners of the Internet, a growing number of people are using generative AI (programs similar to Midjourney) to create vast quantities of child sexual abuse images, according to an alarming new report from the Internet Watch Foundation (IWF). Some of these artificial images feature celebrities or real abuse victims.

The association, based in the United Kingdom, explains that it spent a month investigating a dark web forum, where it found and examined more than 11,000 AI-generated images. Almost 3,000 of them depict sexual abuse of minors and therefore breach British law, explains Susie Hargreaves, who heads the IWF.

These AI-generated images include fake sexual assaults on underage celebrities, BDSM scenes involving teenagers, and even rapes of babies and very young children.

Some AIs are also used to “rejuvenate” photographs of well-known celebrities (singers, actresses, and so on), which are then modified to place them in images depicting sexual assault.

The report also mentions AI being used to “undress” a person in a photograph, replacing their clothing with a fake naked body without their consent. In Spain, police have already opened an investigation after around a dozen young people aged 11 to 17 reported receiving photographs of themselves altered in this way.

Old victims, new scenarios

Most companies that make these AIs available to the general public, such as Midjourney or DALL-E 3, have implemented safeguards to limit the creation of illegal content. But these limits are not foolproof, and users can modify other, openly available AIs (such as Stable Diffusion) for illegal purposes.

Some users then fine-tune these AIs with photographs of real victims in order to generate new images in the same style. For example, researchers say they found a file containing 500 images of a real sexual abuse victim taken when she was around 9 years old, along with a custom AI model trained to create even more images of her.

On these forums, users exchange advice and resources for modifying these models, the IWF notes. Some have even started advertising their custom models, offering access through monthly paid subscriptions.

The IWF hopes to put the issue on the agenda of the global AI summit hosted by British Prime Minister Rishi Sunak on November 1-2 (which very few international leaders have so far agreed to attend). In France, the law is clear: viewing, possessing or transmitting a child pornography image is a crime regardless of its origin, whether it is a real photograph, a photomontage or an AI-generated image.

Author: Luc Chagnon
Source: BFM TV
