Artificial intelligence tools are multiplying child sexual abuse material, especially online, trivializing sexual violence against children and increasing the risk that consumers will act on it, the Children’s Foundation warns in a report published this Tuesday, October 29.
Videos of virtual children being raped, or images of naked teenagers in which only the face is real: generative artificial intelligence makes it possible to create such content “infinitely,” stored on computers or shared online, the Children’s Foundation explains in a summary of the report.
Easy access to child sexual abuse material also increases the risk of acting on it. “When an offense was committed, there had very often been prior viewing” of this type of content, according to the foundation’s head of advocacy.
Amending the Penal Code
Faced with this situation, the Children’s Foundation is calling on political, legal and technological actors to mount a “strong, rapid and coordinated response.”
The main “difficulty” in combating these practices is that artificial intelligence is “constantly evolving,” and people “who consume criminal content involving children, being very aware of new technologies, are quick to take advantage of it,” explains the head of advocacy.
The Children’s Foundation recommends, first, raising awareness among the general public.
The foundation also calls on public authorities to amend the Penal Code to criminalize the creation of sexual montages depicting minors. Currently, the “legal vacuum” on this issue “allows the practice to intensify.”
Another consequence of this proliferation of child sexual abuse images and videos: it becomes harder for law enforcement to identify and protect the real children who are victims of sexual violence and who appear in this content.
To address this, the foundation recommends encouraging private-sector actors to cooperate in developing tools that can distinguish AI-generated content from real content.
Source: BFM TV
