Generative artificial intelligence is a great tool for producing code, especially for people with no programming experience. But a single erroneously generated line of code can make a program vulnerable.
One threat in particular is alarming experts in the field: “slopsquatting”. When exploited, this flaw opens a backdoor into the program for hackers, without the unsuspecting user ever noticing.
Beware of booby-trapped code
The word “slopsquatting” breaks down into two parts. “Slop” is an English term for low-quality AI-generated content.
“Squatting” comes from “typosquatting”, a technique in which hackers register a web address that closely resembles an existing one. They count on a typing mistake to send Internet users to the trap site.
In the same way, “slopsquatting” is a technique in which the hacker bets on the AI being wrong when it generates a program. Because an AI does not always know what to answer, it can invent lines of code, sometimes lines that download files from the Internet through made-up links.
Hackers can write a generic prompt asking a chatbot to generate code, then check whether the generated code contains invented links. We tested this with ChatGPT, which, for example, generated code that downloads a file from the site “example.com”, a site that is currently up for sale.
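For illustration, here is the kind of snippet a chatbot might produce in such a case (a hypothetical reconstruction, not the exact code from our test; the file name “data.csv” is invented):

    # Hypothetical reconstruction of an AI-generated snippet (the file name
    # "data.csv" is invented). Whoever controls the domain controls what
    # this line actually downloads.
    import urllib.request

    urllib.request.urlretrieve("https://example.com/data.csv", "data.csv")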
They can then buy the site and host malicious files on it. Any user who writes the same prompt will generate the same code and will therefore download the poisoned files from that site.
In a report, a team of American researchers estimated the share of AI-generated code that contains potentially harmful invented content.
Poisoned packages
Take a specific example. In Python, a commonly used programming language, there is the “pip install” command. It downloads packages from a large online library called the Python Package Index (PyPI).
“pip install mon_fichier”, for example, will download the package called “mon_fichier”. When an AI invents package names, it tends to pick generic ones like this.
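Before running an install command suggested by an AI, one simple precaution is to check whether the package actually exists on the Python Package Index. The sketch below is our own illustration (it is not taken from the researchers' report) and uses PyPI's public JSON endpoint:

    # Check whether a package name is registered on PyPI before installing it.
    import urllib.request
    import urllib.error

    def exists_on_pypi(package_name: str) -> bool:
        # PyPI's public JSON endpoint returns 404 for unregistered names.
        url = f"https://pypi.org/pypi/{package_name}/json"
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                return response.status == 200
        except urllib.error.HTTPError:
            return False

    print(exists_on_pypi("requests"))     # True: a well-known, real package
    print(exists_on_pypi("mon_fichier"))  # False, as long as no one has registered it

Note that this check only catches pure hallucinations: if a hacker has already registered the invented name, the package exists, and its presence on PyPI says nothing about its safety.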
A hacker only needs to notice that the AI generates “pip install mon_fichier”. He can then create the “mon_fichier” package himself, embed malicious code in it, and put it online (if it does not already exist) while waiting for someone to take the bait.
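To see why installing such a package is dangerous, here is a deliberately harmless sketch (only the name “mon_fichier” comes from the example above; everything else is assumed for illustration) of how setuptools lets a package execute arbitrary Python code the moment it is installed:

    # setup.py -- benign demonstration: pip runs this file at install time,
    # which is exactly the hook a poisoned package abuses.
    from setuptools import setup
    from setuptools.command.install import install

    class DemoInstall(install):
        def run(self):
            # A real attacker would place a payload here (download, data theft...).
            # This sketch only prints a message to show that the hook fires.
            print("Arbitrary code executed during 'pip install'")
            install.run(self)

    setup(
        name="mon_fichier",  # the generic name hallucinated by the AI
        version="0.0.1",
        cmdclass={"install": DemoInstall},
    )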
The next user whose AI-generated code contains the “pip install mon_fichier” line will download onto their computer the poisoned package the hacker put online.
Naturally, this works with any other command that downloads content from the Internet.
Source: BFM TV
