A hallucination that became a source of inspiration. Soundslice is software for editing sheet music and making scores easier to read while practicing an instrument. In particular, it includes a feature that scans a photo of sheet music so that it appears in digital form and can benefit from the software's editing and playback tools.
But as 404 Media reports, the team behind Soundslice began receiving ASCII guitar tabs (guitar scores written as plain text). Users hoped to use the software to read these scores, even though Soundslice had never offered such a feature.
A ChatGPT lie
Puzzled by the situation, Soundslice investigated the origin of these unexpected submissions. They checked their error reports and discovered that users were sending excerpts of their conversations with ChatGPT.
The OpenAI chatbot was convinced that Soundslice could convert ASCII tabs into sheet music within the application. It therefore encouraged users for whom it had generated ASCII tabs to run them through Soundslice to read them. This phenomenon is commonly called an AI hallucination.
How did ChatGPT come by this false information? Impossible to know. But instead of ignoring the incessant conversion requests, Soundslice's developers saw an opportunity. "A few hours spent developing the feature," says Adrian Holovaty. It is now possible to read and convert ASCII tabs through Soundslice.
"Gaslight-driven development"
According to 404 Media, this is one of the first known cases of an AI hallucination leading to the development of a feature. In Soundslice's case, the lie drove the developers to make the feature real rather than disappoint their users.
Hence the term "gaslight-driven development": the AI is effectively "gaslighting" users, that is, making them believe that something has always been true when it never was.
For its part, ChatGPT itself has acknowledged hallucinating. When asked whether it is possible to import an ASCII tab into Soundslice, the chatbot responds:
"Why does this feature exist today? Surprisingly, it follows a ChatGPT 'hallucination'! ChatGPT wrongly suggested that Soundslice could read ASCII tabs, encouraging users to paste in their own text, which generated a flood of support tickets. Soundslice then decided to implement this feature to meet the demand," writes ChatGPT.
Source: BFM TV
