Feelings, sexuality, conflicts, digestive problems… there are things we would rather not know. Especially about our colleagues. One day, Louise* found herself plunged, despite herself, into the intimate life of her office neighbor. The culprit: a shared ChatGPT account. The employees of her company all log in with the same credentials, and everyone has access to everyone else’s requests.
When Louise spots a query titled “Message to a sexologist” in the history, she cannot help clicking. “I know I shouldn’t have done it, but I read it: my colleague explained that she was having difficulties in bed with her boyfriend, that he was a fetishist and that it worried her. She had asked ChatGPT to help her write an email to a sexologist,” she says.
“Don’t turn it into office gossip”
“I was very embarrassed; I deleted her request. I don’t understand how she could leave it there, because she knows the tool very well and she knows the history is visible,” explains Louise, who did not dare to raise it with the person concerned.
“The first reflex to have in this kind of situation is not to turn it into office news or gossip: delete the compromising request and go talk about it discreetly with the person concerned,” advises Agathe Lemaire, an employment lawyer.
“As a general rule, you should apply the precautionary principle and never make personal requests in a professional tool, whether by email or through the company’s AI, because anything you write there can be seen,” she insists.
“I’m swamped, and I see him talking about his personal life with ChatGPT”
As the days went by, Victor* also discovered a very personal side of one of his colleagues. The colleague tends to use the company’s artificial intelligence tool as a confidant rather than as a work tool, without deleting the history accessible to everyone. “I learned that he is getting married next year, but that he has doubts,” says Victor. “He confides every aspect of his relationship: their closeness, sexuality, money…”
His colleague keeps making these requests during working hours, all while claiming to be overwhelmed. “It’s really annoying: I’m swamped, and I see him talking about his personal life with ChatGPT,” sighs the employee of an audiovisual production company.
“You can no longer look at the person in the same way. At first it seemed funny, but very quickly a feeling of voyeurism sets in,” says the employee.
“I was third on the list of employees to be laid off”
Thomas*, an employee of a small company in the tech sector, had a slightly different experience. “One day, I came across a request from our director: he was asking ChatGPT how to lay people off ‘in a humane way’,” says the young man.
“He asked ChatGPT to rank us according to how much we cost the company, our productivity and, above all, how many sick days we had taken. I was third on the list,” recalls the employee.
The company was going under, and one day the boss took action. “He dismissed employees one after another, in the exact order recommended by ChatGPT,” Thomas recalls.
A “completely illegal” practice, according to Agathe Lemaire. “First, there is a GDPR issue around protecting personal data. You cannot draw up named lists of employees to dismiss, and even less so based on health criteria; that is clearly discriminatory,” she explains.
“It can also work against the employee in a more insidious way”
Information entrusted to an AI tool can clearly be turned against careless users who do not use the “ephemeral” mode, or quite simply their personal account.
“I have a colleague who had stomach aches and described all his digestive problems in the shared account… in the smallest detail,” says William*, an employee of a consulting firm with fewer than 100 employees. “There is another one who is looking for a new job live on ChatGPT, even though nobody is supposed to know he wants to resign. It’s a bit awkward.”
“You really have to be careful: the employer can already fault the employee for job hunting during office hours, and if there is a lawsuit, the employer can rely on those requests,” explains Agathe Lemaire.
The legal professional believes it is also up to companies to take hold of these issues. “The employer also has an obligation to inform: it must remind staff that the tool is not personal,” she explains. She therefore encourages companies to amend their internal charter accordingly.
* Names have been changed
Source: BFM TV
