With the rise of artificial intelligence, concern among employees about keeping their jobs is spreading. The decision by the National Eating Disorders Association (NEDA) is unlikely to allay those fears. Indeed, the leaders of the American association dedicated to eating disorders have decided to replace their helpline advisors with Tessa, a chatbot.
According to Abbie Harper, one of the employees let go by NEDA, the decision came after four of the association’s employees moved to form a union. Until now, six employees, supported by around 200 volunteers, made up the team responsible for answering people by message or phone. According to Harper, the replacement was announced only four days after the union was formed.
A “wellness” chatbot
Harper describes a months-long battle with the leadership to obtain better working conditions for everyone. Several hundred thousand people have used NEDA’s services since the helpline was launched 20 years ago, a substantial workload that, in her view, warranted more human resources, not fewer.
Instead, the leadership opted for a technological solution: Tessa, a chatbot that uses artificial intelligence. The tool is not a clone of ChatGPT, however. Created by the Washington University School of Medicine, it is closer to a “wellness chatbot,” designed only to answer questions about body image issues. Tessa’s responses are therefore not open-ended; it can only reply using specific, predefined therapeutic methods.
A fight in progress
According to a NEDA spokesperson, Tessa had previously been tested on 700 women between 2021 and 2023, and more than half of them gave it a 100% rating. The association’s volunteers will also be invited to act as Tessa’s testers in the future, rather than continuing their support work.
The dismissed employees have decided to continue fighting to get their jobs back by filing a complaint with the National Labor Relations Board, the federal agency responsible for investigating unfair labor practices.
This is not the first time a controversy has arisen over the use of a chatbot for medical purposes. At the end of 2022, Koko, a platform that helps people with mental health problems, decided to replace its volunteers with ChatGPT, without notifying the people it serves. The experiment was shut down after numerous negative reactions from internet users.
Source: BFM TV
