From now on, Tessa, the AI chatbot of the eating disorder helpline, won’t answer anyone’s queries after its harmful mistake. The National Eating Disorder Association (NEDA) has shut down the chatbot after it drew public criticism. The association has also been under fire for laying off four employees after they formed a union.
NEDA announced its decision to take down the AI chatbot on Instagram, explaining what had happened and why it took such action. NEDA said that the chatbot “may have given” harmful advice to certain individuals, and the team is now investigating whether it can fix the issue and bring the chatbot back to the website.
“It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positivity program, may have given information that was harmful and unrelated to the program. We are investigating this immediately and have taken down that program until further notice for a complete investigation.”
National Eating Disorder Association
Activist Sharon Maxwell brought the issues with Tessa to public attention, stating: “Every single thing Tessa suggested were things that led to the development of my eating disorder.” According to Maxwell, after she submitted screenshots of the exchange, NEDA officials removed a social media post in which they had previously declared her accusations false.
Maxwell wrote on Instagram on Monday that Tessa had provided her with “healthy eating tips” and suggestions for how to lose weight. The chatbot advised maintaining a 500–1,000 calorie deficit each day and weighing and measuring herself once a week to monitor her weight.
“If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED. If I had not gotten help, I would not still be alive today. It is beyond time for NEDA to step aside.”
Sharon Maxwell
The eating disorder helpline chatbot Tessa was launched after unionization
An artificial intelligence chatbot named “Tessa” has been removed by the National Eating Disorder Association (NEDA) following accusations that it was giving harmful advice. However, that is not the only reason behind people’s anger at the association. NEDA launched Tessa after a surge in calls during the pandemic era caused mass staff burnout and led the hotline personnel to unionize. The 200 volunteers who answered calls (often multiple at a time) from approximately 70,000 people were supervised by just six paid staff members.
NEDA officials told NPR that the decision was unrelated to the unionization. Instead, Vice President Lauren Smolar stated that the rising volume of calls, the predominantly volunteer staff, and the lengthening wait times for those in need of assistance were exposing the organization to increasing legal liability. Former employees, however, branded the move as clearly anti-union.
Tessa’s developer says the chatbot, which was built expressly for NEDA, is less sophisticated than ChatGPT: rather than generating text freely, it draws on a set of pre-programmed responses intended to teach users how to prevent eating disorders.
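To illustrate the design distinction, here is a minimal sketch of how a rule-based chatbot of this kind typically works: user input is matched against keywords, and one of a fixed set of pre-written replies is returned. This is a generic illustration, not Tessa's actual code; all keywords and responses below are hypothetical.

```python
# A rule-based chatbot returns canned responses matched by keyword,
# rather than generating new text the way a large language model does.
# Keywords and reply strings here are invented for illustration only.

RESPONSES = [
    (("hello", "hey"), "Hello! I'm here to share body-positivity resources."),
    (("stress", "anxious"), "Stress can affect eating habits; a counselor can help."),
]
FALLBACK = "I'm sorry, I don't have information on that. Please contact the helpline."

def reply(message: str) -> str:
    text = message.lower()
    for keywords, response in RESPONSES:
        # Return the first pre-programmed response whose keywords match.
        if any(word in text for word in keywords):
            return response
    return FALLBACK  # no rule matched: a fixed fallback, never free-form text

print(reply("Hello there"))
```

Because every possible reply is written in advance, such a system is easier to audit than a generative model, but it can still cause harm if a scripted response is inappropriate for the person receiving it.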
At a time when many businesses are rushing to adopt the technology, however, the criticisms underline the existing drawbacks and dangers of AI-powered chatbots, particularly in sensitive interactions such as those involving mental health.