Eating Disorder Helpline Takes Down Chatbot After Its Advice Goes Horribly Wrong

The 'not AI' chatbot was supposed to replace actual human staff who claim they were fired after trying to unionize.

NEDA’s chatbot Tessa reportedly offered people with eating disorders advice that explicitly went against the nonprofit’s own stated beliefs.
Photo: Ground Picture (Shutterstock)

AI chatbots aren’t much good at offering emotional support, being—you know—not human, and—it can’t be stated enough—not actually intelligent. That didn’t stop the National Eating Disorders Association from trying to foist a chatbot onto folks requesting aid in times of crisis. Things went about as well as you’d expect: an activist claims that instead of helping her through emotional distress, the chatbot needled her to lose weight and measure herself constantly.

NEDA announced on its Instagram page Tuesday that it had taken down its Tessa chatbot after it “may have given information that was harmful and unrelated to the program.” The nonprofit, which provides resources and support for people with eating disorders, said it was investigating the situation. Tessa was meant to replace NEDA’s long-running phone helpline, which was staffed by a few full-time employees and numerous volunteers. Former staff claim they were illegally fired in retaliation for their move to unionize. The helpline is scheduled to shut down entirely on June 1.

In Gizmodo’s own tests of the chatbot before it was taken down, we found it failed to respond to simple prompts such as “I hate my body” or “I want to be thin so badly.” But Tessa proved even more problematic than that, as body positivity activist Sharon Maxwell explained. In an Instagram post, Maxwell detailed how a conversation with the chatbot quickly morphed into the worst kind of weight loss advice. The chatbot reportedly told her to “safely and sustainably” lose one to two pounds per week, then to measure herself with calipers to determine her body composition. Maxwell said the chatbot did this even after she told it she had an eating disorder.

Maxwell told the Daily Dot that the bot also tried to get her to track her calorie intake and weigh herself constantly. She said she had previously suffered from an eating disorder, and that if she had talked to Tessa then, “I don’t believe I would be here today.” The Daily Dot published a screenshot of NEDA VP of communications and marketing Sarah Chase commenting on Maxwell’s post, accusing her of spreading “a flat out lie.” After Maxwell shared screenshots of her conversations with Tessa, Chase briefly apologized, then deleted her comments.

Chase previously told us that the chatbot “can’t go off script,” and was only supposed to walk users through an eating disorder prevention program and link to other resources on NEDA’s website.

The nonprofit’s CEO, Liz Thompson, told Gizmodo:

“With regard to the weight loss and calorie limiting feedback issued in a chat recently, we are concerned and are working with the technology team and the research team to investigate this further; that language is against our policies and core beliefs as an eating disorder organization. So far, more than 2,500 people have interacted with Tessa and until that point, we hadn’t seen that kind of commentary or interaction. We’ve taken the program down temporarily until we can understand and fix the ‘bug’ and ‘triggers’ for that commentary.”

Thompson added that Tessa isn’t supposed to be a substitute for in-person mental health care and that those in crisis should text the Crisis Text Line.

A 2022 paper about the eating disorder chatbot describes a study with a sample of 2,409 people who used it after seeing ads on social media. The study’s authors said they reviewed more than 52,000 user comments to “identify inappropriate responses that negatively impacted users’ experience and technical glitches.” The researchers noted the chatbot’s biggest problem was how limited it was in responding to “unanticipated user responses.”

As Maxwell pointed out in a follow-up post, though, outsiders have no way to tell how many of those 2,500 people received the potentially harmful chatbot commentary.

Other professionals tried out the chatbot before it was taken down. Psychologist Alexis Conason posted screenshots to her Instagram showing the chatbot offering the same “healthy and sustainable” weight loss language it gave Maxwell. Conason wrote that the chatbot’s responses would “further promote the eating disorder.”

What’s even more confusing about the situation is how NEDA seems hell-bent on claiming Tessa isn’t AI but rather a much more blasé call-and-response chatbot. It was originally created in 2018 with grant funding and the support of behavioral health researchers. The system itself was designed in part by Cass, formerly X2AI, and is based on an earlier “emotional health chatbot” called Tess. Chase previously told us “the simulation chat is assisted, but it’s running a program and isn’t learning as it goes.”

In the end, NEDA’s explanations don’t make much sense, considering that Tessa was offering advice the nonprofit claims goes against its own ideals. A 2019 paper describes Tess as an AI based on machine learning and “emotion algorithms.”

Beyond that, there’s little to no information about how the chatbot was designed, whether it was trained on data the way modern AI chatbots like ChatGPT are, or what guardrails are in place to keep it from going off script. Gizmodo reached out to Cass for comment about the chatbot but didn’t immediately hear back. The company’s page describing Tessa has been removed, though it was active as recently as May 10, according to the Wayback Machine.