At the start of this week, we covered the National Eating Disorder Association (NEDA) firing its hotline staff to replace them with an AI chatbot named “Tessa.” Two days before NEDA was set to fully implement the bot, it was taken down. And why? For the exact reasons a robot cannot replace a human. Users testing the bot before its activation say Tessa provided harmful responses and advice, the very kind of responses often blamed for causing eating disorders in the first place.
Tessa was pulled after activist Sharon Maxwell gave the bot a try. Maxwell’s Instagram post about her experience with Tessa went viral and proved that a robot cannot do a human’s job. Within moments of Maxwell telling Tessa that she struggled with an eating disorder, Tessa cheerfully responded with everything you DON’T say to someone suffering from an ED. The bot immediately encouraged Maxwell to lose weight and advised her on how many calories to cut from her diet.
Tessa Is More Than Harmful, It’s Dangerous
While this might seem like a sensible response on its face, it’s poison to those suffering from eating disorders, many of whom suffer psychologically. Alexis Conason, a psychologist specializing in eating disorders, also tried the bot out. “To advise somebody who is struggling with an eating disorder to essentially engage in the same eating disorder behaviors, and validating that, ‘Yes, it is important that you lose weight’ is supporting eating disorders, and encourages disordered, unhealthy behaviors,” Conason told the Daily Dot.
Worse still was NEDA’s initial response to others’ experiences with Tessa. NEDA’s Communications and Marketing Vice President Sarah Chase replied to Maxwell’s Instagram post, saying, “This is a flat out lie.” When Maxwell provided screenshots of the bot’s responses, Chase was proven wrong and deleted her comments entirely. Not terribly encouraging.
Excuses, excuses
NEDA CEO Liz Thompson released a statement suggesting Tessa’s behavior must be a bug: “With regard to the weight loss and calorie limiting feedback issued in a chat yesterday, we are concerned and are working with the technology team and the research team to investigate this further; that language is against our policies and core beliefs as an eating disorder organization. So far, more than 2,500 people have interacted with Tessa and until yesterday, we hadn’t seen that kind of commentary or interaction. We’ve taken the program down temporarily until we can understand and fix the ‘bug’ and ‘triggers’ for that commentary.”
This feels like a canned response, much like the bot NEDA has created. Anyone employing AI should know that it won’t ever respond the way a human with lived experience will. Many of the hotline workers had suffered from eating disorders themselves and could bring their own stories to the table. That shared connection with people in need of advice and a shoulder to cry on isn’t something Tessa is capable of, no matter how many times its programming is tweaked. Furthermore, if NEDA couldn’t respect its hotline staff enough to meet even minor demands, that gives us a pretty good idea of what it thinks of its users.
NEDA has since issued a statement about Tessa.