When a person is in need of a hotline, human connection is often the only thing that can help, especially if the person on the other end can relate to the situation. That is apparently the opposite of what the Eating Disorder Helpline wants for those in need of its assistance. The National Eating Disorders Association (NEDA) replaced its hotline workers with a chatbot four days after the workers unionized.
“NEDA claims this was a long-anticipated change and that AI can better serve those with eating disorders. But do not be fooled—this isn’t really about a chatbot. This is about union busting, plain and simple,” Abbie Harper, helpline associate and union member, said. According to Harper, the NEDA hotline workers were severely understaffed and overworked, hence the decision to unionize. The helpline consisted of roughly 200 volunteers at any given time, six full-time staffers, and a couple of supervisors. Not nearly enough people to handle the volume of a mental health hotline.
“We asked for adequate staffing and ongoing training to keep up with our changing and growing Helpline, and opportunities for promotion to grow within NEDA. We didn’t even ask for more money…When NEDA refused [to recognize our union], we filed for an election with the National Labor Relations Board and won on March 17. Then, four days after our election results were certified, all four of us were told we were being let go and replaced by a chatbot,” Harper continued.
A Robot Can’t Be a Human
Tessa the chatbot is described as a “wellness chatbot,” but it already comes with its share of problems. Its answers are predetermined, which means users who speak to the bot more than once may get the same canned responses. According to Motherboard, the bot didn’t even respond when reporters sent messages like “I’m feeling down” and “I hate my body,” phrases that would have a human volunteer immediately engaging. Harper sees it as an extreme disrespect to users.
“Some of us have personally recovered from eating disorders and bring that invaluable experience to our work,” Harper said. “All of us came to this job because of our passion for eating disorders and mental health advocacy and our desire to make a difference.”
There’s also something to be said about the dangers of a chatbot left to its own devices. Programs like ChatGPT have been known to completely derail depending on what users say to them. And while Tessa isn’t a ChatGPT model, it could run into the same problems: a bot misunderstanding a person’s needs and responding in exactly the wrong way. At the extremes, the results can be deadly. One horrific case, in particular, involved a Belgian man taking his own life after chatting with an AI bot called Eliza. We’re on Harper’s side for this one. Eliminating the human element of a mental health hotline speaks only to how little NEDA values the people who rely on it, and how little it values those willing to do good for others.