Crisis Text Line has ended its controversial relationship with Loris.ai, a for-profit sister company that promises to help companies “handle their hard customer conversations with empathy.”
The decision followed outrage over a Politico story detailing how Loris leveraged Crisis Text Line insights and anonymized user data to sell customer service optimization software to companies, a use that struck some as unseemly given Crisis Text Line’s mission to prevent suicide and help people in distress.
The article prompted Brendan Carr, commissioner of the Federal Communications Commission, to request that Crisis Text Line and Loris “cease their practice of sharing — and monetizing — the data they obtain from confidential text messages of people reaching out for mental health counseling and crisis interventions.”
Without mentioning the FCC’s request, Crisis Text Line acknowledged in a statement that it understood users didn’t want their information shared with Loris.
“We heard your feedback that it should be clear and easy for anyone in crisis to understand what they are consenting to when they reach out for help,” the nonprofit said.
When Loris launched in 2018, its founder and then-CEO Nancy Lublin told Mashable that the company would use Crisis Text Line’s proprietary software to create an employee training program focused on the skill of having challenging conversations. It’s unclear whether Lublin knew at the time but didn’t explicitly state that Loris would have access to anonymized Crisis Text Line user data, or whether the company’s practices changed after its launch.
At the time, Crisis Text Line had received more than 60 million messages from people in emotional or psychological distress. Loris would draw heavily on insights generated by the service’s use of artificial intelligence to analyze the best practices and strategies for deescalation. That included information about “magic words” that calmed an agitated texter as well as effective ways to pose questions, using “how” or “when” to spark an open-ended discussion.
During her 2018 interview with Mashable, Lublin suggested that Loris’ uses could extend to customer service interactions, noting that it might help a representative express more empathy for someone who couldn’t pay their bill. Yet Lublin’s description of Loris focused primarily on its potential to help workplaces become fairer and more equitable by teaching elusive conversational skills.
“When people avoid hard conversations, think about who loses,” she said. “It’s super important to us that people learn how to have hard conversations so women, people of color, and people who are marginalized can have a seat at the table.”
The concept seemed like a natural extension of Crisis Text Line’s work, but Politico reported last week that Loris ultimately ventured deep into commercializing the service’s insights and data to develop software for optimizing the customer service experience.
Loris currently bills itself as “conversational AI for customer-first teams” while offering “scale without sacrifice” and “insights to boost productivity and improve customer conversations.” One testimonial from the meal delivery company Freshly described Loris as making its “support team more productive and empathetic” and noted that its agents were able to complete more tickets and handle more chats at the same time.
Crisis Text Line CEO Dena Trujillo originally stood by the company’s partnership with Loris, emphasizing that the nonprofit didn’t share personally identifying information with it or any other partner.
“Our vision is to build a more empathetic world where no one feels alone,” Trujillo said in a statement issued Friday. “This applies to everyone, not just for the people who text us.”
Critics, however, found the partnership alarming, particularly because Loris could access anonymized exchanges conducted by Crisis Text Line.
Stacey Freedenthal, Ph.D., an associate professor of social work at the University of Denver and a licensed clinical social worker who treats suicide loss survivors, called the arrangement “concerning.”
“Unless the Crisis Text Line starts all conversations w/ ‘YOU CONSENT TO THIS CHAT BEING ANONYMIZED & SOLD FOR PROFIT,’ it’s wrong, IMO,” she wrote on Twitter.
“These are people at their worst moments,” Jennifer King, privacy and data policy fellow at Stanford University, told Politico. “Using that data to help other people is one thing, but commercializing it just seems like a real ethical line for a nonprofit to cross.”
This is not Crisis Text Line’s first brush with controversy. In summer 2020, Lublin resigned as CEO following allegations of abuse and racial discrimination, and she stepped down from Loris as well. Since then, Crisis Text Line hasn’t shared any user data with Loris. The nonprofit said in its most recent statement that it has requested Loris delete all the data it previously received.
If you want to talk to someone or are experiencing suicidal thoughts, Crisis Text Line provides free, confidential support 24/7. Text CRISIS to 741741 to be connected to a crisis counselor. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email [email protected]. You can also call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources.
Source: Crisis Text Line ends controversial relationship with for-profit AI company