Crisis Text Line Received Flak Over Sharing Data With Loris.ai

30 Mar 2022

Data is a valuable asset, and sharing personal data without users' consent has become a pressing issue in recent times. A similar incident happened with one of the leading mental health helpline organisations, Crisis Text Line (CTL).

The data-sharing agreement between CTL and Loris.ai, an AI-based customer support firm, was called off on 31st January 2022. CTL received flak after Politico, a news organization, uncovered the agreement.

Following this, CTL immediately dropped its association with Loris.ai and asked the company to erase all the previously shared data.


History of Association Between the Two Companies

Crisis Text Line, a nonprofit organization based in the U.S., offers a free mental health messaging service via SMS. It is one of the most well-known mental health support lines in the world.

Loris.ai sought data from CTL to train its AI-based customer support systems. It bought the data to train its AI to converse with people in any kind of situation. People reaching out to a mental health helpline are at their most vulnerable and have all kinds of conversations; for this reason, the company deemed it fit to buy data from CTL.

Approximately 200 million messages sit at the heart of this AI, which the company analyzes to assess its systems' ability to handle the most difficult conversations.

CTL’s Stance on the Situation

Crisis Text Line is a technology-driven mental health helpline that combines big data and artificial intelligence to help individuals cope with traumas, including self-harm, emotional abuse, and suicidal ideation.

As per a spokesperson at CTL, "Any information shared with Loris.ai was kept anonymous." This means all the data shared with Loris.ai was cleansed of any information that could be used to identify the people who contacted the hotline in distress.

CTL claims that it asked people for consent before sharing the data. However, it is unclear how well informed the help seekers were about the kind of data that would be shared.

The hotline claims to be transparent with its users about data sharing, stating that its terms of service and privacy rules are sent "as an automatic auto-reply to every initial text message that Crisis Text Line receives, and to which all the texters consent."

The Bandwagon Effect

Other organizations, such as Shout 85258, a free text message mental health support service headquartered in the U.K., use CTL's technology.

The organization recently revealed that it had completed one million chats with those in need of assistance.

Shout stated that it supplied only fully anonymized data to CTL, and that its data, which is stored on secure servers in London, had not been shared with Loris.

According to Shout, the U.S. helpline is only allowed to use the data to create and enhance a "technology platform for operation of the Shout Service," which the charity licenses on a pro-bono basis.

For Crisis Text Line, a nonprofit with financial backing from some of Silicon Valley's biggest companies, possession of what it calls "the greatest mental health data set in the world" underscores new facets of the tech privacy battles raging in Washington.

Data Regulatory Concerns

A large number of people are turning to mental health devices and platforms for support. However, there is a shortage of professional healthcare representatives, and artificial intelligence (AI) is being used in an attempt to fill this gap.

The fact that such AI platforms need substantial amounts of data for training creates a conundrum: delivering healthcare services while balancing ethical concerns.

According to a market study by BIS Research, the mental health devices and platforms market was valued at $1,362.4 million in 2019 and is projected to grow at a CAGR of 24.69 percent from 2021 to 2030.

To request a sample of the mental health devices and platforms market click here.

Facebook and Google, for example, have generated large revenues through data sharing. However, sensitive information also sits in the hands of charitable organizations that are exempt from the federal regulations governing commercial firms, with little outside oversight of where that data ends up.

In conclusion, when seeking help on a mental health helpline, people commonly overlook the consent forms and agree to anything and everything that comes their way. At the same time, the efficiency of AI depends on large amounts of data, and so far there are no regulatory measures in place to protect individual privacy. Regulatory bodies, data providers, and AI companies are yet to find a middle ground.
