Mental health apps have ‘terrible’ privacy, report says

An analysis released this week by researchers at Mozilla carries an important warning about mental health and prayer apps. According to a guide that examined 32 apps of this kind, these programs have worse privacy protections than most software and can expose their users' sensitive data.

According to Jen Caltrider, lead of Mozilla's *Privacy Not Included* guide, the vast majority of mental health apps "track, share and capitalize on users' most intimate personal thoughts and feelings, such as mood, mental state and biometric data," the statement reads.

Especially popular during the Covid-19 pandemic, these apps were designed to interact with their users, but in gathering information they absorb a large amount of personal data under poor privacy protections, according to experts. Of the 32 apps analyzed, 29 received the "privacy not included" label, indicating poor security practices or weak password requirements.

Which mental health apps have weak privacy rules?

Source: Woebot Health/Disclosure.

Mozilla's expert analysis identified the apps with the worst practices as: Better Help, Youper, Woebot, Better Stop Suicide, Pray.com, and Talkspace. In the case of Woebot, a mental health chatbot specializing in cognitive behavioral therapy, information collected from users is shared for advertising purposes. Talkspace, a New York-based online therapy platform, collects transcripts of chats with users.

During the Covid-19 pandemic, particularly during periods of social isolation, mental health apps took on a leading role, given the difficulty of accessing traditional treatment. These apps offered quick and affordable assistance, but at a steep privacy cost, according to the report.

For Mozilla researcher Misha Rykov, these apps "operate as data-sucking machines with a mental health app veneer."
