Mozilla: Lack of Security Protections in Mental-Health Apps Is ‘Creepy’

Popular apps to support people’s psychological and spiritual well-being can harm them by sharing their personal and sensitive data with third parties, among other privacy offenses.

Despite their good intentions of fostering mental health and spiritual wellness, the majority of mental-health and prayer apps can harm their users by exposing personal and intimate data, owing to a severe lack of security and privacy protections, researchers from Mozilla have found.

Of 32 mental-health and prayer mobile apps investigated by the open-source organization, 28 were found to be inherently insecure and slapped with a “Privacy Not Included” label, according to a report of the same name published online this week. Moreover, 25 apps failed to meet Mozilla’s Minimum Security Standards, such as requiring strong passwords and managing security updates and vulnerabilities, researchers said.

The apps in question deal with some of the most sensitive mental-health and wellness issues people experience—such as depression, anxiety, suicidal thoughts, domestic violence and eating disorders. The apps, researchers found, appear to be among the most insensitive when it comes to protecting that intimate data.
Mozilla’s Jen Caltrider, the lead researcher for the report, went so far as to call the majority of mental-health and prayer apps “exceptionally creepy” in a blog post about the study.

“They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data,” she said. “Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”

Overall, Mozilla researchers spent 255 hours, or about eight hours per product, examining the security and privacy practices of a variety of mental-health and prayer apps.

The apps that they investigated have functionality such as connecting users with therapists and offering AI chat bots, community support pages, and prayers. They also offer mood journals and well-being assessments, among other features that require collecting sensitive data about users.

Some of the offensive behaviors of the apps include sharing users’ intimate data, allowing weak passwords, targeting vulnerable users with personalized ads, and featuring vague and poorly written privacy policies, according to the post.

For example, at least eight of the apps reviewed allowed weak passwords ranging from “1” to “11111111,” while one, a mental-health app called Moodfit, required only a single letter or digit as a password, “which is concerning for an app that collects mood and symptom data,” researchers noted in the post.

“Despite dealing with incredibly sensitive information, some apps’ security practices are akin to a flimsy lock on a diary,” they said.

Worst Privacy Offenders

Among the apps investigated, six were designated with the dubious distinction of being the “worst offenders” of user privacy: Better Help, Youper, Woebot, Better Stop Suicide, Pray.com and Talkspace.

Two of those apps—Better Help, a popular app that connects users with therapists, and Better Stop Suicide, a suicide-prevention app—have “vague and messy” privacy policies that provide little to no detail about how the apps protect user data and what users can do in case they have concerns, researchers reported.

Three others—Youper, a digital mental-health service for treating anxiety and depression; Pray.com, which encourages a daily prayer practice; and Woebot, an AI chat bot to foster better mental health—go even further by sharing personal information from the apps with third parties.

Woebot, for example, collects personal info such as a user’s name, email, phone number and IP address, as well as all of the sensitive info users share in conversations with the bot. It also obtains user info “from other sources, including through third-party services and organizations to supplement information provided by you,” according to its privacy-policy notes.

“So Woebot can collect a good deal of personal information, [and] add to the information you give them with even more information gathered from third parties,” researchers noted in the report. “Then they say they can share some of this information with third parties, including insurance companies and a seemingly broad category they call ‘external advisors.'”

The other top offender, Talkspace, an online therapy app with celebrity sponsors such as champion swimmer Michael Phelps and musical artist Demi Lovato, collects a significant amount of personal information on users, including name, email, address, phone number, gender, relationship status, employer, geolocation information, chat transcripts and more.

Talkspace even goes so far as to ask for users’ written permission to use their health info and therapy notes for marketing purposes, which Mozilla researchers said is “bad form” for any app, especially one dedicated to mental health.
