Mental health apps are not good for you
Is it a breach of confidentiality when the fine print says it’s okay? (Image: Dreamstime)
In the world of mental health apps, privacy scandals are seemingly the norm. Every few months, research uncovers unscrupulous data-sharing practices in apps like Crisis Text Line, Talkspace, BetterHelp, and others. Users shared information with those apps in hopes of feeling better; instead, their data was used in ways that help the companies make money.
When the investigations drew attention, the apps often adjusted their policies. Researchers have found that mental health apps have some of the worst privacy protections of any app category.
But users rarely read privacy statements before ticking the accept box. Even those who do often find the agreements complex and hard to comprehend; the implications are impossible to grasp at a glance.
“They operate like data-sucking machines with a mental health app veneer,” said Mozilla researcher Misha Rykov in a statement. “In other words: A wolf in sheep’s clothing.”
In the latest edition of Mozilla's Privacy Not Included guide, researchers analyzed 32 mental health and prayer apps. Of those, 29 received a "privacy not included" warning label, indicating concerns about how they manage user data.
It's ironic that these apps are designed to address sensitive issues such as mental health, yet they collect large amounts of personal data under vague privacy policies. Most apps also have poor security practices, letting users create accounts with weak passwords even though those accounts hold deeply personal information.
The research singled out the apps with the worst practices: BetterHelp, Youper, Woebot, Better Stop Suicide, Pray.com, and Talkspace. For instance, the AI chatbot Woebot says it collects information about users from third parties and shares user information for advertising purposes.
BetterHelp, one of the most prominent "therapy-on-demand" apps to launch in recent years, said its methods were standard and that they "typically far exceed all applicable regulatory, ethical and legal requirements."
It's true: there are no laws against a therapy app telling Facebook every time a person talks to their therapist, or against sharing patients' feelings about suicide with an analytics company that helps clients measure how "addicted" users are to an app.
Unless people who use and trust BetterHelp read and analyze the fine print, they will have no idea how widely their intimate information is shared, all in service of making companies bigger and richer.
“The vast majority of mental health and prayer apps are exceptionally creepy,” revealed Jen Caltrider, the lead researcher of Mozilla’s guide, in a statement. “They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data.”
Of all the information we share with the technology companies that dominate our lives, health data, including mental health data, is among the most valuable and controversial. And although social media conditions people to share every aspect of their lives at every moment, an app automatically telling Snapchat and Facebook that someone has signed up for therapy is plain creepy, even if it's covered in the fine print.
Again, it raises questions about how someone's intimate, supposedly private sessions can be exploited by advertisers, who aren't always known for operating in good faith.