The popularity of online therapy services such as BetterHelp and Talkspace has skyrocketed since the pandemic, as the stigmas associated with mental health fade and individuals feel empowered to take charge of their mental well-being.
Although many people swear by mental health apps and have found the right kind of support, others have criticized them for providing poor services and controversial responses to people’s tragedies. Meanwhile, many services haven’t been able to escape lingering concerns about how they approach data sharing and the security of user information.
Mental health is among the most sensitive types of personal data. But how concerned should you be about the risk of this information being exposed, and what can you do to protect yourself?
Your information isn’t protected
Most people understand the basics of HIPAA (Health Insurance Portability and Accountability Act). When you see a doctor or mental health professional, your personal and medical information is to be kept private.
You can feel confident that the professional you’re working with isn’t going to sell your information to marketers or share it with another staff member without your consent.
The main purposes of HIPAA include:
- Ensuring health insurance portability
- Reducing health care fraud
- Enforcing standards for private information
- Guaranteeing privacy for patient information
Unfortunately, most mental health apps aren’t bound by the HIPAA Privacy Rule. They’re not considered “covered entities” under the law, which means that they don’t necessarily have to follow the rules when it comes to sharing your private information.
That should be concerning for everyone, but it’s especially damaging for people in underserved communities or those who don’t have easy access to in-person mental health care. For example, people of color often face financial and systemic barriers to receiving mental health care.
Mental wellness apps can be a big benefit to them and others who might be struggling. However, with so many risks involved, it’s essential to know what you might be getting into when you make an account or open up to a professional through your phone.
Targeting the vulnerable
Research has suggested that the top mental health apps are designed to capture and sell people’s sensitive health information. This is compounded by the fact that they are targeting people at their most vulnerable.
These apps are sometimes referred to as “data-sucking machines,” getting information from users in a variety of ways, including chatbots. Chatbots are becoming more popular thanks to their convenience. However, when this type of AI is used to collect user data and expose it to sponsors or companies wanting to advertise, it’s an abuse of power.
One report on several top mental health apps found that many of them repeatedly shared user data, allowed weak passwords, and relied on vague privacy policies to cover themselves when they shared that data.
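To make the "weak passwords" finding concrete, here is a minimal, illustrative sketch of the kind of password-strength check a privacy-conscious app could enforce at signup. The specific thresholds (12-character minimum, three of four character classes) are assumptions for the example, not requirements taken from any report; real services should also check passwords against breached-password lists and rate-limit login attempts.

```python
import re

def is_weak_password(password: str, min_length: int = 12) -> bool:
    """Return True if the password fails basic strength checks.

    Illustrative policy only: minimum length plus at least three of
    four character classes (lowercase, uppercase, digit, symbol).
    """
    if len(password) < min_length:
        return True
    classes = [
        bool(re.search(r"[a-z]", password)),       # lowercase letter
        bool(re.search(r"[A-Z]", password)),       # uppercase letter
        bool(re.search(r"[0-9]", password)),       # digit
        bool(re.search(r"[^a-zA-Z0-9]", password)) # symbol
    ]
    return sum(classes) < 3

print(is_weak_password("therapy1"))         # too short -> True
print(is_weak_password("Tr0ub4dor&Gale9"))  # long, mixed classes -> False
```

An app that skips even this kind of baseline check leaves accounts, and the sensitive conversations behind them, easier to compromise.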
A 2020 report by Jezebel revealed that BetterHelp shares messaging metadata with Facebook, meaning Facebook can see what times you message your therapist, how much time you spend in the app, and where you send your messages from. This data is then shared with advertisers, and the long chain of data-sharing leaves you more vulnerable to hacks and breaches.
Simply put, it’s a security risk every time you use one of these apps. Because most people use these services to connect with someone and find help, the whole point is to be personal and open up. However, that openness can end up doing more harm than good when it comes to your private information.
When your mental health issues can be monetized to further advertisers’ interests, it becomes clear that the developers of these apps see you as the product, not the customer.
What can we learn?
BetterHelp and other mental health apps are wonderful – in theory. They provide mental health services to people across the globe, especially those who wouldn’t be able to receive them elsewhere, and help fight the mental health epidemic we’re currently facing.
However, IT professionals and cybersecurity specialists should learn from these apps’ mistakes. Their data privacy practices need to improve, and their failures should be a wake-up call for all health care facilities to prioritize cybersecurity and data protection.
Personal health information is worth its weight in gold to criminal hackers, and mobile health apps expose millions of people to data leaks every day.
Mental health has become a buzzword, which is why these apps have become so successful. However, it’s essential to draw a line between genuine care and marketing, and for people to set reasonable expectations. Just as fancy technology can’t cure poor corporate culture, it will take a systemic change in society before underserved communities and other vulnerable individuals are completely safe using these apps.
Until those societal changes happen, it’s up to app developers to create resources that protect people in their most vulnerable moments. While these services are riding a wave of mental health empowerment, they have a long way to go before people should feel totally comfortable using them.