Mental health apps and data privacy: What end-users and HR must know

Mental Health
Team Intellect
Sep 9, 2022



It’s no secret that mental health garnered special attention following the COVID-19 pandemic.

Overnight, changes in our personal and professional lives unveiled mental health issues that had been swept under the rug for years. The stigma surrounding them has also prevented people from seeking help, “specifically in Asia,” according to Intellect CTO Anurag Chutani.

“People were unaware of how to deal with mental health, what it is, and how they can work on it,” he explains. 

Against this backdrop, interest in mental health apps has grown by over 500% since the start of the pandemic, according to a Deloitte study. This demand has drawn more mental health apps into the market and prompted organisations to offer them in support of employee mental wellbeing.

However, data privacy remains a challenge for individuals and organisations alike. The implications can be far-reaching in the context of mental health apps, which may hold highly personal information about a user’s moods, mental health triggers, anxieties, and stressors.

Data privacy risks of mental health apps 

According to an analysis by Mozilla researchers, several well-known mental health apps were found to have poor privacy and security policies in place.

Of the 32 apps evaluated, 29 were flagged with a “privacy not included” warning, indicating issues with how user data is managed.

The flagged applications often allowed weak passwords, and some privacy policies explicitly covered only the companies’ websites, not the apps themselves. Some apps stated that they sold data to third parties only with explicit consent, while others disclosed that they share information with third parties such as health insurers or health plan administrators.

Other privacy policies were simply too short and failed to cover important points, such as how data is collected and shared. Mozilla also flagged apps that let users log in or create profiles through third-party platforms, like Facebook or Google Mail, which opens them up to additional vulnerabilities.

How Intellect protects your data


Employees and employers may wonder: Is my data protected? Who can see the data I share on these platforms? 

“If you talk about employees’ concerns,” says Chutani, “their fear is, if we are sharing details with HR managers, it can affect their career progression.”

A survey by Paychex revealed that more than half of respondents were afraid to talk to their managers about mental health for fear of being fired, furloughed, or passed over for a promotion. The result of tiptoeing around the problem? Lower employee motivation, morale, and productivity, and higher stress.

Confidentiality, then, takes centre stage when choosing the right mental health app for your organisation. When dealing with employees’ mental health concerns, companies must operate on trust, ensuring that information shared with providers is not visible to others. 

“[Users] can reach out or access our privacy policy and go through our methods: what we are doing and how we are doing it,” says Chutani of Intellect’s efforts to keep a tight lock on data privacy.

The most important safeguards, he highlights, are Intellect’s Zero-Knowledge Encryption technology, multiple layers of cybersecurity, and the company’s compliance with Singapore’s Personal Data Protection Act (PDPA).

Chutani describes the PDPA as “one of the strictest data privacy laws in the region… and helped [Intellect] build trust between employees and the company.” The act was created to protect personal data from misuse and to foster trust between individuals and organisations.

At the same time, Intellect’s Zero-Knowledge Encryption policy keeps users’ data completely private not only from their employers but also from the engineers behind Intellect’s services. According to an article by Website Rating, “this level of security means that only you have the keys to access your stored data.”

This is a unique selling point for Intellect, ensuring that individuals can use its services safely without risk of data leakage. Chutani explains that “[employees’] personal feelings, anything which they enter into the application, is encrypted on the device itself and even we (Intellect) cannot access it.”
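Intellect’s actual implementation is not detailed in this article, but the general idea behind client-side (“zero-knowledge”) encryption can be sketched as follows: the key is derived from a secret that never leaves the user’s device, so the server stores only ciphertext it cannot read. Everything below is hypothetical and for illustration only; the keystream construction is a deliberately simple toy, not production-grade cryptography.

```python
# Toy sketch of client-side encryption: the passphrase (and the key derived
# from it) never leaves the device, so the server only ever sees ciphertext.
# NOT production cryptography -- a real app would use a vetted library.
import hashlib
import secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Slow key derivation so the passphrase can't be brute-forced cheaply
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # SHA-256 in counter mode as a toy keystream generator
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(passphrase: str, plaintext: bytes) -> dict:
    salt, nonce = secrets.token_bytes(16), secrets.token_bytes(16)
    key = derive_key(passphrase, salt)
    cipher = bytes(a ^ b for a, b in
                   zip(plaintext, keystream(key, nonce, len(plaintext))))
    # Only salt, nonce, and ciphertext are uploaded -- none reveal the entry
    return {"salt": salt, "nonce": nonce, "ciphertext": cipher}

def decrypt(passphrase: str, blob: dict) -> bytes:
    key = derive_key(passphrase, blob["salt"])
    stream = keystream(key, blob["nonce"], len(blob["ciphertext"]))
    return bytes(a ^ b for a, b in zip(blob["ciphertext"], stream))

entry = b"journal: felt anxious before the meeting"
blob = encrypt("user-passphrase", entry)
assert blob["ciphertext"] != entry                 # server sees only ciphertext
assert decrypt("user-passphrase", blob) == entry   # only the key holder can read it
```

The key point is architectural rather than cryptographic: because encryption happens before anything is uploaded, neither the employer nor the service’s own engineers can recover the plaintext from what is stored.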

“If you talk about best practices,” he adds, “we are not sharing any individual reports with any of the managers or the HR of the company which we are dealing with.” 

Mental health apps and data privacy

With any new product or service, end-users have to pay attention to two things: the personal data required of them, and how the company intends to use it.

Mozilla’s researchers have also noted that an app’s trustworthiness correlates with the responsiveness of its user support team. When in doubt, check whether that team is open to answering questions about data privacy.

In a workplace that advocates mental wellbeing, employees must be assured of both psychological safety and confidentiality. On top of partnering with a company that prioritises data protection, Chutani says that leaders can “set an example themselves by using Intellect’s coaching in [their] day-to-day function”.

This sends a message to team members that their data is well protected. 

Get started on your journey to a better, safer, and more open work environment with Intellect
