A look into mental health and prayer apps has uncovered a troubling disregard for user security and privacy. On Monday, Mozilla announced the results of a new study of these apps, which typically deal with sensitive topics including depression, mental health awareness, anxiety, domestic abuse, and PTSD, alongside religion-themed services.
According to Mozilla’s latest *Privacy Not Included guide, despite the sensitive information these apps handle, they “routinely exchange data, enable weak passwords, target vulnerable users with tailored adverts, and include ambiguous and poorly worded privacy rules.” Of the 32 mental health and religion apps Mozilla reviewed, 25 failed to meet the organization’s Minimum Security Standards.
The *Privacy Not Included reports are measured against these standards. Mozilla can downgrade a vendor’s product rating for mishandling, improperly sharing, or selling user data; ambiguous data management policies; a lack of encryption; weak password requirements; the absence of a clear vulnerability management program; and other poor security practices. Apps and services that fail to meet these baseline standards receive the “*Privacy Not Included” warning label.
Mental health and prayer apps have earned a distinction, but not one any developer would want. According to the organization:
“When it comes to protecting people’s privacy and security, mental health and prayer apps are worse than any other product category Mozilla researchers have reviewed over the past six years.”
The apps studied by the group include Talkspace, BetterHelp, Calm, Glorify, 7 Cups, Wysa, Headspace, and Better Stop Suicide. Each app now has its own entry in the guide, where users can read about its privacy and security ratings. The suicide prevention app Better Stop Suicide, for example, failed Mozilla’s review.
“Holy vague and messy privacy policy Batman! Better Stop Suicide’s privacy policy is bad,” says Mozilla. “Like, get a failing grade from your high school English teacher bad.”
The app collects personal data and states that users can contact the company with questions, yet it did not respond to Mozilla’s inquiries and never specified which “trusted partners” it shares data with. Of all the apps on the list, only PTSD Coach and the AI chatbot Wysa appeared to take data management and user privacy seriously.