
That mental health app could share your data without telling you



A recent study reports that free apps marketed to people with depression or people who want to quit smoking are hemorrhaging user data to third parties such as Facebook and Google, but often don't admit it in their privacy policies. The study is the latest to highlight the potential risks of entrusting sensitive health information to our phones.

Most depression and smoking-cessation apps that are easily found in the Android and iOS stores share data, but only a fraction of them disclose it. The findings add to a growing list of troubling revelations about what apps do with the health information we entrust to them. A recent Wall Street Journal investigation, for example, revealed that a period-tracking app was sharing users' data with Facebook. And earlier studies have reported health apps with security flaws, or apps that shared data with advertisers and analytics companies.

In the new study, published on Friday in the journal JAMA Network Open, researchers searched for apps using the keywords "depression" and "quit smoking." They then downloaded the apps and checked whether data was being shared by intercepting each app's traffic. None of the data being shared could immediately identify the user, and none of it was strictly medical. But 33 of the 36 apps shared information that could give advertisers or data-analytics companies insight into people's digital behavior. And a few shared very sensitive information, such as health diary entries, self-reports of substance use, and usernames.
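The paper doesn't publish its interception tooling, but the general approach, routing a test phone's traffic through a man-in-the-middle proxy and logging which third-party hosts each app contacts, can be sketched with a tool like mitmproxy. The addon below is a hypothetical illustration rather than the authors' code; the tracker-domain list and log format are assumptions.

    # traffic_audit.py: a minimal mitmproxy addon that flags requests an app
    # under test makes to known third-party endpoints.
    # Run with:  mitmdump -s traffic_audit.py
    # then point the test phone's Wi-Fi proxy settings at this machine.
    from mitmproxy import http

    # Hypothetical examples of third-party hosts to watch for; a real audit
    # would use a maintained tracker list.
    TRACKER_DOMAINS = (
        "graph.facebook.com",
        "google-analytics.com",
        "doubleclick.net",
    )

    def request(flow: http.HTTPFlow) -> None:
        """mitmproxy calls this for every intercepted HTTP(S) request."""
        host = flow.request.pretty_host
        if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS):
            # Log where the data is going so a reviewer can inspect what
            # the app sends (query strings, form fields, JSON bodies).
            print(f"[third-party] {host} {flow.request.path}")

Inspecting HTTPS traffic this way also requires installing the proxy's CA certificate on the test device, which is why this kind of audit is run on phones the researchers control.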

Details like these, plus the name or type of the app itself, could give third parties information about someone's mental health that the person may want to keep private. "Knowing that a user has a mental health or quit-smoking app downloaded to their phone is valuable health-related information," says Quinn Grundy, an assistant professor at the University of Toronto who studies corporate influences on health and was not involved in the study, in an email to The Verge.

"It's really hard to make informed decisions about using an app if you don't know who will have access to information about you," says John Torous, director of the digital psychiatry division at Beth Israel Deaconess Medical Center and co-author of the new study. That's why he and a team from the University of New South Wales in Sydney conducted the study. "It's important to trust but verify, and to track where your health care data goes," Torous says.

By intercepting the data transmissions, the researchers discovered that 92 percent of the 36 apps shared data with at least one third party, mostly services run by Facebook and Google that help with marketing, advertising, or data analytics. (Facebook and Google did not immediately respond to requests for comment.) But roughly half of those apps did not disclose the third-party data sharing, for a few different reasons: nine apps had no privacy policy at all; five apps had one but did not say the data would be shared this way; and three apps said this type of data sharing would not happen. Those last three are the ones that stood out to Steven Chan, a physician at the Veterans Affairs Palo Alto Health Care System, who has worked with Torous in the past but was not involved in the new study. "They're basically lying," he says of the apps.

The researchers don't know what these third parties actually do with the user data. "We live in an age where, with enough breadcrumbs, it's possible to re-identify people," Torous says. It's also possible that the breadcrumbs just sit there untouched, he says, but for now, there's no way to tell. "What happens to these digital data is a bit of a mystery." Potential advertisers could use the information in ways that compromise someone's privacy and treatment decisions, Chan says. For example, what happens if an advertiser learns that someone is trying to quit smoking? "Maybe if someone's interested in quitting smoking, would they be interested in e-cigarettes?" Chan says. "Or could they potentially be introduced to other, similar products, like alcohol?"

Part of the problem is the business model for free apps, the study's authors write: since insurance may not pay for an app that helps users quit smoking, for example, the only way a developer can stay afloat may be to sell subscriptions or to sell data. And if an app qualifies as a wellness tool, developers can sidestep the laws designed to keep medical information private.

So Torous recommends being careful before sharing sensitive information with an app. "I think it means you want to think twice and say: do I trust whoever made this app, and do I understand where this data is going?" Some quick checks might include making sure the app has a privacy policy, that it has been updated recently, and that it comes from a trustworthy source such as a medical center or a government agency. "None of those things will guarantee a good outcome, but they probably help you make a judgment," he says.

In the long term, one way to protect people who want to use health and wellness apps could be to form a group that grants a seal of approval to responsible mental health apps, Chan says. "Something like having FDA approval for things, or the FAA certifying a particular aircraft for safety," he says. But for now, if you're an app user, be careful. "When there are no such institutions, or the institutions themselves aren't doing a good job, it means we need to invest more in them as a public good."

