expert reaction to study on data sharing by popular health apps

Research published in The BMJ warns that the sharing of user data by popular health apps is routine and not transparent.

Professor Vladimiro Sassone, Roke/RAEng Research Chair in Cyber Security at University of Southampton, said:

“Although it is difficult to comment on the specific numbers provided by the authors without looking at the details of their research and validating their methodology, our own research at the University of Southampton suggests similar conclusions. Using the facilities at our IoT Cyber Security Testbed, our analysis shows that IoT devices across the board consistently infringe on users’ consent. Typically and regularly this includes undeclared third parties, communication with unlisted servers, and the like, which might indicate trading of sensitive user data without consent.

“On the other hand, communication between apps and their home base is often necessary for usability, performance, accounting and more, and does not necessarily imply a privacy breach, but just a privacy risk. This is also true for communication with third parties, which may simply provide suitably safe and private data storage. It all depends on how and what data is sent, processed and stored, and the specific user consent obtained by the developers. From the details made available about this research, it is not possible to rule out that this is the case for at least some of the apps.”
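The kind of traffic analysis described above can be illustrated with a minimal sketch, assuming we already have a list of destination hosts captured from an app and a list of the parties its privacy policy declares. All host names below are hypothetical and are not taken from the study:

```python
# Minimal sketch: flag traffic to parties an app never declared.
# All host names here are hypothetical, for illustration only; a real
# analysis would extract destinations from captured network traffic.

# Parties named in the app's privacy policy (hypothetical).
DECLARED_PARTIES = {"api.example-healthapp.com", "analytics.example-vendor.com"}

# Destination hosts observed while exercising the app (hypothetical).
observed_hosts = [
    "api.example-healthapp.com",     # the app's own back end
    "analytics.example-vendor.com",  # declared analytics provider
    "ads.undisclosed-tracker.net",   # appears in traffic but is declared nowhere
]

# Anything contacted but not declared is a potential consent infringement.
for host in observed_hosts:
    if host not in DECLARED_PARTIES:
        print(f"Undeclared destination: {host}")
```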

Dr Andrea Margheri, Research Fellow in the Cyber Security Group and Manager of the Cyber Security Research Academy, University of Southampton, said:

“In our experience, the question the authors raise in their conclusion about the fairness of the privacy cost of using health apps is too vague and unmeasurable to be helpful. There will surely be cases in which any privacy cost is well justified by the risk (e.g. to save lives). In all cases, our approach to cyber security at the University of Southampton is to base such assessment on proper and fit-for-purpose analysis of risk and individual circumstances, as well as informed user consent.”

Aisling Burnand MBE, Chief Executive of the Association of Medical Research Charities (AMRC), said:

“Patients need and want new tools to help them manage their conditions and live freer, more independent lives. But the public and patients expect their health and care data to be handled sensitively, responsibly and securely.

“This research shows that health app developers must be much more transparent in how they are using data. We encourage a ‘no surprises’ approach to how health data is used, as put forward by the National Data Guardian for health and social care. No one using a health app should be surprised by how their health and care data is used, and they should be offered a choice about how their data is used and shared. We also want to see more transparency around how and why digital health solutions generate the outcomes they do; this is particularly relevant to AI.

“If health apps and other technologies fail to be transparent, the risk is that people will become less engaged with them and may miss out on the benefits they offer.”

Dr Mirco Musolesi, Reader in Data Science, UCL & Turing Fellow at the Alan Turing Institute, said:

“The work underlines potential privacy risks related to this type of app, but, at the same time, I would like to point out the incredible benefits that these technologies can bring to society. Smartphone applications provide an extremely cost-effective way to deliver positive behavioural interventions and therapies at a time when national health service budgets are shrinking around the world and private healthcare costs are sky-rocketing. At the same time, these apps are extremely powerful tools for prevention and for improving our personal lifestyles. Data sharing practices should be clear and open, and users should provide informed consent for the use of their data. In this space, implementing effective mechanisms for providing informed consent is not easy and, for this reason, I think that this is first of all a problem of educating the general public, for example through schools. Regulation plays only a complementary role in this. But, again, we should be wary of thinking that health apps are evil per se. This paper highlights the risks, but several solutions for dealing with them are possible and easily deployable.”

Prof Gil McVean, Professor of Statistical Genetics & Director of the Big Data Institute, University of Oxford, said:

“This work demonstrates widespread, and largely opaque, sharing of personal data, including medical conditions and drug treatments, across health-related apps. The authors do not provide any evidence of wrongdoing or unethical behaviour, but they do show how behind-the-scenes sharing of information among a network of tech companies can potentially be used to create a detailed understanding of an individual’s health and activity. Although the authors note that the recent GDPR regulations have made a difference, and many of the findings are indirect, the work nevertheless emphasises the lack of transparency and genuinely informed consent that one might expect around such a sensitive area.”

Dr Peter Bannister, biomedical engineer and Exec Chair of the Institution of Engineering and Technology Healthcare Sector, said:

“The research references examples where there has been deliberate sharing of data, for example to a health insurance provider, but then provides a much broader representative view of what could perhaps charitably be described as ‘incidental’ sharing of data by many of today’s health apps. While regulators are still adapting their guidance to the specific needs of this growing market, this doesn’t obviate manufacturers’ responsibilities to comply with prevailing guidelines for data privacy (e.g. GDPR) and software as a medical device. Even in cases where data is not shared with another unauthorised entity, the numerous examples of key identifying information (such as email addresses) being exposed by apps create a heightened risk that anonymised health data stored elsewhere on the user’s phone or in their linked cloud storage could be re-identified. This is particularly concerning in the context of personalised, preventative treatment pathways, which rely heavily on patients collecting and storing their own longitudinal health records.”

Prof Alan Woodward, Visiting Professor of Computing, University of Surrey, said:

“Medical apps are becoming increasingly popular yet, as this research shows, users still have little understanding of how the data they entrust to these apps is being shared. The research showed that apps were particularly poor at transparently informing users so they could make an informed choice.

“The research shows that many of the apps were free (albeit with in-app purchases), but as we know from all sorts of technologies, if you’re not the paying customer you are the product. In this case what is being shared is data that might be considered among your most private.

“Whilst many apps attempt to anonymise the data as a means of justifying why they are able to share the user’s data, it has long been apparent that when data is shared in large volumes, as here, it is possible to deanonymise it. The fact that this can be done with such personally sensitive data is, frankly, disturbing. Some of the parties the research identified as recipients of the data would find it almost trivial to deanonymise those data.

“Sharing medical data could potentially lead to significant advances in medicine. Epidemiology has been one of the great tools in identifying causes and cures in the past. However, this data is so sensitive that any sharing must be done under the strictest guidelines, with oversight and, where there is any doubt, with total transparency to the user so they can give their explicit consent. These findings suggest that medical apps are right back in the bad old days of the web, when data was being shared without users’ knowledge, for purposes that no one had control over, and retained for use in progressively targeting individuals for various purposes. I suspect some of the apps identified in this research may fall foul of the GDPR, and if they don’t there is a case for extending the regulations.

“This research throws into stark relief a subject that really needs a lot more public debate. Yes, it can be beneficial to share medical data, but how is it to be done, by whom, and who oversees the process? The study results suggest this is happening through profit-driven evolution rather than because of any medical benefit it might accrue.”
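Prof Woodward’s deanonymisation point can be made concrete with a minimal sketch, assuming an ‘anonymised’ health dataset and a second, identified dataset that share a few quasi-identifiers (an advertising ID, year of birth and partial postcode). Every record and field name below is invented for illustration; this is not the study’s method:

```python
# Hypothetical illustration of linkage re-identification: joining an
# "anonymised" health dataset to an identified marketing dataset on
# shared quasi-identifiers. All records below are invented.

health_records = [  # no names, but quasi-identifiers remain
    {"ad_id": "A-17", "birth_year": 1984, "postcode": "SO17", "condition": "depression"},
    {"ad_id": "B-42", "birth_year": 1990, "postcode": "GU2", "condition": "asthma"},
]

marketing_records = [  # a second dataset that does carry identities
    {"ad_id": "A-17", "birth_year": 1984, "postcode": "SO17", "name": "Jane Example"},
]

# Index the identified dataset by its quasi-identifiers, then join.
index = {(r["ad_id"], r["birth_year"], r["postcode"]): r["name"]
         for r in marketing_records}
for record in health_records:
    key = (record["ad_id"], record["birth_year"], record["postcode"])
    if key in index:
        print(f"Re-identified {index[key]}: {record['condition']}")
```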

Prof Nello Cristianini, Chair of Artificial Intelligence, University of Bristol, said:

“This study reminds us that apps can communicate personal information to developers and dozens of third parties along the way. This information can be used in a variety of ways. This is a powerful reminder of what is technically possible today, and why we need clear and transparent regulation of this domain. Regulation is the only way to give trust to users, and clarity to developers. GDPR is a good starting point; we can build on it.”

Dr Daniel Leightley, Postdoctoral Research Associate, King’s College London (KCL), said:

“This is an important study that has highlighted the issues surrounding data leakage in popular healthcare apps. But it’s important to remember that this study was conducted before the implementation of GDPR, with a follow-up finding that many apps were transparent after this date. A key outcome of this study is that developers should seek to develop mobile apps with data minimisation in mind, that is, collecting only the most basic information needed to deliver the service.

Does the press release accurately reflect the science?

“Yes, however we must be mindful that the geographical and geopolitical landscape does impact the interpretation of the findings. To an extent we’re comparing apples and oranges when it comes to healthcare apps from across the world, as many countries have different legal frameworks, and many apps lack local approval. Therefore, the extent to which these issues reflect a global problem is in question – it would be more appropriate to draw comparisons on a national rather than international scale.

Is this good quality research?  Are the conclusions backed up by solid data?

“The work is novel in its approach; however, by creating only four user personas, without justification as to how or why they were created, it does skew the findings. This study was not able to explore iOS apps. For alcohol misuse alone there are over 600 apps, many lacking any evidence base, but initiatives such as the NHS Digital App Store in the UK are helping to create trust between developers and end-users. It is important to remember that some of the network traffic observed is analytics data recorded to help developers understand how users are using the app. The problem is that we don’t know what the data aggregators are doing with the data. Big companies like Google and Facebook could be using this information to develop fingerprints of users.

How does this work fit with the existing evidence?

“The findings replicate what we already know. While apps have become more transparent with the introduction of GDPR, there is a lack of understanding and research on how best to inform users of what is being collected and shared.

Have the authors accounted for confounders?  Are there important limitations to be aware of?

“It is important to be aware that the healthcare systems of the UK, US, Canada and Australia, while similar in design, are different, and the authors should take this into account.

What are the implications in the real world?  Is there any overspeculation? 

“This study represents a small number of mobile medicines-related apps and caution is strongly advised. In the UK, NHS Digital has gone to great lengths to review and approve apps via the NHS Digital App Store to ensure they’re user-friendly and compliant. Developers need to consider app usability and acceptability during development, specifically around the issue of making privacy sharing and consent visible to users.”
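Dr Leightley’s data-minimisation point lends itself to a minimal sketch: keep an explicit whitelist of the fields a feature genuinely needs, and strip everything else before any record leaves the device. The field names below are hypothetical, for illustration only:

```python
# Hypothetical sketch of data minimisation: transmit only the fields
# a feature strictly needs to deliver the service.

REQUIRED_FIELDS = {"medication_name", "dose_mg", "schedule"}  # hypothetical

def minimise(record: dict) -> dict:
    """Drop every field not strictly needed to deliver the service."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

raw = {
    "medication_name": "metformin",
    "dose_mg": 500,
    "schedule": "twice daily",
    "email": "user@example.com",  # not needed for the feature: stripped
    "advertising_id": "A-17",     # not needed for the feature: stripped
}
print(minimise(raw))  # only the three whitelisted fields survive
```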

Dr Oliver Buckley, Lecturer in Software Engineering, University of East Anglia (UEA), said:

“Data is an increasingly valuable commodity, and this is often the way that ‘free’ apps are paid for. This kind of data sharing is becoming more common, but as this study shows, the information being shared is not always obvious. One of the biggest problems is that users have become accustomed to not reading terms and conditions or privacy agreements. We’re asked to agree to so many things now that it would be impossible to actively read and understand exactly what we are agreeing to, especially when simply agreeing is far more convenient than finding an alternative app.

“The study focuses on the Android app store; it would be interesting to see if the results were the same on the Apple App Store, as there is a more rigorous approval process to become an Apple developer.”

Professor Delaram Kahrobaei, Chair of Cyber Security, University of York’s Department of Computer Science, said:

“This study highlights an important issue for health professionals and mobile app users. Mobile health apps have long legal documents that users ‘accept’ in order to use the app, but these don’t actually address security protocols. When people use apps on a mobile, invariably they lose their privacy, but gain some comfort by ‘accepting’ the legal framework.

“This study, and many others like it, provides scientists with the opportunity to see some of the issues that come with the big data that people give away every day. New computational techniques, for example, can identify people by combining just a few snippets of data.

“There are ways to protect the private information hidden in big data files, but they limit what scientists can learn about important issues, such as public health, so a balance must be struck between privacy and access to valuable data that could help improve lives.  

“Some medical researchers acknowledge that keeping patient data private is becoming almost impossible; instead, they’re testing new ways to gain patients’ trust and collaboration. Meanwhile, how we think and feel about privacy evolves over time and this study demonstrates that it is important to keep pace with this in order to best serve the consumer.” 
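One family of protections of the kind Prof Kahrobaei alludes to adds calibrated random noise to aggregate statistics before they are released, as in differential privacy; the accuracy lost to that noise is precisely the privacy/utility balance she describes. A minimal sketch, with invented numbers that are not from the study:

```python
# Minimal differential-privacy-style sketch: release a patient count
# with Laplace noise so no individual's presence is revealed exactly.
# The count and epsilon below are invented, for illustration only.
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(true_count: int, epsilon: float) -> float:
    """Smaller epsilon means stronger privacy but a less accurate answer."""
    return true_count + laplace_noise(1.0 / epsilon)

# e.g. a true count of 120 patients released with epsilon = 0.5
print(noisy_count(120, epsilon=0.5))
```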

‘Data sharing practices of medicines related apps and the mobile ecosystem: traffic, content, and network analysis’ by Grundy et al. was published in The BMJ at 23:30 UK time on Wednesday 20 March.

Declared interests

Prof Gil McVean: “I have no COIs”

Dr Peter Bannister: “I am a Technical Advisor to the Health Data Exchange (HDX), a start-up which leverages digi.me privacy software to broker direct connections between patients and clinical research organizations without HDX itself ever accessing medical records and other app-derived phenotypic data stored on the patient’s device.”

Prof Alan Woodward: “No conflicts.”

Prof Nello Cristianini: “No COIs.”

Dr Daniel Leightley: “I have no conflicts of interest.”

Dr Oliver Buckley: “I have no conflicts of interest.”

Prof Delaram Kahrobaei: “No conflict to declare.”
