This confusion stems from one fundamental issue: there are two regulatory regimes governing the use of health data that are inconsistent with one another, but nevertheless overlap:
- on the one hand, there is the traditional healthcare regulatory framework. This includes the common law duty of confidentiality (which may apply to patient data), clinical trial legislation and the regulation of medical devices and pharmaceuticals.
- separate from that, there are legal concepts traditionally applied to regulating big data and big tech. These appear in data protection legislation like the GDPR (and now the GDPR as incorporated into UK domestic law). The GDPR employs concepts like data controllers and data processors, which were developed and cultivated entirely outside the healthcare context and were originally designed by legislators with quite simple supplier-customer relationships in mind. These 'black-and-white' concepts do not quite work in healthcare, where there are multiple players with nuanced roles, such as healthcare providers, researchers and developers, manufacturers and distributors.
It's time to talk about the elephant in the room. This dichotomy is at the heart of most, if not all, of the misunderstandings around the regulation of health data. As a result, we find that many innovators (and even NHS organisations) veer between two extremes:
- being far too risk-averse with their use of health data, sitting on rich datasets which harbour huge possibilities, but perceiving that the regulatory environment is too prohibitive to permit them to use that dataset to its full potential; and
- being far too cavalier, inviting significant risk and regulatory scrutiny.
There is huge potential for regulatory guidance in this space to clarify this intersection between these two regimes. Such guidance would need to involve multiple stakeholders, given the interplay of regulatory regimes, including the National Data Guardian, the Information Commissioner's Office, NHSX, the Medicines and Healthcare products Regulatory Agency, and the Health Research Authority.
The volume of soft guidance is growing exponentially in the health data sphere, but we urge policymakers to focus on streamlining guidance by considering the full depth of regulatory regimes that apply to health data in the UK from the outset. The piecemeal approach of considering confidentiality, data privacy and product regulation in isolation is not working - it is creating a complex web of laws and soft guidance that is impossible for innovators to navigate. This is an opportune moment for regulators to create a harmonised, consistent regime for data-driven innovation in the life sciences industry.
What should be on the agenda for policymakers?
1. The different thresholds for anonymisation
Developers and researchers often request access to 'anonymised' datasets in order to develop (for example) a new AI algorithm with a diagnostic function, or as part of a registry-based study. The problem is that thresholds for anonymisation between the GDPR and the common law duty of confidentiality are very different. We constantly see innovators and NHS organisations get this issue wrong because they conflate the 'confidentiality' standard for anonymisation with the 'GDPR' standard:
- Truly anonymous information falls outside the remit of the GDPR and its compliance obligations, making it an attractive concept for researchers. However, anonymisation under the GDPR is a high bar and difficult to achieve in practice. It involves removing personal identifiers, both direct and indirect, that may lead to an individual being identified.1 It is often difficult to argue that medical datasets are ever truly 'anonymised' for GDPR purposes.
- The GDPR position is more stringent than the traditional understanding of the common law duty of confidentiality. Traditionally, researchers in the health space have assumed that removing certain key identifiers (such as name, address, DOB, etc.) will be sufficient to 'anonymise' a dataset for medical confidentiality purposes.
- Often, data considered 'anonymised' for confidentiality purposes is actually 'pseudonymised' data for GDPR purposes. Pseudonymised data is data from which key identifiers have been removed, so that the data can no longer be attributed to a specific individual without the use of additional information.2 This additional information must be kept separately and be subject to technical and organisational measures to ensure non-attribution to any individual. The key takeaway is that pseudonymised data is still personal data subject to the GDPR.
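For the technically minded, the distinction can be made concrete with a short sketch. This is purely illustrative: the record fields, the key name and the hashing approach are our own assumptions, not a prescribed method from any regulator.

```python
import hashlib
import hmac

# Hypothetical patient record; field names are illustrative only.
record = {"name": "Jane Doe", "nhs_number": "000 000 0000", "diagnosis": "T2 diabetes"}

# The 'additional information' of Article 4(5): a key that must be stored
# separately, under technical and organisational safeguards.
SECRET_KEY = b"keep-this-stored-separately"

def pseudonymise(rec, key):
    """Replace direct identifiers with a keyed pseudonym.

    Using a keyed hash (HMAC) means re-linking the data to the individual
    requires the key, which is held apart from the dataset itself.
    """
    pseudonym = hmac.new(key, rec["nhs_number"].encode(), hashlib.sha256).hexdigest()[:12]
    return {"pseudonym": pseudonym, "diagnosis": rec["diagnosis"]}

pseudo = pseudonymise(record, SECRET_KEY)
# 'pseudo' contains no direct identifiers, yet it remains personal data
# under the GDPR: anyone holding SECRET_KEY can re-attribute it.
```

The point of the sketch is the last comment: stripping the identifiers does not take the dataset outside the GDPR, because the separately held key still permits re-identification.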
So what's needed from policymakers?
- We would welcome guidance on the thresholds for anonymisation that takes into account both the GDPR and the common law duty of confidentiality.
- We urge policymakers to consider the status of medical datasets where key identifiers are removed in greater granularity: when are medical datasets truly 'anonymous' and when are they 'pseudonymised'?
- Most importantly, if there is a risk that an innovator is accessing personal data, we need clear guidance on issues such as legal bases for processing and transparency under the GDPR, which leads us to our next point.
2. Consent, legal bases and the messy intersection between the GDPR and the common law duty of confidentiality
In the life sciences industry, we are very familiar with the concept of consent. However, our familiarity with consent is having unintended consequences: innovators conflate the consents required for confidentiality purposes, or for clinical investigations and interventions, with GDPR consent (often with the result of stifling innovation).
In the healthcare context, when an innovator perceives a requirement for consent, it is always worth stepping back and considering where that requirement for consent is coming from:
- Under the common law duty of confidentiality, healthcare professionals may only disclose confidential patient information outside the direct care setting on the basis of consent or certain other statutory grounds.3 This 'confidentiality consent' is a relatively low standard of consent (at least when compared to the GDPR); the wording can be quite generic but still be sufficient to permit disclosure of data.
- Separately, there may also be a regulatory requirement for consent. A prime example is the requirement for the 'informed consent' of clinical investigation participants.4
- However, this is very different from the GDPR position. Under the GDPR, all processing of personal data requires a legal basis under Article 6. An additional ground is required under Article 9 when processing a special category of data, such as health data or genetic data. It is true that consent appears as a ground under Article 6, and explicit consent is a potential ground under Article 9 of the GDPR.5 However, the key point is that GDPR consent is only one of several grounds which may be available to innovators, even in the life sciences industry. There is a range of other potential grounds, which are far wider than those available in the confidentiality context.
Alternatives to GDPR consent
These alternative grounds are very useful. Under Article 6, grounds include: legitimate interests;6 performance of a contract;7 and compliance with a legal obligation.8
Article 9 grounds include processing for:
- scientific research purposes;9
- public interest in the area of public health, such as ensuring high standards of quality and safety of health care and of medicinal products or medical devices;10 and
- medical diagnosis and the provision of health or social care or treatment.11
At the same time, these grounds ensure data privacy principles are respected, such as requirements to ensure processing is only conducted if 'proportionate' and subject to 'suitable and specific' safeguards.12 Certain grounds for processing must be on the basis of law, or pursuant to a contract with a health professional, or subject to 'professional secrecy'.13 The Data Protection Act 2018 sets out further safeguards and hurdles when relying on the above Article 9 grounds.14 At all times, controllers need to ensure they process only the minimum personal data necessary to fulfil their purpose (the 'data minimisation' principle).15
Further, these alternative grounds do not come with the burden of obtaining GDPR consent (which is a very high bar, may not always be practicable, and may be withdrawn by data subjects).
We see that these alternative grounds are under-used, even though they facilitate more 'friction-free' use of data whilst maintaining robust data privacy safeguards. Innovators lean heavily on GDPR consent, conflating the requirement for consent in other contexts (such as under the common law duty of confidentiality) with GDPR legal bases for processing. They mistakenly believe that they require GDPR consent in order to use personal data throughout the product lifecycle, such as for post-market surveillance, clinical follow-up or scientific research. As a result, they are reluctant to maximise the use of their datasets, given that often, GDPR consent has not been obtained. Innovators do not appreciate that alternative (and less onerous) legal bases are already available to them under the GDPR – innovators are just in need of guidance and clarity that they can use these alternative grounds for their selected purposes.
So what's needed from policymakers?
- We would welcome guidance outlining how GDPR legal bases for processing align with use cases that are fundamental to the development of data-driven innovation in the life sciences.
- We urge policymakers to focus on key areas of the product lifecycle, such as post-market surveillance, clinical follow-up and scientific research. These should be mapped against the various legal bases described in both Articles 6 and 9 of the GDPR. In the highly complex lifecycle of medicines and medical devices, this is the level of granularity required to foster an innovative ecosystem.
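The mapping exercise we describe above could, purely by way of illustration, take the shape of a simple lookup from lifecycle activity to candidate legal bases. The pairings below are our own assumptions about bases that might commonly be considered for each activity; they are a sketch of the format such guidance could take, not legal advice or regulatory guidance.

```python
# Illustrative only: candidate GDPR legal bases per product-lifecycle activity.
# The pairings are assumptions made for the sake of this sketch.
CANDIDATE_BASES = {
    "post-market surveillance": {
        "article_6": "6(1)(c) compliance with a legal obligation",
        "article_9": "9(2)(i) public interest in the area of public health",
    },
    "clinical follow-up": {
        "article_6": "6(1)(f) legitimate interests",
        "article_9": "9(2)(h) provision of health or social care or treatment",
    },
    "scientific research": {
        "article_6": "6(1)(f) legitimate interests",
        "article_9": "9(2)(j) scientific research purposes",
    },
}

def candidate_bases(activity):
    """Look up the candidate Article 6 and Article 9 bases for an activity."""
    return CANDIDATE_BASES.get(activity.lower())
```

Guidance published at this level of granularity, activity by activity, is what would let innovators move beyond their default reliance on GDPR consent.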
This is why we have contributed to the Goldacre Review, a review launched by the government this year to focus on the more efficient and safe use of health data for research and analysis, to complement the forthcoming Data Strategy for Health and Social Care. We urge Dr Ben Goldacre to cut bureaucracy and streamline health data governance in the UK.
Do these issues impact your organisation? Would you like to discuss any of the above?
If so, don't hesitate to reach out using our contact details.
1 Recital 26, GDPR
2 Article 4(5), GDPR
3 For example, section 251 support under the National Health Service Act 2006
4 Regulation 104, Medical Devices Regulations (SI 2002 No 618, as amended)
5 Articles 6(1)(a) and 9(2)(a), GDPR
6 Article 6(1)(f), GDPR
7 Article 6(1)(b), GDPR
8 Article 6(1)(c), GDPR
9 Article 9(2)(j), GDPR
10 Article 9(2)(i), GDPR
11 Article 9(2)(h), GDPR
12 Article 9(2)(j), GDPR
13 Article 9(2)(h), together with Article 9(3); Articles 9(2)(i) and (j), GDPR
14 Sections 10 and 11 of the Data Protection Act 2018; Schedule 1 of the Data Protection Act 2018
15 Article 5(1)(c), GDPR