5 Key Data Protection Issues for HealthTech
15 January 2021
The world of “healthtech” has grown at breakneck speed over the past few years. Amidst a global pandemic, further opportunities have arisen because many people have not been able to see a doctor or healthcare professional in person. Healthtech companies based in the UK raised the most funding in Europe during 2018 and 2019, and secured over £1 billion during 2020 (from data available via Dealroom).
Healthtech products can be of great help to people in their everyday lives: whether helping them stop smoking, tracking calorie intake, monitoring menstrual cycles or reminding people to take medication. But with those benefits come certain risks. Given the nature of the data being collected, it is vital that privacy issues are considered from the outset. Data protection must be embedded in the design of the product and the way in which it uses, and potentially shares, personal data.
Data relating to physical and mental health falls under the definition of “special category data” under GDPR (or the “UK GDPR” as is now the case) and is afforded additional safeguards as a result. While most people have a general understanding of what constitutes health related information, it can include some not so obvious areas such as the wider range of addictions that we now have awareness of. Information does not need to directly involve a healthcare professional or clinical treatment to be considered health related. Data relating to sex life, sexual orientation and genetics also fall within the definition of special category data, so the same issues are relevant where a product collects that information.
Here we look at some of the key areas of data protection to be aware of when your business is entering or growing in the healthtech sector.
Transparency
Central to all uses of personal data is the principle of transparency. Users should genuinely understand how their data is being used, who it is shared with, what rights they have and how long it will be kept. This is important for all personal data but is especially important here, as there are higher risks involved when someone is entrusting a company with holding their health data. As a starting point, the business itself must understand all the ways in which personal data is being used – otherwise how can it communicate them effectively to the user?
Applying a rule of exception can be useful here: bringing to the surface those non-obvious uses of personal data that a user may not expect to happen as a matter of course. These may include:
- Sharing data with third parties
- Individual profiling based on the data a user has submitted
- Data being combined with other datasets to provide a wider picture of the user’s characteristics
And what your business does with personal data must always be accessible and in plain language that people can understand, not hidden or described in technical or legal jargon.
Data Protection by Design and Default
GDPR created a requirement to embed data protection throughout the design and development of a product or service. So, from the outset, the privacy aspects of the development need to be considered (probably through a Data Protection Impact Assessment, see below) and risks must be mitigated appropriately, through measures that might include encryption and pseudonymisation. The “default” part of this provision is vital to bear in mind – any optional data sharing, profiling or other non-essential use of data must be turned off as standard, with the user able to choose whether they are happy to allow it.
The data minimisation principle also feeds in here (the requirement to only collect data that is relevant and necessary for providing the service). What will be mandatory for the user to provide? Could you provide the service without a certain piece of information about a user?
Detailed knowledge of how your product uses personal data is also now a prerequisite for uploading or updating apps on Apple’s App Store, including any interactions with third-party services that may involve sharing of personal data or an impact on a person’s privacy.
Data Protection Impact Assessments (DPIA)
DPIAs are mandatory (previously they were good practice) in certain circumstances. Most relevant to healthtech is where an organisation is collecting and using special categories of data (which includes physical and mental health data) on a “large scale”. The use of “new technologies” is also referenced within the legislation, so if you are looking to create a new and innovative healthtech product, a DPIA is almost certainly required.
There is no precise definition of what constitutes “large scale processing” but, if it is planned (or hoped) that a product will be taken up by thousands of users it would be advisable to conduct a DPIA to ensure that the risks relating to the use of the personal data are documented and managed. In any event, a DPIA is a helpful process to go through to ensure you have considered all relevant risks to users and have plans to mitigate them appropriately.
A DPIA must include:
- A description of the planned use of personal data – what does the product do and how does it use personal data?
- An assessment of the necessity and proportionality of the use of the data – can the need and amount of data collected be justified?
- An assessment of the risks to the rights and freedoms of data subjects – what is the impact on users if things go wrong? These risks will be increased where it relates to health data.
- The measures that are in place to address the risks identified – examples of mitigations may include data minimisation, pseudonymisation, clear user choice (where relevant) and enhanced security measures. How are these implemented and managed?
It is important to note that a DPIA is not a one-off box-ticking exercise: it should inform how products are designed and any changes that may be necessary. It should also be an ongoing process, reviewed and potentially updated as the product offering changes over time.
Data Sharing
Tied closely to transparency is clarity for users over the sharing of their personal data. While some uses of personal data may be obvious to a user, data sharing with third parties is one that is not always expected.
- Are there links to the NHS or other healthcare partners that need to be made clear?
- Do users have a choice in relation to the sharing and, if so, is this clear, specified, freely given and able to be withdrawn (to meet the conditions for valid consent)?
Appointing a Data Protection Officer?
Linked to the requirement to conduct a DPIA is the appointment of a statutory Data Protection Officer (DPO). This has the same threshold in relation to the “large scale” collection and use of health data. So, again, if you already have thousands of users using your product, or hope to have that many, you should consider whether a DPO must be appointed. There is no exemption for small companies here – the obligation turns on the amount and sensitivity of the data being used and the risks to individuals, not turnover or number of employees.
The DPO role is intended to be an independent role to advise an organisation on its management of data protection risks and GDPR makes it clear that the person fulfilling the role must not “receive any instructions” in relation to fulfilling the DPO role. It can be fulfilled internally or externally but, if the role is assigned to someone with an existing role, that role cannot conflict with their duty to advise impartially and objectively as a DPO. This is not always easy to fulfil, especially in smaller and growing companies. For example, a CEO or a product owner may have vested interests in using personal data in a particular way or protecting the company’s reputation, which can create a conflict with their role as an impartial DPO.
There is more to data protection management than this, but these are some of the key areas that are most relevant to the healthtech sector, and ones that often get missed in the early stages of developing or growing a product. Building this thinking in early on is the best way to protect users and the company, avoiding the reverse engineering of design decisions, which can be expensive and time consuming, in addition to the risk of enforcement action in the event of a breach or a complaint.