
Is the use of contact tracing apps the answer for organisations to get out of lockdown?

29 April 2020

Contact tracing apps are gaining momentum as a possible way out of lockdown. However, their success will ultimately turn on the extent to which they are installed and used correctly.

In an age where people are increasingly concerned about how their data is used, millions of us are going to need to trust these apps and follow the advice they provide. To earn that trust, it is imperative that organisations wishing to deploy them maintain high standards of privacy compliance, security and ethics.

This article outlines the developments so far, as well as possible data & privacy and employment issues stemming from Covid-19 tracing apps.

What are contact tracing apps?

Lockdown restrictions have been a key part of the fight against coronavirus over the past few months, but now discussion is focussing on how countries can safely ease the restrictions. Digital contact tracing is being posited as a tool that could help reduce the risk of a ‘second wave’ of Covid-19 transmissions once lockdown measures are reduced.

Contact tracing – at its core – involves tracking who has been in contact with a person infected with Covid-19. Such persons are then notified, and appropriate steps can be taken – such as self-isolation or testing. The goal is containment of the virus and breaking infection chains.  Traditional contact tracing is labour intensive and requires health authorities to work with infected individuals to identify and notify persons with whom they have had close contact. Digital contact tracing is an attempt to digitalise and automate this process, normally through mobile phone apps, to allow for a quicker and more widespread solution.

Google and Apple have put forward a plan for using Bluetooth Low Energy (a low-power form of Bluetooth designed to run continuously in the background) to establish a contact-tracing framework (‘CTF’), allowing links to be made between phones that have been in close proximity to each other.

The idea is that each phone will collect identifiers (random cryptographic tokens not associated with any location or identity) from devices within a certain range, which would then be stored on the phone for 14 days.  Then, if someone is told they have contracted the virus or believe they have symptoms, they can notify the app, giving permission to share their last 14 days of tokens. Following this, the app would be able to send a notification to the users of devices that have been in contact with that person’s device.
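To make this flow more concrete, the short sketch below shows, in deliberately simplified Python, how a rolling 14-day store of random tokens and the matching step might look. It is illustrative only: the names (new_token, ContactLog and so on) are hypothetical and do not reflect the actual Apple/Google framework or any real app.

```python
import os
import time

RETENTION_SECONDS = 14 * 24 * 60 * 60  # tokens are kept for 14 days, as described above


def new_token():
    """Generate a random, meaningless identifier to broadcast over Bluetooth.

    The token carries no location or identity information; it is simply
    16 random bytes, rotated regularly.
    """
    return os.urandom(16)


class ContactLog:
    """Rolling on-device store of tokens heard from nearby phones (hypothetical)."""

    def __init__(self):
        self._seen = []  # list of (timestamp, token) pairs

    def record(self, token, now=None):
        """Store a token received from a nearby device."""
        now = time.time() if now is None else now
        self._prune(now)
        self._seen.append((now, token))

    def _prune(self, now):
        """Discard anything older than the 14-day window."""
        self._seen = [(t, tok) for (t, tok) in self._seen if now - t <= RETENTION_SECONDS]

    def matches(self, shared_tokens):
        """Check the local log against tokens voluntarily shared by infected users.

        In the decentralised model this comparison happens on the handset,
        so only a yes/no result ever needs to leave the device.
        """
        shared = set(shared_tokens)
        return any(tok in shared for (_, tok) in self._seen)
```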

The plans come with some restrictions to protect user privacy – for example, the CTF does not track location (unlike GPS tracking), and the matching is done in a decentralised manner (i.e. the matching process takes place on user handsets rather than on a centralised server), limiting the use of the data for wider analysis and preventing combination with other health records. The UK’s ICO has given cautious approval of this framework, noting that the proposals “appear aligned with the principles of data protection by design and by default”, with the main focus of its comments on the use of cryptographic tokens to keep identification risk low.

How are governments and health authorities using these apps?

Most major governments and health authorities have outlined plans to use such apps to tackle Covid-19 – for example the NHS has confirmed it will release the NHSX app, where people can self-report symptoms of the virus, as well as receive advice on what action to take if they have been near someone with symptoms. The NHS has also indicated that future iterations of the app will enable people to “choose to provide the NHS with extra information about themselves to help us identify hotspots and trends.” The NHS has been working closely with the ICO, as well as the National Data Guardian’s Panel and the Centre for Data Ethics and Innovation to ensure the app meets data privacy and ethics standards.

Interestingly, although the NHS will continue to rely on Bluetooth technology, it has, along with the French health authorities, rejected the CTF and adopted a more centralised approach (i.e. matching will happen via a server), despite the privacy concerns associated with holding all of the data in one place. The NHS is of the view that the centralised system will give it more insight into Covid-19's spread and, as a result, be much more effective. The German health authorities, on the other hand, who until recently favoured the centralised approach, have performed a U-turn and will rely on the decentralised CTF on the basis that, in their view, the centralised approach only works if changes are made to iPhone settings, which Apple is so far not willing to make. The UK and French authorities clearly think otherwise, so it will be interesting to see how this evolves.
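By way of illustration only, the sketch below (again in simplified Python, with hypothetical names that do not reflect the NHSX or Apple/Google implementations) shows where the matching step sits in each model: in the decentralised approach the diagnosis tokens come down to every phone and are matched locally, while in the centralised approach contact logs go up to a server that does the matching and therefore sees far more of the contact graph.

```python
def decentralised_check(local_contact_log: set, published_diagnosis_tokens: set) -> bool:
    """Decentralised model: diagnosis tokens are downloaded to every phone and
    compared against the contact log held on that phone. The server never
    learns who has been near whom - only the handset does."""
    return bool(local_contact_log & published_diagnosis_tokens)


def centralised_check(uploaded_contact_log: set, server_diagnosis_tokens: set) -> bool:
    """Centralised model: the contact log is uploaded and matched on a central
    server, giving the health authority a fuller picture of the virus's spread,
    at the cost of holding more data in one place."""
    return bool(uploaded_contact_log & server_diagnosis_tokens)
```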

What are the key data & privacy considerations?

These plans clearly involve the extensive collection and use of personal data, and it is critical that users of these apps have faith that their data will be processed safely, securely and fairly. Without this trust, the apps will not be used, and without mass adoption they will not generate meaningful results or help keep users safe. How these apps deal with data privacy will therefore be crucial to building and maintaining that user trust and confidence.

It is imperative that privacy by design is at the forefront of the mind of any app developer, and of any organisation intending to deploy such an app, at both the development and deployment stages. We set out below some key issues from a data protection perspective:

Security – one of the biggest concerns (both for GDPR compliance and for adoption generally) will be the security protocols adopted by the app. The EU Commission has even suggested that information security practices would need to be ‘state of the art’ – a tall order given the speed at which these apps need to be developed and implemented, but perhaps understandable given the sensitivity of the information and the trust that is required to ensure they are a success.

Data minimisation and purpose limitation – given that the data is highly sensitive, privacy campaigners have raised concerns over the potential for it to be used for other purposes, something Apple and Google’s CTF looks to address through its restrictions on what data is collected and how it can be used. The Commission’s guidance is also highly conscious of this, and it is something that governments and health authorities (and organisations) will need to bear in mind when deploying these apps.

Transparency – any app will need to meet a high standard of transparency on release in order to comply with articles 13 and 14 of the GDPR, and to encourage enough people to use it. This means that privacy notices will need to be comprehensive but at the same time easy to understand and drafted in a way that generates trust, not fear.

Legal basis – there are a few potential legal bases for processing this data, and these will likely depend on the nature of the data being collected and the manner of collection. The ICO, for example, understands that consent is likely to be the lawful basis for processing any personal data by contact tracing apps; however, it has also expressed concern about how this consent might be obtained, and there are questions about how functionality might be affected if consent were withdrawn. For this reason, organisations may look to rely on other legal bases, such as where processing is necessary under relevant member state law, or for reasons of public interest in the area of health. Despite these additional legal bases, the Commission’s guidance (as well as the ICO’s opinion) makes clear that individuals must “remain free to install the app or not and to share their data with health authorities.”

Responsibility for compliance – ultimately, it will be the named data controller of the personal data collected by the app who is responsible for the majority of GDPR compliance, including providing fair processing information and dealing with data subject rights requests. In reality, this means responsibility will lie with the app developers, e.g. the health authorities, governments or other organisations that release the specific apps (although, depending on how each app turns out, joint controllership with other organisations may be on the cards). Recent guidance from the European Commission supports this broad position.

Data protection impact assessments and ongoing due diligence – those responsible for GDPR compliance will likely need to undertake a formal risk assessment to identify areas of high-risk processing and mitigate those risks. These assessments will need to be kept under continuous review for the life of the app.

Engaging third parties – to the extent developers use third parties to develop and maintain the app, procedures and processes (as well as compliant contracts) should be put in place so that those third parties keep data privacy risks to a minimum. Clear reporting lines will be critical to ensuring any threats or issues are quickly addressed and their impact minimised.

What should you, as an organisation, be thinking about when considering the use of these apps within your workforce or otherwise to provide your services safely?

The rise of these apps is going to have some wider societal consequences, not least for organisations that require people to work in close proximity (e.g. in an office space or on a production set), or organisations hosting public events or providing spaces where the public can interact.

Whilst any organisation putting on a large public event, employing key workers or carrying out work with vulnerable people will have a particularly keen interest in the benefits these apps can offer, all organisations are going to want to ensure that their workplace, event or retail space is as safe as possible when lockdown restrictions are lifted, and these applications may well form a crucial part of how this is achieved.

Key questions from an organisation’s perspective are likely to be:

Can I require my workforce, event attendees or visitors to install the app?

Organisations that provide devices to their workforce can likely mandate that workers download the app (the device is, after all, owned by the employer), but what about those that do not provide devices? Can they force their workforce to download something onto their personal devices? This question is far more nuanced, and there will invariably be a difference between a BYOD (bring your own device) arrangement and one where workers do not need to use phones for their work at all (the latter might apply, for example, to large retail organisations where the potential for transmission is significant).

If a BYOD policy makes clear that an employer can require apps to be downloaded, there might be an argument that this can be mandated (albeit such a policy would invariably have been drafted with security apps and the like in mind, not pandemic response apps). However, where your workforce is made up of contractors (e.g. actors on a production set) who may not be subject to your BYOD policies, this argument may be more challenging to run, although you could of course include similar provisions in their engagement contracts.

It is difficult to see how organisations could require event attendees or site visitors to download an app unless such a requirement is made a condition of ticket purchase/entry, but this would no doubt have various consumer law implications.

Of course, the government could legislate to make downloading (and even using) the app mandatory, but that would perhaps be a step too far, even in these times.

Can I require my workforce, event attendees or visitors to use the app?

Even if organisations can force their workforce or event attendees to download the app, can they then make them use it?  There will be arguments on both sides as to whether this is necessary and proportionate in terms of protecting the health and safety of the workforce and/or visitors to offices, retail areas, event spaces, production sets etc. 

There are already many situations where an organisation will effectively require a person to disclose some data to a third party (having a mobile phone, email address and bank account is necessary for almost all jobs, and all require the workforce to provide data to third party data controllers – often more than one). However, the information processed by these contact tracing apps is far more sensitive, and there is clearly some conflict between these sorts of policies and the EU Commission’s position that use of the applications must remain voluntary (a view shared by the ICO and other EU authorities and data privacy regulators). There are of course still other avenues that organisations can potentially take – for example, temperature testing on entry to buildings, event spaces or production sets; relying on disclosure of symptoms, if present; and their own “mini” manual contact tracing within an organisation.

There is also the problem of enforcing compliance with such a policy, and the risks that come with checking personal devices.

Finally, from a pure employment perspective, if there are significant privacy issues with any applications, but an employer requires their use anyway, the possibility of constructive dismissal arguments also needs to be considered. In practice, much will likely depend on whether the apps are adopted as part of generally accepted health and safety practices as lockdown is lifted.

Can I require my workforce, event attendees or site visitors to tell me if they get a notification?

If an organisation collects such information, it will be a data controller and therefore would need to have a legal basis for this under the GDPR.

It is likely that most employing organisations have already been collecting data about whether their workforce have contracted Covid-19 (or had contact with confirmed cases, and other associated information), relying on the legal argument that this is necessary in order to comply with health and safety responsibilities and the employer’s common law duty of care to staff. Such arguments should be considered here from a workplace perspective – after all, if an employer knows that a worker has received a notification, it can then take appropriate steps to protect the rest of the workforce, such as making sure that the worker self-isolates.

The answer is less clear cut when it comes to requiring event attendees or other site visitors to provide such information, but arguably similar principles could be applied.

In all cases, however, these questions can only really be answered once we see in more detail what these apps will look like, and the specific circumstances around an organisation’s use of the app.

What else do I need to think about from a pure employment perspective when encouraging my workforce to use an app?

There will be corollary questions which fall outside the pure data arena, e.g. if an app tells an employee to self-isolate, do we still have to pay them? Can we discipline an employee who is told by an app to self-isolate and does not do so?

For more details on what employers should do when an employee is diagnosed with Covid-19, has been in contact with a confirmed case, or other commonly asked questions, please see our comprehensive FAQs here.

