
Could a timely PIA have helped save the Royal Free from falling foul of data protection laws?

07 July 2017

For many organisations, the acronym PIA represents a Painfully Inconvenient Ask (if not something far less polite). But Privacy Impact Assessments are set to be a key way of meeting the General Data Protection Regulation’s accountability requirement, and their importance is illustrated by the outcome of a recent investigation by the Information Commissioner’s Office.

The Information Commissioner’s Office has, after a year-long investigation, held that the Royal Free London NHS Foundation Trust failed to comply with data protection laws when carrying out clinical safety testing of an AI-powered app, Streams. The testing phase involved Google’s artificial intelligence subsidiary, DeepMind, processing the sensitive medical data of some 1.6 million patients.

The development of Streams was laudable: it aimed to facilitate the early detection of acute kidney injury, a life-threatening condition. The app also appears to have been well regarded by the doctors and nurses using it, and successful outcomes were reported. So what was not to like?

While recognising the benefits to society of such solutions and emphasising that she did not want to stifle innovation, the Information Commissioner highlighted the following shortcomings in relation to the clinical safety testing phase:

  • there was a lack of fairness and transparency: patients presenting at the Royal Free for treatment would not reasonably have expected their data to be accessed by DeepMind to develop a new app, nor were they adequately informed of that processing.
  • it was likely (in her opinion, having consulted with the National Data Guardian) that the processing amounted to a breach of confidence.
  • the Royal Free provided no evidence of a legal basis for the processing that would have removed the need to obtain the informed consent of patients.
  • the Royal Free did not demonstrate that it was necessary and proportionate to process records relating to 1.6 million patients to test the clinical safety of the app.
  • since patients were not fully aware of the use of their personal data for the testing phase, they could not object to the processing.
  • the written agreement between the Royal Free and DeepMind contained deficiencies, and no privacy impact assessment had been carried out before the project started (though it appears that one was conducted subsequently).

The outcome is that, rather than serving an enforcement notice, the ICO sought an undertaking from the Royal Free confirming its commitment to make good the shortcomings identified. Measures stipulated in the undertaking include conducting a (further) privacy impact assessment and commissioning a third-party audit of the Royal Free’s processing arrangements with DeepMind.

Comment

A feature of big data analytics is its ability to analyse data collected for one purpose for entirely different purposes. Innovative solutions leverage that ability and often look to find new uses for personal data. This can make it difficult for organisations to foresee at the outset all the uses they might make of the data they collect.

In an ever faster-moving world where the use of big data, artificial intelligence and machine learning is unlocking huge potential, not just in healthcare but across all sectors, this investigation signals the importance attached to privacy impact assessments and the expectation that they be used, especially for high-risk activities such as profiling or the large-scale processing of sensitive personal data. If they are not already a key tool in organisations’ compliance toolkits, they should be.

Had a privacy impact assessment been carried out before the start of the processing complained of, many of the shortcomings highlighted by the ICO might well have been flagged and addressed. Hence the importance of carrying out privacy impact assessments as early as possible when developing innovative solutions, and again in subsequent iterations.

Privacy isn’t a zero-sum game and, as the Information Commissioner was at pains to emphasise, shouldn’t come at the price of innovation. Considering privacy from the start, and building it into the design of innovative solutions, should help organisations achieve the right balance.
