Review agrees that DeepMind-NHS deal "lacked clarity"

However, independent panel finds that the NHS is ultimately responsible for any legal breach

An independent review panel has agreed there was a "lack of clarity" in the initial data-sharing agreement between DeepMind and the Royal Free Hospital that saw 1.6 million people's data shared without their consent.

However, the panel said there was no evidence of a breach of contractual obligations or data-sharing agreements, in contrast to the UK's data watchdog, the Information Commissioner's Office (ICO), which deemed the original deal illegal.

The independent report, commissioned by DeepMind Health (DMH) following the ICO's own investigation, which was published yesterday, expressed concern that the specific scope of the original information-sharing agreement with the Royal Free was not fully explained. However, it recognised that DMH has since taken steps to correct this.

An investigation by the ICO revealed several shortcomings in how the data was handled, particularly that patients were not adequately informed of how the data would be used. As a result, the authority believes the initial agreement broke data protection laws.

The panel, made up of industry leaders and academics, concluded that DMH acted only as a data processor, and that "any issues about data protection obligations or confidentiality obligations arising from the use of patient data during testing are in law, matters for the Royal Free as data controller, and we will not comment further on them".

While the report does not contradict the ICO's findings, it said the shortcomings in informing patients lay squarely with the Royal Free, and that there was "no evidence that DMH had violated the data sharing agreement or any other contractual arrangements".

Under current law, only the data controller is liable for breaches of data protection regulations. However, this will change when the GDPR comes into force next year, as both the processor and the controller will be jointly liable.

DMH has said it welcomes both the ICO investigation and the independent review, and acknowledges "we should have done more to engage with patients much earlier".

A number of other, smaller criticisms were raised in the report, such as a handful of minor vulnerabilities in DMH's network and the logistics of "parachuting in" the Streams app, which relied on the huge volume of data to learn how to detect and diagnose acute kidney injury. DeepMind said these issues have already been addressed.

DeepMind was also criticised for a lack of public engagement during the tests, as it failed to adequately refute claims spread in the media that patient data would be shared with its then-parent company, Google. If true, this would have breached contractual obligations, but there is "no evidence that DMH had any intention of doing this", according to the report.

The nine-person panel included Mike Bracken, CDO of the Co-operative Group and former CDO for the UK government, Eileen Burbidge, partner at investment firm Passion Capital, and Professor Donal O'Donoghue, medical director for Greater Manchester Academic Health Science Network.

03/07/2017: ICO: DeepMind-NHS deal broke data laws

The Royal Free NHS Foundation Trust failed to comply with data protection law when it provided patients' details to Google AI firm DeepMind, according to the Information Commissioner's Office (ICO).

As part of a trial to test an alert, diagnosis and detection system for acute kidney injury, the trust provided the personal data of around 1.6 million patients to DeepMind in September 2015, a deal revealed seven months later by New Scientist.

However, an investigation by the ICO discovered several shortcomings in how the data was handled, including that patients were not adequately informed that their data would be used as part of the test.

DeepMind had a deal with the Royal Free under which it would process partial patient records containing personally identifiable information (PII) held by the trust.

The PII in question included data on people who had presented for treatment in the previous five years, together with data from the trust's existing radiology electronic patient record system. Under the terms of the agreement, DeepMind would process approximately 1.6 million such partial records for clinical safety testing.

But Information Commissioner Elizabeth Denham determined that these records were processed for the purpose of clinical safety testing without patients being informed of this processing.

"The Commissioner was not satisfied that the Royal Free had properly evidenced a condition for processing that would otherwise remove the need to obtain the informed consent of the patients involved and our concerns in this regard remain," the ICO said in a letter to the trust.

It added that the mechanisms to inform those patients that their data would be used in the clinical safety testing of the Streams application were inadequate.

"In short, the evidence presented to date leads the Commissioner to conclude that data subjects were not adequately informed that the processing was taking place and that as result, the processing was neither fair nor transparent," said the ICO.

"Patients would not have reasonably expected their information to have been used in this way, and the trust could and should have been far more transparent with patients as to what was happening," Denham said in a statement.

"We've asked the trust to commit to making changes that will address those shortcomings, and their co-operation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people's data is being used."

The ICO won't fine the NHS trust or DeepMind. Instead, the trust has been asked to establish a proper legal basis under the Data Protection Act for the DeepMind project and for any future trials, and to set out how it will comply with its duty of confidence to patients in any future trial involving personal data.

It must also complete a privacy impact assessment, including specific steps to ensure transparency, and commission an audit of the trial, the results of which will be shared with the Information Commissioner, who will have the right to publish them as she sees appropriate.

"We welcome the ICO's thoughtful resolution of this case, which we hope will guarantee the ongoing safe and legal handling of patient data for Streams," said DeepMind co-founder Mustafa Suleyman, and DeepMind Health clinical lead, Dominic King.

"In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health," they added.

"We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong, and we need to do better."

The pair said DeepMind has published all NHS contracts since its "mistake" in failing to publicise the Streams contract, which it also replaced with "a far more comprehensive contract" in November 2016.

DeepMind said it's also since developed a patient and public engagement strategy, and is currently awaiting the published findings of nine "independent reviewers" it tasked with scrutinising DeepMind Health.
