DeepMind report fails to justify NHS use, claim privacy campaigners

A report that claims Google DeepMind did not break the law in its use of NHS patient data has failed to address the company's breach of UK privacy laws, campaigners have warned.

The independent review panel released its findings this week after the Information Commissioner’s Office (ICO) ruled the Royal Free NHS Foundation Trust breached the Data Protection Act when it provided DeepMind with the personal data of around 1.6 million patients. Their partnership was established to trial an app called Streams that provides clinicians with alerts about acute kidney injury.

The use of data in the project is described in the report as "the most important question faced by the Independent Reviewers". Peter Wainman, a technology lawyer and data protection specialist at the firm Mills & Reeve, was commissioned to advise the panel on the legality of DeepMind’s actions.

"Our legal advice found that DMH [DeepMind] had acted only as a data processor on behalf of the Royal Free, which has remained the data controller," the report states.

"It found no evidence that DMH had violated the data sharing agreement or any other contractual arrangements with the Royal Free. It found no evidence to suggest that DMH has breached confidence."

This classification makes the Royal Free liable for the breach, as the collection of information falls under the responsibilities of the data controller. DeepMind may, however, have been liable under the terms of the General Data Protection Regulation (GDPR), which comes into effect across the EU in May 2018 and places direct obligations on data processors as well as controllers.


The limited criticisms of DeepMind have raised the ire of privacy campaigners. The report failed to hold DeepMind accountable for its unlawful data processing or to fully investigate the company's more questionable actions, campaign group medConfidential warned.

"Google DeepMind continues to receive excessive amounts of data in breach of four principles of the Data Protection Act, and the Independent Reviewers didn’t think this worth a mention," said medConfidential Coordinator Phil Booth.

"DeepMind did something solely because they thought it might be a good idea, ignorant of the law, and are now incapable of admitting that this project has unresolvable flaws. The ICO has forced both parties to fix them within weeks having ignored them for approaching two years.

"DeepMind Health needs real senior management with experience of caring for patients, i.e. a Regulated Medical Professional, as a Chief Medical Officer. The second paragraph on the inside front cover (which isn’t even a numbered page in the printed document, but page 2 in the PDF) shows how badly they have failed from the start."

DeepMind review panel vulnerabilities

DeepMind established the panel in 2016 after a report in New Scientist revealed the extent of the patient data that was being provided to the company.

The panel of nine reviewers is chaired by former MP Dr Julian Huppert, and was given full access to DeepMind Health with no non-disclosure agreement and a budget of £50,000, which the company later agreed to increase to £59,315. They are reimbursed expenses but receive no salary.

Their principal concerns around DeepMind Health were inadequate public engagement and a lack of clarity in the original information sharing agreement with the Royal Free Hospital.

A total of 11 vulnerabilities were identified, none of which were deemed critical or high-level. A single medium-level issue was revealed, which the report states "should be addressed but [is] not thought to present an immediate threat to the environment or data handled by it".

The Google subsidiary was also advised to make more effort to clarify its links to its parent company, which acquired DeepMind in 2014. The contractual agreements signed with the NHS prohibit DeepMind from combining patient information with other data held by Google, but the company was told it should have done more to allay public concerns around privacy and data protection.

"'Good enough' is not good enough for a company linked so closely to Google, a company that already reaches into every corner of our lives," said Dr Julian Huppert in his foreword to the report.

"Even if untrue, the perception that Google might acquire highly sensitive health records, in addition to data already held by them about an individual, makes some feel uncomfortable no matter what the benefits might be. We believe that it is right that DeepMind Health should be held to higher standards, even if that means they are singled out as a lightning rod for public concerns."


Concerns were also raised over some of the unintended consequences of using AI in healthcare that apply to other screening methodologies but could escalate under DeepMind projects, particularly the retinal scan trial at Moorfields.

These include patients receiving unanticipated results about different conditions, which could influence both their own care and the allocation of NHS resources more generally. The effects could include overdiagnosis due to false positives, the detection of conditions without treatments, and the release of data that could challenge NHS policies.

The report praised the general vision of DeepMind, and "their desire to ensure that anything their expertise is applied to, meets the highest ethical and social purposes," but recommended that it develop clearer principles to guide its projects. It also called for earlier engagement with both the public and medical professionals in order to understand the potential implications of AI.

“We would encourage scenario work with clinicians and the wider public to try and tease out what these might be and to generate other perspectives and ideas,” it states.

The development of Streams and the understanding of user interfaces it demonstrates are commended, but concerns are expressed over the broader implications and the possibility of the app being 'parachuted in' to other hospitals.

DeepMind response

In a written response to the report, DeepMind Health acknowledged that it should have done more to engage with patients at an earlier stage, and that its initial legal agreement with the Royal Free should have been more detailed. It pledged to continue to publish all its NHS contracts, and to support other groups developing healthcare technology.

The company is holding two patient consultation events in July 2017 and has committed to better explain its work and invite feedback.

It also said that later this year it would log all access to patient data, reduce manual code and report on all data access required by Trusts, as steps towards its objective of a Verifiable Data Audit.

"The Independent Reviewers have expressed concerns about the lack of clarity in our original Information Sharing Agreement with the Royal Free, although they say this has since been corrected," the response states. "They also believe that we should hold ourselves to a higher standard than other organisations."

The panel will publish a second report next year reviewing the issues it has raised. Its future activities were also addressed in DeepMind's response.

"We think it is the right time to draw up and agree a written Terms of Reference for the Panel, including clear approaches to the recruitment of Reviewers, their terms of office, ensuring the relevance and diversity of their skills sets, and to how the Panel will be chaired," DeepMind added.

It intends to agree on the Terms of Reference by October 2017, and will offer reviewers a modest honorarium, pegged at the standard level of an NHS Non-Executive Director, currently £6,157 per year.


Copyright © 2017 IDG Communications, Inc.
