Corporate candour and public sector cant

The UK Information Commissioner has completed her investigation into the deal between Google DeepMind and the Royal Free NHS Foundation Trust, which gave the company access to the health records of 1.6m NHS patients. The Commissioner concluded that:

Royal Free NHS Foundation Trust failed to comply with the Data Protection Act when it provided patient details to Google DeepMind.

The Trust provided personal data of around 1.6 million patients as part of a trial to test an alert, diagnosis and detection system for acute kidney injury.

But an ICO investigation found several shortcomings in how the data was handled, including that patients were not adequately informed that their data would be used as part of the test.

The Trust has been asked to commit to changes ensuring it is acting in line with the law by signing an undertaking.

My Cambridge colleague Julia Powles (now at Cornell) and Hal Hodson of the Economist conducted a long and thorough investigation of this secret deal (using conventional investigative tools like Freedom of Information requests). It led to an excellent, peer-reviewed article, “Google DeepMind and healthcare in an age of algorithms”, published in the Springer journal Health and Technology in March. In the period up to and following publication, the authors were subjected to pretty fierce pushback from DeepMind. It was asserted, for example, that their article contained significant factual errors, but requests for details of these supposed ‘errors’ were not granted. As an observer of this corporate behaviour I was struck, and puzzled, by the divergence between DeepMind’s high-minded, holier-than-thou corporate self-image and its aggressiveness in public controversy. And I wondered whether this was a sign that Google iron had entered DeepMind’s soul. (The company was acquired by the search giant in 2014.)

But now all is sweetness and light, apparently. At any rate, DeepMind’s co-founder Mustafa Suleyman and Dominic King, the Clinical Lead in DeepMind Health, have this morning published a contrite post on the company blog. “We welcome the ICO’s thoughtful resolution of this case”, they write, “which we hope will guarantee the ongoing safe and legal handling of patient data for Streams [the codename for the collaboration between the company and the NHS Trust]”.

Although today’s findings are about the Royal Free, we need to reflect on our own actions too. In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health. We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong, and we need to do better.

This is an intelligent and welcome response. Admitting to mistakes is the surest way to learn. But it’s amazing how few corporations and other organisations do it.

When I first read the draft of Julia’s and Hal’s paper, my reaction was that the record of errors they had uncovered was not the product of malign intent, but rather a symptom of what happens when two groups of enthusiasts (consultants at the Royal Free; AI geeks at DeepMind) become excited by the potential of machine learning in detecting and treating particular diseases. Each group was unduly overawed by the other, and in their determination to get this exciting partnership rolling they ignored (or perhaps were unaware of) the tedious hurdles that one (rightly) has to surmount if one seeks to use patient data for research. And once they had been caught out, defensive corporate instincts took over, preventing an intelligent response to the researchers’ challenge.

Interestingly, there are intimations of this in today’s DeepMind blog post. For example:

“Our initial legal agreement with the Royal Free in 2015 could have been much more detailed about the specific project underway, as well as the rules we had agreed to follow in handling patient information. We and the Royal Free replaced it in 2016 with a far more comprehensive contract … and we’ve signed similarly strong agreements with other NHS Trusts using Streams.”

“We made a mistake in not publicising our work when it first began in 2015, so we’ve proactively announced and published the contracts for our subsequent NHS partnerships.”

“In our initial rush to collaborate with nurses and doctors to create products that addressed clinical need, we didn’t do enough to make patients and the public aware of our work or invite them to challenge and shape our priorities.”

All good stuff. Now let’s see if they deliver on it.

Their NHS partners, however, are much less contrite, even though they are the focus of the Information Commissioner’s report. The Trust’s mealy-mouthed response says, in part:

“We have co-operated fully with the ICO’s investigation which began in May 2016 and it is helpful to receive some guidance on the issue about how patient information can be processed to test new technology. We also welcome the decision of the Department of Health to publish updated guidance for the wider NHS in the near future.”

This is pure cant. The Trust broke the law. So to say that “we have co-operated fully” and “it is helpful to receive some guidance on the issue about how patient information can be processed” is like a burglar claiming credit for co-operating with the cops and expressing gratitude for their advice on how to break and enter legally next time.