Sharing health data should require patient consent • The Register


Register Debate This week’s Register Debate considered the motion: Presumed consent is the right approach for sharing healthcare patients’ data beyond their direct care. The results are in and, as you can see, we clearly have a winner.


There may be more intimate forms of personal data than our health records, but those are usually generated only as a result of our own choices.

We all generate medical data, however, and sharing it beyond the needs of our “immediate care” – for example with the researchers, governments and commercial organizations that want access to it – is something that concerns us all. So the proposition that “presumed consent is the right approach for sharing patients’ data beyond their direct care” was always going to provoke a strong reaction among healthcare professionals and Reg readers, many of whom have intimate knowledge of how data can flow to unexpected destinations.

Let’s recap how the debate unfolded.

Dr Katherine Hanks, a general practitioner in Australia, was first into the arena in support of the proposal. She reminded us that GPs are familiar with issues of consent and ethics, and that while, of course, “the privacy of individuals must be vigorously defended, this does not necessarily mean that health data cannot be securely aggregated and anonymized to advance medical and social research.”

And she added: “It is important to remember that presumed consent is still informed consent: patients are informed that they are presumed to have consented to their data being shared for use in meta-analysis and, should they wish to withdraw, how to do so. Presumed consent does not override personal rights; it simply creates a presumption in favor of a public good.” In the end, “in terms of public health, we must lean towards collective benefits because, ultimately, it is individuals who will reap them.”

First commentator out of the traps was Little mouse, who was heavily upvoted for saying: “Unfortunately, I just don’t trust those responsible for processing my data to use it for the common good. ‘Presumed consent’, as I understand it, means giving your consent for your records to be shared and sold to roughly anybody who wants them.”

Flocke Kroes suggested a practical alternative: “Don’t share the data at all. Store it on an air-gapped system. Run the queries on that system and return a graph of the number of sick people against age, or a low-resolution heat map of disease incidence. The UK government (blue or red) takes every opportunity to become even more untrustworthy. This type of project should be put on hold at least until they grow up.”
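For what it’s worth, a minimal sketch of that aggregate-only idea might look like the following. The field names and age-band width are our own assumptions, not anything from a real NHS schema, and the point is simply that only the summary ever leaves the machine:

# Illustrative sketch of the commenter's suggestion: queries run where the
# data lives, and only aggregates (case counts per age band) come back out.
# All field names and the band width below are invented for this example.
from collections import Counter

def cases_by_age_band(records, band_width=10):
    """Count diagnosed patients per age band, discarding all identifiers."""
    counts = Counter()
    for r in records:
        if r["diagnosed"]:
            band = (r["age"] // band_width) * band_width
            counts[f"{band}-{band + band_width - 1}"] += 1
    return dict(counts)

# Only this aggregate leaves the air-gapped system, never the raw records.
print(cases_by_age_band([
    {"age": 34, "diagnosed": True},
    {"age": 37, "diagnosed": False},
    {"age": 71, "diagnosed": True},
]))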

All of which sparked a heated sub-debate on capitalism versus communism – if you have something valuable, isn’t it a moral duty to charge as much as possible for it?

Naturally, things also wandered off into organ donor consent. And fuel shortages. As well as previous health data blunders. All of it relevant, if you care to read the comments.

There were a fair few supporters of the proposal. Chris Evans pointed out: “I’m almost done with prostate cancer treatment (they say the treatment should ‘cure’ me). I had a friend (half my age) who died of cancer last week, leaving a wife and two young children. If my medical history could help others, I would be more than happy. They have to make sure the safeguards are strong, and there will probably be breaches, but to help my fellow citizens, it seems obvious to me.”

Veteran privacy activist Phil Booth weighed in on Tuesday, arguing that “presuming consent for non-medical uses of your health information is not like implied consent for your own care.”

It certainly doesn’t mean “also handing over your most sensitive health information to marketers flogging products to anyone across the NHS (not just to you).” Nor does it “automatically include you in experiments without your knowledge or permission, whether about what sort of treatment you or others receive, how good or bad it is – or, as is more and more the case, to train AIs or develop mutant algorithms.” And on a purely practical level in the UK, Phil explained, the promised safeguards have yet to be delivered.

How did that land with readers? Well, Sorry, you cannot reuse an old handle drew a lot of love for asking: “Why does consent even have to be presumed? Because the government knows full well that express consent would rarely be given, so it uses the oxymoron of ‘presumed consent’. Newsflash: if it is presumed, it is not consent.”

Getting very practical, Jmch pointed out: “Anyone who works with datasets knows that anonymized data can easily be de-anonymized. The higher the level of detail in a data set, the easier it is to find unique points that can be traced back to individuals.”

And Citizen of nowhere added: “This. And the fact that datasets can be combined – and once they are, what appeared ‘securely’ anonymized in just one of them may not remain so after the data is combined.”
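For readers who haven’t seen it done, here is a toy illustration of the linkage risk those two comments describe: two datasets that each look “anonymous” can re-identify people when joined on shared quasi-identifiers. Every name, field and value below is invented for the example:

# Toy linkage attack: join an "anonymized" health extract to a public list
# (e.g. an electoral roll or marketing database) on quasi-identifiers.
health_records = [  # no names, just quasi-identifiers plus a diagnosis
    {"postcode": "SW1A 1AA", "birth_year": 1980, "sex": "F", "diagnosis": "type 1 diabetes"},
    {"postcode": "EC1A 1BB", "birth_year": 1975, "sex": "M", "diagnosis": "prostate cancer"},
]
public_register = [  # a separate list that does carry names
    {"name": "Jane Example", "postcode": "SW1A 1AA", "birth_year": 1980, "sex": "F"},
    {"name": "John Example", "postcode": "EC1A 1BB", "birth_year": 1975, "sex": "M"},
]

def link(records, register, keys=("postcode", "birth_year", "sex")):
    """Match health records to named people via shared quasi-identifiers."""
    index = {tuple(p[k] for k in keys): p["name"] for p in register}
    return [(index.get(tuple(r[k] for k in keys)), r["diagnosis"]) for r in records]

print(link(health_records, public_register))
# [('Jane Example', 'type 1 diabetes'), ('John Example', 'prostate cancer')]

The more detail a record carries, the fewer people share its combination of quasi-identifiers, which is exactly Jmch’s point.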

In a completely different register, ibmalone took issue with Phil’s use of the term “mutant algorithm,” suggesting it actually lets policymakers off the hook: “Whatever algorithm is in question, it didn’t crawl out of the sea in a 1950s B movie.”

But are there any technical reasons to question at least the current UK setup?

An Anonymous Coward said they work in the NHS and are responsible for passing patient data to physician-run non-profits that provide anonymized data sets for future rare-disease studies: “If you have a complex and rare disease, you should assume that your data has been collected for storage in a research registry.”

And how do those living with a rare disease feel? Step forward our third contributor, Dominic Nutt, a patient advocate and health activist specializing in medical innovation.

He stressed that society at large is happy to hand over personal data for questionable rewards – while accepting that this generalization does not apply to the Reg readership.

Specifically, he argued: “I am a type 1 diabetic. My antibodies attacked cells in my pancreas. I was also diagnosed with a rare cancer which, if – or more likely when – it returns, will be incurable.” Sharing data, he said, “will change the way research – currently based on the diminishing returns of randomized clinical trials – is carried out.”

Dom even shed light on tech hacking in the diabetic community, pointing out: “We combine our insulin pumps, which have Bluetooth functionality, with our continuous glucose monitoring (CGM) systems … [And] by working together and sharing data, we … have come up with a hack in which our CGMs automatically talk to our insulin pumps and adjust our doses for us, leaving us free to carry on as usual without having to intervene every five minutes.”
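To give a flavor of the closed-loop idea he describes, here is a deliberately toy sketch of one decision step. The thresholds, units and action names are entirely made up for illustration and bear no relation to real, safety-critical dosing logic:

# Toy closed-loop step: map a CGM reading to a pump action.
# Values and action names are hypothetical; this is not real dosing logic.
def loop_step(glucose_mmol, target=6.0, band=2.0):
    """One pass of the loop: decide what the pump should do next."""
    if glucose_mmol > target + band:
        return "increase_basal"   # running high: deliver a little more insulin
    if glucose_mmol < target - band:
        return "suspend_basal"    # running low: back off
    return "keep_basal"           # in range: carry on as usual

print(loop_step(9.5), loop_step(3.8), loop_step(6.1))

In the real community projects, a step like this runs automatically every few minutes, which is what frees the wearer from constant manual intervention.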

An Anonymous Coward said they would no doubt feel the same as Dominic if they were in his position: “But with respect, that’s not the subject of the debate … for any purpose if you fail to opt out.”

And Alain Guillaume stressed: “What is the cost, and to whom? … For someone who illegally re-identifies data which is then sold, there is money to be made but little cost if the act is discovered; perhaps at worst a fine for their company. If there were significant personal fines, then someone re-identifying the data might not do so. Part of the problem is that most re-identification happens behind company doors.”

vtcoder, a type 2 diabetic, took a broader view, expressing sympathy for Dominic’s situation but adding: “Should this be available to bastard marketers (is there any other type of marketer?) Damn, no. And speaking only for myself, I’m in favor of jail time – a lot of it – for the kinds of marketers who will inevitably try to pierce the veil of anonymity.”

ST added cryptically: “Here comes the fear argument: if you do not provide us with your patient data, then they will have to cut off your arms and legs. Unfortunately, the fear argument still works.”

It fell to yours truly to oppose the proposal on Thursday, suggesting that while presumed consent works in theory, it doesn’t in practice. At least not yet. While the UK’s NHS has set a great example to the world in many areas, it has still failed to make a compelling case for why we should trust it to handle our data securely. Looking further afield, the United States’ biometric program in Afghanistan offers an abject lesson in how technology policies can be overtaken by events on the ground. So, since debates don’t usually come with a “not yet” option, the answer had to be no.

But that wasn’t a strong enough no for many readers.

Citizen of nowhere said it was “difficult (for me at least) to understand how someone who specifically points to the potentially deadly results of data on/about Afghans, collected under the previous regime, falling into the hands of the Taliban can still come to the conclusion ‘not yet’ rather than ‘not ever’ in response to the question asked.” Some objections were put, let’s say, rather more forcefully.

But frank as the reactions to all our contributors were, they were also generally well informed, offering plenty more food for thought. One might just speculate that those broadly in favor of the proposal were more likely to have personal experience of illness or health problems. But, of course, that’s not the kind of data we capture at The Register.

What we can show is that, in the end, the medical arguments failed to outweigh the privacy and confidentiality arguments. The Reg readership voted overwhelmingly against the proposal. ®

