The UK Department of Health and Social Care (“DHSC”) has published an important strategic document (‘Data Saves Lives: Reshaping Health and Social Care with Data’) outlining the government’s plans for the regulation and use of health data.
In this article, we take a look at some of the more exciting proposals outlined in the strategy and consider what they could mean for the future regulation of data and technology in UK healthcare.
Secure data environments
The NHS will step up its investment in and use of ‘secure data environments’ (sometimes referred to as ‘trusted research environments’). Simply put, these are specially designated secure servers where a third-party researcher’s access to health data can be properly controlled and monitored. These will become the default route for NHS organizations to provide access to their anonymised data for research and analysis. This creates opportunities for providers of secure data platforms and the privacy-enhancing technologies that these platforms depend on. It also highlights the need for companies working with the NHS to increase their own familiarity with and investment in secure data environments.
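The access pattern a secure data environment enforces can be illustrated with a small sketch: the underlying records never leave the environment, only aggregate results are released, and every query is logged for audit. The class, method names, and cohort threshold below are illustrative assumptions, not part of any NHS specification.

```python
from datetime import datetime, timezone

# Illustrative sketch of a secure data environment's access pattern:
# raw records stay inside the object, only aggregate results are
# released, small cohorts are suppressed, and every query is audited.

class SecureDataEnvironment:
    MIN_COHORT = 5  # suppress results drawn from too few records

    def __init__(self, records):
        self._records = records      # data never leaves this object
        self.audit_log = []

    def aggregate(self, researcher: str, predicate):
        """Return a count of records matching `predicate`, or None if
        the cohort is too small to release safely."""
        matches = sum(1 for r in self._records if predicate(r))
        released = matches if matches >= self.MIN_COHORT else None
        self.audit_log.append((datetime.now(timezone.utc), researcher, released))
        return released

sde = SecureDataEnvironment([{"age": a} for a in (34, 41, 52, 58, 63, 67, 71)])
print(sde.aggregate("dr_smith", lambda r: r["age"] >= 50))  # 5
print(sde.aggregate("dr_smith", lambda r: r["age"] >= 70))  # None (suppressed)
```

Real secure data environments layer far more on top of this (approved tooling, output checking, accreditation), but the core idea is the same: analysis comes to the data, not the other way around.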
Secure data environments are a hot topic in data circles. For example, they also feature in the EU’s new Data Governance Act, which creates a framework for “data intermediation services”, i.e. services that provide a secure environment in which businesses or individuals can share data.
Level playing field for data partnerships
The strategy also contains proposals for the data sharing agreements that NHS bodies use when providing access to health data. Intended to address public concerns about data-sharing partnerships with the private sector, the government will:
- Require that data sharing agreements embody 5 fundamental principles (e.g. any use of NHS data not available in the public domain must have an explicit aim of improving the health, wellbeing or care of patients in the NHS, or the operation of the NHS, and any data sharing agreements must be communicated transparently and clearly to the public).
- Develop business principles to ensure that data access partnerships contain appropriate contractual safeguards. This will lead to a review and likely update of NHS Digital’s Model Data Sharing and Data Access Agreements by December 2023.
Organizations accessing NHS datasets are therefore likely to see changes to the contractual terms on which such access is provided, along with closer scrutiny of the overall arrangement to ensure adherence to principles designed to encourage public trust in such arrangements.
Trust and transparency
On a similar theme, the strategy contains a series of other proposals aimed at improving public confidence in the use of health data.
Along with investing in secure data environments, the government is also publicly committing to increase investment in a wider range of privacy-enhancing technologies (or “PETs”), such as homomorphic encryption (a technology that enables functions to be performed on encrypted data without ever having to decrypt it) and synthetic data (artificially fabricated data that closely mimics real-world data, but without the privacy implications). The ICO has written in favor of some of these technologies in its updated draft anonymization guidance, and there appears to be a concerted push towards adopting technical solutions to privacy issues in an increasingly data-driven world.
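As a rough illustration of what homomorphic encryption makes possible, the toy Paillier scheme below adds two numbers while they remain encrypted. This is a sketch only: the key sizes are deliberately tiny, and a real deployment would use a vetted cryptographic library rather than hand-rolled code.

```python
import math
import secrets

# Toy Paillier cryptosystem: additively homomorphic, i.e. multiplying two
# ciphertexts yields an encryption of the *sum* of the plaintexts.
# The fixed primes below are absurdly small -- for illustration only.

p, q = 2357, 2551
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)   # inverse of L(g^lam mod n^2)

def encrypt(m: int) -> int:
    while True:                                # random blinding factor r in Z_n*
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Add two values without ever decrypting them:
total = decrypt((encrypt(20) * encrypt(22)) % n2)
print(total)  # 42
```

Paillier supports only addition on ciphertexts; fully homomorphic schemes, which support arbitrary computation, are what make the research scenarios the strategy describes possible.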
The government also plans to further improve transparency and public understanding of how it uses health data (public confusion surrounding changes to the national data opt-out programme in 2021 is accepted as an example of the kind of failure that the government wishes to avoid in future). Developments on this front will include a “data pact” (a high-level charter setting out key safeguards to the public on the fair use of health data) and an online hub, with a transparency statement explaining how publicly held health and care data are used in practice.
Improving access to health data
In addition to emphasizing public trust and transparency, the strategy also aims to promote better access to health data for the benefit of the public. This is a theme that has taken center stage internationally following the Covid-19 pandemic: a renewed appreciation of the importance of health data for research and development purposes has driven demand for the removal of unnecessary barriers to accessing and combining datasets for those purposes.
The government plans to do this in part through major investments (up to £200m) in NHS data infrastructure to make research-ready data available to researchers. DHSC envisions a “dynamic hub of genomics, imaging, pathology, and citizen-generated data, where AI-enabled tools and technologies can be deployed.”
On the legislative side, this part of the strategy is also likely to be supported by the Government’s impending Data Reform Bill, which, among other things, will make changes to the research provisions of UK data protection law in order, for example, to provide a clearer definition of scientific research, a broader form of consent where consent is used as a legal basis for research, and a more concrete exemption from privacy notice requirements where data is reused for scientific research purposes. All of these changes are expressly intended to promote greater use of personal data, including health data, for responsible research purposes.
There are strong parallels here with EU proposals for a European Health Data Space, which will promote access to electronic health data for secondary purposes.
Encourage innovation in AI
No data strategy in 2022 would be complete without considering artificial intelligence (AI). On this front, DHSC:
- Commits to working with the Office for AI (OAI) on its plans to develop AI regulation in the UK. The OAI’s white paper on AI governance and regulation is imminently expected and will be closely scrutinized as the UK’s response to the EU’s proposed AI Act. The healthcare sector is one of the most sensitive and important in an AI context, and NHS work in this area will be led by the newly created NHS AI Lab.
- Will develop unified standards for testing the efficacy and safety of AI systems, in collaboration with the Medicines and Healthcare products Regulatory Agency (MHRA) and the National Institute for Health and Care Excellence (NICE). Safety standards that can be used by development teams building AI systems are an important part of the regulatory framework for safe AI, and this will likely be a welcome step.
- Will develop, through the NHS AI Lab, a methodology to assess the safety of AI-enabled health products once authorized on the market.
In summary, the strategy contains an ambitious set of proposals aimed at cementing the UK’s position as a global leader in health informatics and data-driven health research. Notably, they are clearly designed to balance and reconcile competing demands for increased access to and use of health data, with the protection of trust, privacy and security of that data.