As the government launches its data strategy for health and social care, a fine line must be trodden between innovating through privacy-enhancing technologies, and retaining data security for patients
Published: 23 Jun 2022
In his classic book Animal Farm, George Orwell wrote: “All animals are equal, but some animals are more equal than others.” A crude modern comparison might be: “All data should be secured, but some data needs securing more than others.” Medical data is a prime example of data that needs more robust security than most, for reasons that need little explanation.
Now that the UK government has announced its new Data Saves Lives strategy for health and social care, we must consider the security implications and risks it entails. In a nutshell, the policy aims to reform the health and social care sector, changing the way data is used to bring about breakthroughs and efficiencies, help tackle the Covid-19 backlog, and deliver a system fit for the future – a future in which patients will benefit from faster and more innovative treatment and diagnosis.
Interestingly, the main principles set out in the strategy are to improve “trust” in the health and care system’s use of data, and to ensure that health and social care professionals have the information they need to improve the overall patient experience and healthcare delivery. The aim of giving patients greater confidence that their personal information is safe will naturally cause some concern. The public has been informed that there will be secure data environments, primarily for the NHS, its various trusts and social care organisations, which will provide access to de-identified data for research purposes.
Within the wider policy paper, the government has confirmed that any data linked to an individual will never leave a secure server and will only be used for agreed research purposes. The NHS refers to this as a “trusted research environment” (TRE). The TRE service provides approved researchers from trusted organisations with timely and secure access to health and care data. Researchers are given access to their approved data – in accordance with their data-sharing agreements – enabling them to collaborate or link data, as well as share code and results within the same research projects.
In cyber security, this is what is known as privacy enhancing technologies (PETs). Although there is no single definition of PETs, the term is generally accepted as referring to technologies that embody fundamental data protection principles by maximising data security and empowering individuals, as well as minimising personal data use. In this case, PETs will allow the NHS, or other healthcare services, to protect the privacy of patient records, or personally identifiable information (PII), provided to – and handled by – services or applications.
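The strategy does not specify how de-identification will be performed, but a common building block is keyed pseudonymisation: replacing a direct identifier with a value that is stable for linkage yet irreversible without a secret held by the data controller. The sketch below is illustrative only – the function name, the sample NHS number and the key are all hypothetical, not taken from any NHS specification.

```python
import hmac
import hashlib

def pseudonymise(nhs_number: str, secret_key: bytes) -> str:
    """Replace a patient identifier with a keyed hash (HMAC-SHA256).

    Unlike a plain hash, the pseudonym cannot be reversed or brute-forced
    without the secret key, which never leaves the secure environment.
    The same input always maps to the same pseudonym, so records can
    still be linked across datasets for research.
    """
    return hmac.new(secret_key, nhs_number.encode(), hashlib.sha256).hexdigest()

# Hypothetical key and record, for illustration only
key = b"example-key-held-only-by-the-data-controller"
record = {"nhs_number": "9434765919", "diagnosis": "E11.9"}

# Researchers see a stable pseudonym, never the raw identifier
deidentified = {
    "patient_id": pseudonymise(record["nhs_number"], key),
    "diagnosis": record["diagnosis"],
}
```

Because HMAC is keyed, an attacker who obtains the de-identified dataset cannot simply hash every possible NHS number to reverse the pseudonyms – a known weakness of unkeyed hashing.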
Common examples of PETs include format-preserving and homomorphic encryption, secure multi-party computation and secret sharing, differential privacy and obfuscation techniques, and various means of anonymisation or pseudonymisation. PETs can also be divided into hard and soft varieties. Hard examples include onion routing, the secret ballot and VPNs, while soft examples include access control, differential privacy and tunnel encryption, including secure sockets layer (SSL) and transport layer security (TLS) privacy technologies.
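Of the techniques listed above, differential privacy is perhaps the easiest to illustrate. The classic Laplace mechanism adds calibrated random noise to a query answer so that no single individual's presence in the dataset can be inferred. The sketch below is a minimal, generic example – the cohort data and function names are invented for illustration, not drawn from any NHS system.

```python
import random

def laplace_sample(scale: float) -> float:
    # The difference of two i.i.d. exponential draws is Laplace-distributed
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(values, predicate, epsilon: float) -> float:
    """Differentially private counting query.

    A count changes by at most 1 when one person is added or removed
    (sensitivity 1), so adding Laplace(1/epsilon) noise to the true
    count satisfies epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_sample(1.0 / epsilon)

# Hypothetical de-identified cohort: how many patients are 60 or over?
ages = [34, 67, 45, 71, 29, 58, 63, 40]
noisy = dp_count(ages, lambda a: a >= 60, epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; researchers see answers that are statistically useful in aggregate while individual patients remain protected.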
There is little doubt that privacy-enhancing technologies such as homomorphic encryption will transform cloud security, on which healthcare providers will increasingly rely. Homomorphic encryption enables computation directly on encrypted data in a cloud environment, without the data ever being decrypted or the private key being exposed – it is commonly referred to as the “holy grail” of cloud security.
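To make the homomorphic property concrete, here is a toy version of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. This is a teaching sketch only – it uses tiny primes and Python's built-in modular arithmetic (Python 3.8+), whereas real deployments use 2048-bit-plus moduli and hardened libraries, and the NHS has not disclosed which scheme, if any, it will use.

```python
import math
import random

def keygen(p=293, q=433):
    """Toy Paillier key generation with small (insecure) primes."""
    n = p * q
    g = n + 1                      # standard simplification for g
    lam = (p - 1) * (q - 1)        # phi(n); works since gcd(phi(n), n) == 1
    mu = pow(lam, -1, n)           # modular inverse (Python 3.8+)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    while True:                    # pick random r coprime to n
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    x = pow(c, lam, n * n)
    return (((x - 1) // n) * mu) % n

pub, priv = keygen()
c1 = encrypt(pub, 42)
c2 = encrypt(pub, 17)
# Multiplying ciphertexts adds the underlying plaintexts:
c_sum = (c1 * c2) % (pub[0] ** 2)
assert decrypt(pub, priv, c_sum) == 59
```

A cloud service holding only `c1` and `c2` can compute `c_sum` – say, totalling encrypted patient statistics – without ever learning the values 42 and 17; only the key holder can decrypt the result.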
However, there is little technical information available about the actual technology beneath the secure data environment planned for the NHS. Like any technology, it can be implemented well or badly, with drastically different consequences. This matters in cyber security because attackers need only find the weakest link in a system to break in.
Modern computer systems are extremely complex, and there is the related issue of key management: a PET can be implemented perfectly, but if the key management around it is flawed, the whole scheme is broken. Another common criticism of PETs is that they can be complex to use. That complexity can lead to errors which, critically, could leak patient data – not to mention the difficulties it creates for auditing and compliance by health regulators and governments.
PETs are relatively new in IT and there have been outcries about industry giants rolling out such technology. In these cases, the concerns were that these companies wield disproportionate power via their vast troves of data resources. However, we must be conscious not to throw the baby out with the bathwater. These are interesting times, but the public has to hope that the NHS has the expertise to roll out the correct secure data environment. If not, once medical data is leaked, it can never be recovered.