Will there be a new data protection offence for the UK beyond GDPR?

September 08, 2017


In early August, the UK Government published its Statement of Intent in relation to a proposed Data Protection Bill. PwC’s initial analysis, which can be found here, notes that the proposed Bill largely reflects the General Data Protection Regulation (GDPR).

Notably, the Statement of Intent does diverge from the GDPR in at least one area: it sets out a proposed new offence of:

‘intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data. Offenders who knowingly handle or process such data will also be guilty of an offence. The maximum penalty would be an unlimited fine.’

The proposal is striking for a number of reasons:

  1. It unsettles the distinction between anonymised and pseudonymised data

The wording of the proposed offence – as currently drafted – opens up the possibility that anonymised data could be manipulated to re-identify individuals.

Under the GDPR, anonymised data is data where the data subject is no longer identifiable: any means of identifying an individual has been permanently removed. It is, of course, distinct from pseudonymised data, which typically involves replacing one attribute in a record with another, for example by hashing or tokenisation. With pseudonymised data, there is an ability to reverse-engineer the data set to identify an individual, because the organisation holds the pseudonymisation key. With anonymised data there is, under the GDPR at least, no possibility of doing this; there is no key. (Recital 26)
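The key-based distinction can be illustrated with a minimal sketch. The code below is purely illustrative (the function names, the token scheme and the example record are our own assumptions, not anything from the Bill or the GDPR): pseudonymisation swaps a direct identifier for a token while the controller retains a separate key, so the swap is reversible; destroying that key is what moves the data towards anonymisation.

```python
import secrets

# Illustrative sketch only: tokenisation-style pseudonymisation.
# The controller holds `key` separately; whoever has it can reverse the process.

def pseudonymise(records, field):
    """Replace `field` in each record with a random token; return records and key."""
    key = {}   # token -> original value, held separately by the controller
    out = []
    for r in records:
        token = secrets.token_hex(8)
        key[token] = r[field]
        out.append({**r, field: token})
    return out, key

def re_identify(record, field, key):
    """Reverse the pseudonymisation using the key."""
    return {**record, field: key[record[field]]}

patients = [{"name": "Alice", "condition": "asthma"}]
pseudo, key = pseudonymise(patients, "name")

assert pseudo[0]["name"] != "Alice"                              # identifier replaced
assert re_identify(pseudo[0], "name", key)["name"] == "Alice"    # reversible with key
# Anonymisation, by contrast, would amount to key.clear():
# with no key, no reversal is possible (Recital 26).
```

The design point is that the legal classification tracks the existence of the key, not the appearance of the data: the tokenised record looks identical in both cases.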

The distinction between the two concepts is an important one, as the GDPR applies to pseudonymised, but not anonymised, data. It has also been a relatively clear-cut distinction. However, the proposed offence – as currently drafted – unsettles that distinction. It is based on the premise that it is at least possible to re-identify individuals from anonymised data, and this brings a degree of uncertainty to the concept. It may be that the final version of the proposed offence addresses this uncertainty by limiting its scope solely to pseudonymised data.

  2. It does not currently reflect that there will be circumstances in which re-identification is appropriate

Controllers and Processors have an obligation to protect personal data by implementing appropriate technical and organisational measures to ensure a level of security appropriate to the risk. Pseudonymisation is explicitly called out as an example of such a measure (Article 32(1)(a)).

Controllers and Processors also have an obligation to regularly test, assess and evaluate the effectiveness of their security measures (Article 32(1)(d)). So, if an organisation is using pseudonymisation, the GDPR explicitly requires that it stress test its effectiveness. This will, in many instances, involve intentionally re-identifying individuals from pseudonymised data, to check that the measures are robust.

Clearly, the proposed offence could not be intended to cover those scenarios. Similarly, it is also anticipated that the UK Bill will allow exemptions for journalists and whistleblowers, so that they are not caught by the criminal offence if such identification is done for journalistic purposes or for the purposes of holding organisations to account. What is less clear, however, is whether the proposed offence will allow exemptions for researchers acting in good faith, re-identifying purportedly anonymised or pseudonymised data in order to expose failings in data security or data governance.

  3. A similar offence has already been subject to heavy (and long) debate in Australia

The offence – as currently drafted – mirrors aspects of the Privacy Amendment (Re-identification Offence) Bill 2016, introduced into the Australian Parliament in October 2016. The key purpose of the proposed new law is to deter re-identification of earlier de-identified data, through the creation of new offences prohibiting:

  • intentional re-identification of de-identified personal information made available by an Australian Commonwealth agency (ie. Commonwealth Government data); and
  • the intentional disclosure of re-identified personal information.

The Bill also proposes the introduction of offences for counselling, procuring, facilitating, or encouraging another to re-identify personal information from Commonwealth Government datasets. There will also be obligations to notify where re-identification occurs.

Under the Australian Privacy Act, personal information is “de-identified” where it is no longer about an identifiable individual or an individual who is reasonably identifiable. The Bill focuses on de-identified personal information that has been published by a government agency. The challenge recognised by the Australian regulator, the OAIC, is understanding how entities ensure de-identification is ‘correctly’ or reasonably done in the first place. The Productivity Commission, in a recent report into data availability and use, suggested there is a need for better guidance on robust de-identification, noting the public policy benefits of access to de-identified data as new technology advances. Indeed, the Australian Government often publishes de-identified data on data.gov.au.

The proposed amendments recognise that de-identification techniques may become susceptible to re-identification in the future, and so there is a need to develop a network of non-technical data protections which support technical de-identification. Exceptions to the offences are also proposed where Commonwealth Government data releases de-identified datasets for research and policy making, or to certain agencies and entities (as determined by the relevant Minister).

Critics of the Bill argue that, with advances in technology, de-identified data might still be susceptible to future re-identification. The difficulty of de-identifying information successfully was demonstrated in an incident involving researchers from the University of Melbourne, who were able to use cryptographic methods to reverse the encryption algorithm used to de-identify doctor identification numbers. See the PwC Australia Legal Talk alert for more information about the incident and the laws at https://www.pwc.com.au/legal/assets/legaltalk/data-governance-11may17.pdf.
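Why such reversals are possible is easy to demonstrate in general terms. The sketch below is not the Melbourne researchers' actual method; it is a hypothetical illustration of the underlying weakness: when an identifier is drawn from a small space (here an assumed six-digit provider number), a one-way hash of it can be reversed simply by hashing every candidate and comparing.

```python
import hashlib

# Hypothetical illustration: "de-identifying" a low-entropy identifier
# with a hash is reversible by exhaustive search over the small ID space.

def hash_id(provider_id: str) -> str:
    """The assumed de-identification step: publish the SHA-256 of the ID."""
    return hashlib.sha256(provider_id.encode()).hexdigest()

def brute_force(target_hash: str):
    """Re-identification: only 1,000,000 six-digit candidates to try."""
    for n in range(1_000_000):
        candidate = f"{n:06d}"
        if hash_id(candidate) == target_hash:
            return candidate
    return None

published = hash_id("042713")              # value released in a "de-identified" dataset
assert brute_force(published) == "042713"  # original identifier recovered
```

The lesson, consistent with the critics' point, is that the security of such schemes depends on the size and secrecy of the input space, and advances in computing power steadily shrink what counts as "large enough".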

However, debate has ensued in relation to whether de-identified data falls strictly within the ambit of “personal information” governed by the privacy laws, and the need to weigh up the proportionality question – ie. whether re-identification should be subject to criminal or civil penalties. The Senate Committee, which was charged with reviewing the Bill, stated that despite the criticism, the Bill was necessary and proportionate.

The Australian Bill has been debated for some time now; a reflection, perhaps, of the complexity of the topic. The UK Bill – in contrast – is intended to go before Parliament in September, and to be in force by May 2018.


Polly Ralph  | PwC | Barrister & Solicitor (New Zealand Qualified) – Cyber Security and Data Protection Legal Services | PwC - UK
[email protected] | +44 (0) 207804 1611

Sylvia Ng | PwC | Director – Legal Services | PwC - Australia
[email protected] | +44 61 2 8266 0338