The Journey to Code through the lens of Accountability
October 19, 2018
At PwC we believe that the future of Data Protection (‘DP’) lies in the delivery of many more DP outcomes in the actual technology and data layers of business than is currently the case. We call this ‘The Journey to Code’.
What distinguishes this point of view is that it draws attention to one of the main problems of DP ‘programmes’ over the years: they have been biased towards delivering DP outcomes in the paper and organisational layers of business.
We are not arguing that the paper and organisational layers of business are not vital components of DP strategies. However, when the root causes of DP failures are analysed, often a standout feature is that much more could have been done in the tech and data layers to deliver the desired DP outcomes.
Security - a case study for tech- and data-based Data Protection outcomes
This point of view is most readily understood in the context of Cyber Security failures affecting personal data. A survey of ICO enforcement cases for security over the past ten years reveals that: encryption became a de facto mandatory privacy-enhancing technology a decade ago; tech-based protections against SQL injections became critical four years ago; and now regulatory investigations about network scanning and secure coding are becoming commonplace.
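The reference to SQL injection is worth making concrete. A minimal sketch, using Python's standard-library `sqlite3` module (the table, fields and data are purely illustrative), shows the kind of code-layer protection regulators now expect: binding user input as parameters rather than splicing it into the SQL string.

```python
import sqlite3

# In-memory database purely for illustration; table and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, name TEXT)")
conn.execute("INSERT INTO users VALUES ('ann@example.com', 'Ann')")

def find_user(email: str):
    # Parameterised query: the input is bound as data, never concatenated
    # into the SQL text, so an injection payload cannot alter the query.
    cur = conn.execute("SELECT name FROM users WHERE email = ?", (email,))
    return cur.fetchall()

print(find_user("ann@example.com"))   # [('Ann',)]
print(find_user("' OR '1'='1"))       # [] -- the payload is treated as inert data
```

The point is that this control lives in the code itself: no policy document or staff training is needed for it to work on every request.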
Security is just one of the Data Protection Principles (Article 5(1)(f) of the General Data Protection Regulation 2016), but it provides an excellent case study on how the lens of accountability can be applied to the tech and data layers of business. It is inevitable that the lens of accountability will be applied to the other DP Principles in the same way as it has been applied to the security principle.
Privacy by Design and Accountability: ensuring the delivery of ‘good’ outcomes through the ‘state of the art’
It might be helpful to think about accountability as operating on the same continuum of DP as Data Protection by Design and by Default (‘DPbDD’). Essentially, they are both mechanisms to ensure the delivery of ‘good’ DP outcomes. Where they differ is that they operate at different points in time on the DP continuum: DPbDD operates prior to and at the commencement of data processing, while the focus for accountability is on the period after processing has begun (save for unusual cases covered by DPIAs and the ‘prior consultation’ rule). This is not in conflict with the view that DPbDD forms one of the measures to meet the requirements of accountability. Indeed, they address exactly the same set of questions, one of which concerns ‘the state of the art’.
The ‘state of the art’ isn’t a new idea within Data Protection. The old DP Directive 1995 required the state of the art to be considered in the context of security outcomes (see Article 17(1) of the DP Directive 1995), but the GDPR takes this much further, embedding it right at the very heart of DPbDD (see Article 25(1) of the GDPR). The state of the art includes the state of technological development, which now means, of course, that the law requires Data Controllers to proactively consider how the tech and data layers within their organisations can be configured to deliver on the full set of DP Principles. Article 25(1) of the GDPR even goes as far as to cite the data minimisation principle as an example of what it is seeking to achieve within the state of the art.
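Since Article 25(1) cites data minimisation as its worked example, it may help to show what delivering that principle in the code layer can look like. The following is a minimal, hypothetical sketch (the field names and the `minimise` helper are assumptions, not a prescribed method): the processing pipeline discards, by default, every field its purpose does not require.

```python
# Hypothetical ingestion helper: field names are illustrative only.
# The premise is that the processing purpose (say, delivery) needs
# only these two fields, so nothing else is retained.
REQUIRED_FIELDS = {"order_id", "postcode"}

def minimise(record: dict) -> dict:
    # Data minimisation by default: drop every field the stated
    # purpose does not require, before the data goes anywhere else.
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

raw = {"order_id": 42, "postcode": "AB1 2CD",
       "date_of_birth": "1980-01-01", "phone": "07700 900000"}
print(minimise(raw))   # {'order_id': 42, 'postcode': 'AB1 2CD'}
```

A control expressed this way applies to every record processed, which is precisely the sense in which the tech and data layers can ‘deliver’ a DP Principle rather than merely document it.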
Therefore, if an organisation is required to give an account, it will need to show its thinking on these points. The absence of a narrative will not be an answer.
Humans cause harm and damage, but it’s also about the opportunity
One observation that has been put to me is that the real threats to personal data and privacy are human beings, not tech and data themselves. People will leave laptops on the train, they’ll throw confidential papers in the waste and they will commit crime.
I agree that people risks are huge, but this is not a case for concentrating DP efforts in the paper and organisational layers. Leaving aside the fact that, for the reasons explored above, the law says this approach cannot be correct, it is operationally inefficient to base controls against serious human behavioural risks, which are subject to infinite variables, in the paper and organisational layers when a technological solution is available, particularly a code-based solution. If the opportunity for human abuse or mistakes can be removed through tech and data solutions, that must be the right trajectory.
A good way of looking at this is to think of common burglaries. We know that the risk of burglary can be reduced by educating people about the differences between right and wrong and by providing severe deterrents against wrongdoing (prison sentences and fines), but we still invented the lock and the key to reduce the opportunity for crime. Reducing the opportunity for human failure is exactly what can be addressed in the tech and data layers.
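The lock-and-key idea translates directly into code. As one minimal, hypothetical sketch (the key handling and identifier are assumptions; in practice the secret would sit in a key-management system, not in source code), pseudonymising identifiers with a keyed hash means that a lost laptop or mislaid export no longer discloses who the data is about:

```python
import hashlib
import hmac

# Illustrative only: a real key would come from a key-management
# system, never be hard-coded, and would be rotated under policy.
SECRET_KEY = b"replace-with-managed-key"

def pseudonymise(identifier: str) -> str:
    # A keyed hash is the digital lock and key: without the secret,
    # the original identifier cannot be recovered from the token,
    # so the opportunity for disclosure is removed at source.
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

token = pseudonymise("ann@example.com")
print(token != "ann@example.com")   # True: the raw identifier never leaves the system
```

The control does not rely on anyone remembering a rule; the opportunity for the failure has been engineered out.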
However, the case goes much further, because the human role in personal data and privacy abuse and misuse is reducing, due to the drive towards wholly automated processing with no human elements, which is gaining momentum by the day. The forthcoming ePrivacy Regulation, for example, has a focus on machine-to-machine communications. AI will generate its own code. In a world that will deliver driverless cars, we have to accept driverless privacy.
Data Protection is a state of being
But perhaps the strongest response to the idea that the biggest threats to personal data and privacy are human beings, and that we therefore need a bias towards paper and organisational DP outcomes, is that DP isn’t simply concerned with abuse and misuse of personal data: DP is about a state of being.
We describe the state of being in terms of the DP Principles, and when the state of being is conceptualised, we find that in both a quantitative and qualitative sense, the state of being exists mostly inside the tech and data layers. Therefore, the logic of DP is that the desired outcomes need to be delivered in those layers.
To find out more about how we can help address key data protection, privacy and optimisation challenges facing your organisation, please get in touch with one of our subject leaders.