The Journey to Code: the next evolutionary step in Data Protection
October 03, 2018
The summer was a pleasant relief from the intensity of the run-up to the GDPR go-live date, 25th May. The holiday season, coupled with legal “due process”, gave us some breathing space to take stock and reflect on what we’ve learned about data protection and the possibilities on the road ahead.
Technology now plays an increasingly important role, and the majority of processing activities involving personal data are software-controlled, i.e., code-based. At PwC, we believe that we are on a “Journey to Code”, which will see more and more Data Protection outcomes delivered in data and technology itself, and fewer within paper and human processes. It might seem an obvious destination, but experience shows that in many respects we’re no nearer to that outcome than we were fifty years ago, when concepts of data protection were first identified as the answer to worries about the surveillance and propaganda risks arising from new technologies and new data processing techniques.
The breadth and depth of data protection
Data protection is a broad and deep topic. It’s broad in the sense that it ranges from obvious matters, such as legislative compliance, through less obvious matters, such as “Adverse Scrutiny”, to cutting-edge questions about how “Modern Business” works. It’s deep in that it covers a “paper layer” of data protection (such as policies, notices, and work plans), a “people layer” (such as human processes, training, and monitoring), and a “code layer” (covering hardware and software).
We know, of course, that many organisations will be able to point to the use and deployment of technology within their GDPR programmes. We are not arguing that the data protection world has wholly ignored or abandoned the code layer. Instead, we see a need to rebalance the load of effort, away from the paper layer, which currently takes up the lion’s share of DP focus in most organisations, and more towards code.
If we take data sharing between different organisations and countries as a common example of data protection concern, there is an obvious role for paper and people. Data sharing agreements and access rules will be put in place, and users’ activities will be monitored. The idea is to create an environment of trust around the data itself. However, these trust schemes face natural limits: contracts are there to be broken; policies are there to be ignored; monitoring is there to be misunderstood.
What if the desired outcomes for trust could be achieved inside the data itself, not just around it? What would be the potential effect on operational risk if the outcome were coded in? Could we “time-bomb” the data, so that it destroyed itself after a set period? Would that be a solution to the risk of a third party retaining personal data after the expiry of its contractual rights?
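To make the “time-bomb” idea concrete, here is a minimal sketch of how a retention limit might be coded into the data itself rather than written into a contract. All names here (`TimeBombedRecord`, `read`) are invented for illustration; a real implementation would more likely rely on encryption with scheduled key destruction.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class TimeBombedRecord:
    """Personal data that destroys its own payload once the retention
    period expires. Illustrative sketch only, not a standard API."""
    payload: Optional[dict]
    expires_at: datetime

    def read(self, now: Optional[datetime] = None) -> dict:
        now = now or datetime.now(timezone.utc)
        if now >= self.expires_at:
            self.payload = None  # the data itself is destroyed, not just access to it
            raise PermissionError("retention period expired; data destroyed")
        return self.payload

# A record shared under a hypothetical 30-day contractual retention right.
record = TimeBombedRecord(
    payload={"name": "A. Example", "email": "a@example.com"},
    expires_at=datetime.now(timezone.utc) + timedelta(days=30),
)
print(record.read()["name"])  # readable within the retention period
```

The point of the sketch is that the outcome (deletion after expiry) is enforced by the code holding the data, not by a clause the recipient could simply ignore.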
Many data protection outcomes can already be achieved in code, and the trend is only going one way. The state of technological development itself propels the Journey to Code; the law and operational adequacy demand that we embrace it.
Therefore, we need to develop Code Strategies for DP.
“A Journey of Ethics”?
The Journey to Code must also involve a Journey of Ethics. An ethics journey will democratise the delivery of data protection outcomes in a way that the law alone cannot, although in some jurisdictions a Code of Ethics for Data Protection and DP law will look virtually identical.
There are two core problems with law. First, law is a top-down, command model. Second, because of this, it is politicised, in the sense that it emanates from the State and the functions of the State (e.g. judges). This places a natural brake on the ability to create universal legal principles. Basically, if you want universal principles, you need Treaties, which means politicians agreeing. That is a hard and slow process. GDPR itself took four years from start to finish and this was inside an established legal framework.
So, while The Rule of Law is essential for the health of functioning democracies, laws themselves are not necessarily the best vehicles for delivering universal outcomes. Oddly, laws do not “democratise” the subject matter that they are concerned with, in the sense of giving ownership of the agenda to people far and wide.
Codes of Ethics work differently. In contrast to laws, they have a “bottom-up” effect, which removes the political connotation. Their potential for universal effect is therefore considerably greater than that of laws. Codes of Ethics have a long record of success: think of the Hippocratic Oath, or lawyers’ duties of client confidentiality, both ancient and universal. Of course, we are not denying that values and culture affect design choices.
Due to the state of technological development, Codes of Ethics need to be developed so as to guarantee the delivery of the right DP outcomes in code. The rapid development of AI, robotics and machine-to-machine communications makes this an urgent requirement. And we have to understand that advanced AI will generate its own code. Society needs to be sure that the inventors of technology are appreciative and supportive of the DP outcomes that society holds dear. The law alone will not guarantee those outcomes, but Codes of Ethics might.
The Journey to Code and a Journey of Ethics will be at the very heart of the next phases of development of Data Protection principles. Translating this into operational need, organisations will need to develop Code and Ethics Strategies to successfully deliver the GDPR over the longer term.