Purposeful Data Privacy – how tech and data professionals can help change the narrative
April 09, 2019
Technologists, data scientists and other people interested in the tech and data aspects of data privacy have an opportunity to do something purposeful. They can help change the narrative of data privacy, to focus citizens, politicians, regulators, businesses and the media on the need to deliver meaningful data privacy outcomes within technology and data themselves. Those closest to technology and data have the greatest capacity to deliver change.
Delivering meaningful data privacy is one of the great challenges of our age and we have a responsibility to address it. Arguably, it sits in the same category of global-scale challenges as climate change, the rise of extremism and the widening of inequalities in society. What's behind this statement? The belief that meaningful data privacy is essential for the health and wellbeing of society.
If we don’t address the challenge, history may judge us badly.
But we can’t deliver meaningful data privacy without being honest about the role that technology and data themselves have to play within the problem and within the solution. And we also need to be realistic about what has actually been going on in the field.
A brief history lesson: the data privacy framework we recognise in the GDPR and similar legal texts is around 50 years old, depending on where you judge the starting point. What we know for sure is that the principles of data privacy were codified by 1973 at the latest, when the Council of Europe published its Resolution on private sector data banks.
The principles were developed in the late 1960s and early 1970s out of a fear of technology and a recognition of the risks of surveillance and propaganda fuelled by personal data and accelerated by technology.
Surveillance and propaganda. Think about it. Ask yourselves: what do you see around you?
Edward Snowden showed us the truth about mass surveillance by governments and intelligence agencies, of course. But surveillance is now also a core activity of online services and retail services, through tracking, behavioural profiling and activities such as AdTech.
And what did we learn about the recent US Presidential elections and the UK Referendum? Micro-profiling to influence citizens' votes.
And what do we know about China's social credit scheme? Life-changing decision-taking, fuelled by personal data and accelerated by technology.
Wherever we look, we're living in a surveillance and propaganda society. These activities have been normalised. Surveillance is even in the nursery.
The technology and data environment that triggered the fears behind the data privacy principles is now a reality, and the trajectory points only one way: more and more surveillance, more and more propaganda and more and more automated decision-taking. We can shape this for good, or we can stand by and let it develop for ill.
However, we must be careful not to falsely compartmentalise the problem and say this is just a government problem, or a Tier 1 US tech company problem, or an AI problem. That kind of thinking causes a collective shrug of indifference, kicking the can down the road or, more likely, tipping the balance towards a dystopian future that none of us would want for our children and grandchildren, or for our friends' children and grandchildren.
So let me get more specific. Data privacy outcomes – the principles and rights of data privacy – need to be delivered in the technology and data layers of business. And where that isn't happening, we need to call it out and do whatever we can to make the board and other business leaders aware of the deficit.
A lot of work has been done across the economy over the past two or three years on the GDPR, but in most cases it has only scratched the surface, delivering paper change – new rules, policies, procedures, data and risk registers – but not meaningful transformation of technology and data themselves. Of course, the role of paper is important, and it can sometimes be critical, but if we don't deliver data privacy outcomes in technology and data themselves, we are not making a meaningful difference.
Alas, that is the case as I see it. But who cares? Some organisations see data privacy simply as a regulatory compliance issue, and they won't care until there is some kind of Big Bang of fines from the regulators. If that is what it will take to make them change, it's a shame – and we're likely to be disappointed, because the problems of data privacy are simply too big for the regulators to fix by themselves.
So, instead, if we believe in what data privacy is about, what it is trying to achieve and why it is important, we have to change the narrative of the conversation, making this a purpose and trust issue, not a compliance issue. Purpose and trust might just be enough to elevate the topic properly, both in terms of the audience engaged and the cadence of the conversation.
And once the conversation is elevated, we should talk about the need to deliver data privacy in the technology and data layers, about baking the principles and rights into the very fabric of technology and data themselves.
That is what the Journey to Code, PwC's new vision for data privacy, is all about. I believe there is only one purposeful journey for data privacy, and it leads ultimately to coding in the outcomes, so that they are clear, set and scalable. This needs to be done for all issues of data privacy, wherever it is possible to do so. If we get there, the benefits will be huge and history will applaud us.
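To make "coding in the outcomes" a little more concrete, here is a minimal illustrative sketch in Python. It is not the Journey to Code methodology itself – the names (ConsentRecord, fetch_personal_data) and the design are hypothetical – but it shows the general idea: a principle such as purpose limitation or storage limitation becomes an enforced check in the data access layer, rather than a clause in a policy document.

```python
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical consent record: the purposes a data subject has agreed to,
# and how long the data may be kept (storage limitation).
@dataclass
class ConsentRecord:
    subject_id: str
    permitted_purposes: frozenset[str]
    collected_at: datetime
    retention: timedelta

class PurposeLimitationError(Exception):
    """Raised when data is requested for a purpose the subject never agreed to."""

def fetch_personal_data(store: dict, consent: ConsentRecord, purpose: str):
    """Release personal data only if the stated purpose is permitted
    and the retention period has not expired."""
    if purpose not in consent.permitted_purposes:
        # Purpose limitation: the principle is enforced, not merely documented.
        raise PurposeLimitationError(
            f"Purpose '{purpose}' not permitted for subject {consent.subject_id}"
        )
    if datetime.utcnow() > consent.collected_at + consent.retention:
        # Storage limitation: expired data is deleted, not returned.
        store.pop(consent.subject_id, None)
        return None
    return store.get(consent.subject_id)
```

The point is not the specific code but the design choice it illustrates: the privacy outcome is no longer something an engineer may never read about in a policy; it is a check that every data access must pass, which is what makes it clear, set and scalable.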
To find out more about how we can help address key data protection, privacy and optimisation challenges facing your organisation, please get in touch with one of our subject leaders.