Santa’s stand-in playmates
December 17, 2020
It’s that ‘most wonderful time of the year’, when people want to connect with friends and family. Whilst the rules of social and physical distancing limit the number of people we can see in person, our ability to stay connected to the rest of the world via the internet continues without constraint.
This connectivity is an integral part of daily life. Many of us struggle to function without our devices. We depend on the internet and the Internet of Things (IoT) in order to function in society, whether it’s for shopping, paying bills, catching up with current affairs, streaming music and films, playing online games, watching smart TVs or video-calling friends… the list seems endless. And whilst some of us might have requested the latest smartwatch or Fitbit for Christmas this year, some children have asked Santa for an iPad, PS5 console or talking teddy bear. But do we know enough about how these devices operate in the background to know whether they protect our children and their data appropriately and adequately?
Children’s toys have become a lot more sophisticated in recent years. Many of them are connected to the internet and are designed to provide a personalised learning and playtime experience. Some toys - including cuddly ones - can recognise images and voices, and can tailor audio and robotic responses to the child’s reactions so that each subsequent response creates a kind of authentic dialogue. This interactivity is made possible by the toy’s in-built video camera, sensors and microphone, which record and transmit data to a connected server that uses sophisticated algorithms to formulate and send back a response. These toys have become stand-in playmates. But in order to provide this functionality, they are collecting, and frequently retaining, thousands of data points about our children. How can we ensure that information gathered about our children in this way is kept safe and not exploited?
It has been argued for some time that a legal framework is needed to safeguard the rights and freedoms of children in this respect. The European Commission has proposed legislation relating to the Privacy and Electronic Communications Directive, and on 2 September 2020 a statutory code of practice - Age Appropriate Design: A code of practice for online services (the “Code”) - came into force in the UK. It has a 12-month transition period, so businesses need to align their practices by 2 September 2021. The Code is aptly timed: it has already been followed by a class action claim worth £2.5 billion, filed in the UK High Court, alleging exploitation of children through inadequate data privacy. The Code is the first of its kind anywhere in the world, and the seriousness of its purpose should not be underestimated.
The Code applies to businesses that process the personal data of children (anyone under 18) in the UK. Informed by the United Nations Convention on the Rights of the Child (“UNCRC”) and the data protection laws governing the UK, the Code provides a road-map for businesses to process children’s personal data fairly and lawfully. It is made up of 15 standards, intended to be ‘neutral design principles and practical privacy features’ for organisations to adopt. It covers areas such as acting in the child’s best interests, conducting Data Protection Impact Assessments (DPIAs), not tracking a child’s geo-location, not profiling children and not using nudge techniques. Essentially, the Code embodies data protection by design and by default. Children need to be protected by these principles in the digital world so that they can access and use online services such as games, apps, connected toys and websites without parents or guardians having to supervise their every interaction.
While the Code itself is not legally binding, it is a statutory code of practice under the Data Protection Act 2018. The UK’s privacy regulator, the Information Commissioner’s Office (ICO), has made it clear to businesses that failure to conform to the provisions of the Code will be addressed under the ICO’s approach to enforcement as set out in its Regulatory Action Policy. As has been seen in recent weeks, the ICO has not been afraid to throw the book at organisations that do not comply.
What is abundantly clear is that children need to be afforded greater levels of protection and given special treatment when it comes to their safety and the handling of their personal data; the Code is intended to do just that.
‘Elf and Safety’
If you decide to spoil your children this Christmas with some of the latest connected toys and smart products, take time to ensure your most vulnerable remain protected from unintended consequences. Read the product’s accompanying materials and set any controls that minimise data capture and sharing.