Hacking the playroom: How can children be safe and protected in the digital world?

This year, the digital world will reach a significant milestone – almost 50% of the world’s estimated population of 7.4 billion will be online. And, according to research by UNICEF Innocenti, one-third of these internet users will be children.

So what are the particular risks and harms that children face in an increasingly connected world? In this blog, children’s online rights expert Dr Rachel O’Connell will examine the issues through the lens of recent reports about connected toys. She will then consider the new European Union data protection rules, which come into force in 2018, and how these and other developments might help to provide more security, privacy and safety.

Toys that talk and listen

As connected and smart toys are increasingly used by companies as marketing tools, advertising, product placement and sponsorship are on the rise. For example, Cayla, marketed as “the world’s first interactive doll”, came in for criticism when she was found to have built-in audio tools designed to market foods high in sugar or fat to children. You can see the video from BEUC, the European Consumer Organisation, here.

Cayla was also in trouble for failing to protect children’s data and privacy. The Bluetooth-enabled doll comes with a microphone that captures children’s speech, which can then be analysed using a third-party app. So concerned was Germany’s network watchdog by what it deemed the doll’s unlawful surveillance capability that it urged parents to destroy her:

Any toy capable of transmitting signals and surreptitiously recording audio or video without detection is unlawful. The danger, the agency claims, is that anything a child or someone else says in the vicinity of the doll can be transmitted without parents' knowledge. Also, lack of network security could allow the toy to be turned into a listening device, the agency suggests.

To be clear…

The company that produced the Cayla doll would have had numerous contractual relationships with a range of third parties, including data processors, app platforms, marketing technology and advertising platforms, data management platforms, data analytics providers, and speech recognition software vendors.

While blanket permission for these businesses to process a child’s data will have been given when a parent clicks ‘I Agree’ to the Terms of Service and Privacy Policy, the limits of this approach to informed consent have been well documented.

Rights of the child

As well as advertising and security, regulators are concerned by violations of the legal protection of children’s rights afforded under the UN Convention on the Rights of the Child, including Article 16:

  • No child shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence, nor to unlawful attacks on his or her honour and reputation.
  • The child has the right to the protection of the law against such interference or attacks.

However, as the UK Information Commissioner's Office (ICO) has highlighted, under existing data protection legislation there was ‘little that could be done to prevent unscrupulous third parties from harvesting a child’s data and using it for inappropriate purposes’[1].

The new General Data Protection Regulation (GDPR), which comes into force in May 2018, explains why children merit specific protection with regard to their personal data:

“Children may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child.”

Article 8 of the GDPR also states that, where a child is below the age of 16, processing of their personal data on the basis of consent is lawful only if consent is given or authorised by the holder of parental responsibility over the child. Member States can choose to lower the age at which parental permission is required, but to no lower than 13 years of age.
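As an illustration only – the age limits come from Article 8, but the function and its names are hypothetical – the rule can be expressed as a simple check:

```python
# Hypothetical helper (names are illustrative): under Article 8 the default
# age of digital consent is 16, and a Member State may lower it, but not
# below 13. Below the applicable threshold, parental consent is required.

def requires_parental_consent(child_age: int, member_state_threshold: int = 16) -> bool:
    if not 13 <= member_state_threshold <= 16:
        raise ValueError("GDPR Article 8 only permits thresholds between 13 and 16")
    return child_age < member_state_threshold


print(requires_parental_consent(14))                             # True: below the default of 16
print(requires_parental_consent(14, member_state_threshold=13))  # False: meets a 13-year threshold
```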

The GDPR also specifically states that separate consent will be needed for different processing operations. This means that in the future businesses will not only be required to inform consumers who the data processors are and obtain consent for each purpose; they must also enable consumers to withdraw that permission at any point.
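A minimal sketch of what per-purpose consent with withdrawal could look like inside a service – the class and field names here are hypothetical, not drawn from any particular toy or platform:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical sketch: one consent record per processing purpose, so each
# operation (e.g. speech analysis vs marketing) needs its own permission
# and can be withdrawn independently at any time.

@dataclass
class ConsentRecord:
    purpose: str                      # e.g. "speech_analysis", "marketing"
    granted_by: str                   # the holder of parental responsibility
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None


class ConsentRegister:
    """Tracks consent per processing operation for one child's profile."""

    def __init__(self) -> None:
        self._records: dict[str, ConsentRecord] = {}

    def grant(self, purpose: str, granted_by: str) -> None:
        self._records[purpose] = ConsentRecord(
            purpose, granted_by, datetime.now(timezone.utc)
        )

    def withdraw(self, purpose: str) -> None:
        record = self._records.get(purpose)
        if record and record.active:
            record.withdrawn_at = datetime.now(timezone.utc)

    def may_process(self, purpose: str) -> bool:
        record = self._records.get(purpose)
        return record is not None and record.active


# Consent for speech analysis does not imply consent for marketing,
# and withdrawal takes effect immediately.
register = ConsentRegister()
register.grant("speech_analysis", granted_by="parent@example.com")
print(register.may_process("speech_analysis"))  # True
print(register.may_process("marketing"))        # False
register.withdraw("speech_analysis")
print(register.may_process("speech_analysis"))  # False
```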

Privacy by design

A key principle underpinning the GDPR is Privacy by Design, which requires privacy and data protection to be built in at the product or service design stage, rather than bolted on at the end. These rules will reach far beyond the EU, as any business processing EU citizens’ data will have to abide by them.

New rules, new tools

What is beginning to emerge, driven primarily by regulation, is a raft of technical standards detailing how businesses can develop Privacy Enhancing Technologies (PETs) that give consumers greater control over their personal data. For example:

  • User-Managed Access (UMA) is an access management protocol standard that enables end users to better protect their data no matter which platform it is held on, by letting them set the policies that govern who can access it (a simplified sketch follows below).
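To illustrate the idea only – this is a greatly simplified, hypothetical sketch of the UMA pattern, not the actual protocol or any real library – the key point is that the resource server defers every access decision to an authorisation server whose policies the user controls:

```python
from typing import Optional

# Hypothetical illustration of the UMA pattern: the data owner manages
# access policies at an authorisation server, and the resource server
# asks that server before releasing anything, instead of deciding itself.

class AuthorizationServer:
    def __init__(self) -> None:
        # resource name -> parties the owner has approved
        self._policies: dict[str, set[str]] = {}

    def set_policy(self, resource: str, approved_parties: set[str]) -> None:
        """Owner-managed policy: who may access which resource."""
        self._policies[resource] = approved_parties

    def grant(self, resource: str, requesting_party: str) -> bool:
        return requesting_party in self._policies.get(resource, set())


class ResourceServer:
    def __init__(self, auth_server: AuthorizationServer) -> None:
        self._auth = auth_server
        self._data = {"voice_recordings": "<audio clips>"}

    def fetch(self, resource: str, requesting_party: str) -> Optional[str]:
        # Data is released only if the user-controlled authorisation
        # server grants access to this requesting party.
        if self._auth.grant(resource, requesting_party):
            return self._data.get(resource)
        return None


# The parent approves the speech-analysis service but not an ad network.
auth = AuthorizationServer()
auth.set_policy("voice_recordings", {"speech-analysis-service"})
backend = ResourceServer(auth)
print(backend.fetch("voice_recordings", "speech-analysis-service"))  # "<audio clips>"
print(backend.fetch("voice_recordings", "ad-network"))               # None
```

In the real UMA flow the grant takes the form of an OAuth-style token issued by the authorisation server rather than a simple boolean, but the separation of roles is the same: the user’s server decides, and the data holder enforces.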

The global consumer movement has a duty to advocate for the adoption of best-practice tools and ensure that existing and new digital services are built with consumer protection in mind. Educating consumers about the choices they have available to them will also help pave the way for a digital world that is safer and more secure for people of all ages.