Growing up has always been difficult, but it must be especially baffling for a child navigating today's environment of immersive technology, where thousands of apps and online services vie for their attention. In recent years it has become even more important for the privacy consequences of technology used by children to be properly understood and addressed. As technology becomes an ever greater part of everyday experience, children rightly need protecting from the harms that pervade the online world, just as they need protecting from harms in the offline world. So it is no surprise that governments and regulators see the need for action to ensure that law and regulation take the right approach to protecting children.
Breaking new ground
Until now there has been no comprehensive regulation in the EU addressing the privacy risks that children face in the online environment. However, the UK Government decided in the Data Protection Act 2018 (Act) to recognise the need for greater regulation of children's online privacy. Under s. 123 of the Act, the Information Commissioner's Office (ICO) is required to prepare a Code of Practice which includes guidance, as the ICO considers appropriate, on the standards of age appropriate design for relevant Information Society Services (ISS) which are likely to be accessed by children. This requirement represents a different focus from the GDPR, which, in Article 8, refers to specific rules applying to ISS 'offered directly to a child'. The Code of Practice deliberately casts the net more widely than the GDPR to capture a broader range of data processing activities. Nor is this the only initiative the UK Government is currently pursuing to develop a regulatory online environment with greater protections for children: in April 2019, the Government published the Online Harms White Paper, which sets out a framework of stronger accountability for online activities to protect the vulnerable.
As a result of the Act, on 15 April 2019 the ICO published a draft Age Appropriate Design Code of Practice (the Code), which is now out for public consultation. Interested parties have until 31 May 2019 to respond to the consultation.
A new compliance standard by default?
Given the remit of the Code, it has the potential to become a new compliance standard which all organisations with an online presence follow, regardless of whether it clearly applies to them, simply because it may not always be obvious that they fall within the Code's scope. For instance, the Code states that an organisation should apply the Code's standards to all users unless the organisation has robust age verification checks that can distinguish children (defined as individuals under 18 years) from adults. In fact, the Code states that, as a starting point, the services provided to all users should meet the Code's requirements. If a website can additionally use age verification mechanisms to confirm that a user is an adult, a separate version of the service which does not meet the Code's requirements can then be offered to adults who want it.
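The default-first approach described above can be sketched as a simple decision rule. This is purely a hypothetical illustration of the logic, not an implementation drawn from the Code: the class, function, and field names are invented, and `verified_adult` stands in for whatever robust age-verification mechanism an organisation actually deploys.

```python
from dataclasses import dataclass


@dataclass
class User:
    """Hypothetical user record. `verified_adult` is True only when a
    robust age-verification check has positively confirmed the user
    is 18 or over; it is False for children and unverified users."""
    verified_adult: bool


def service_variant(user: User) -> str:
    """Default every user to the high-privacy, Code-compliant service.

    Only where robust age verification has positively identified an
    adult may a separate, non-Code variant be made available (and even
    then only if the adult chooses it).
    """
    if user.verified_adult:
        # Adult confirmed: the non-Code version may be offered as an option.
        return "adult-variant-available"
    # Children and anyone not verified get the Code-compliant default.
    return "code-compliant-default"


print(service_variant(User(verified_adult=False)))  # code-compliant-default
print(service_variant(User(verified_adult=True)))   # adult-variant-available
```

The key design point the Code implies is the direction of the default: the burden sits on verification of adulthood, never on detection of childhood, so an unverified user is always treated as potentially a child.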
Who needs to comply with the Code?
The ICO states that the Code applies to organisations that provide online products or services that process personal data and are likely to be accessed by children in the UK. So the Code is not just for online services specifically targeted at children. When designing a service, organisations will need to think about whether or not the service will appeal to children, even if this is not the organisation's intent. If an organisation believes that only adults will access the service, the ICO will expect the organisation to be able to demonstrate that this is the case. In other words, the organisation needs to consider carrying out market research, or be able to rely on measures which demonstrate that the service deliberately excludes children. This could have a significant impact: it suggests that the onus is on a website operator to prove clearly that the website does not appeal to children in order to escape the scope of the Code. Additionally, the Code states that if it subsequently becomes evident that children (even a small proportion of the overall user base) are accessing the service, the organisation will need to comply with the Code. This could have the counterproductive result that organisations seek to remain ignorant of whether children are using their services, since knowledge of this fact would then oblige them to comply with the Code.
The Code states that most online services will be ISS. It identifies ISS as including apps, search engines, social media platforms, online messaging services, online marketplaces, content streaming services, online games, news or educational websites, and any websites offering goods or services over the internet. Connected toys and connected devices are also considered ISS. This covers a substantial number of services. The Code confirms that there is no requirement for a fee to be paid by a user to the service provider for a service to be an ISS. But it clarifies that information-only websites, and websites that do not allow a user to buy products online or access a specific online service, are not ISS. Furthermore, it is not only UK organisations that are expected to comply with the Code: non-UK organisations are potentially also caught, under the same extraterritorial rule that affects non-EU organisations under the GDPR.
The Standards in the Code
So what does the draft Code require? It sets out a list of 16 standards that it expects organisations to comply with. A number of these are common-sense principles which don't necessarily reflect defined data protection principles. For instance, they include: don't use children's data in a way that may be detrimental to their well-being; don't use nudge techniques to encourage children to provide unnecessary personal data. Key points to note from the standards include:
- A requirement to act in the best interests of a child. The Code indicates that an organisation's commercial interests are unlikely to outweigh a child's right to privacy.
- Organisations should not use ‘sticky features’ to extend user engagement when the users are children.
- Organisations must reset existing user settings to a standard of high privacy protection within a certain number of months (still to be determined) of the Code coming into force. This may be one of the more difficult aspects of the Code for organisations to comply with.
- Organisations should not share children’s personal data with third parties if they can reasonably foresee that the third party will use the personal data in ways that have been shown to be detrimental to the well-being of children. Organisations are expected to have carried out due diligence and obtained reassurances from such third parties in these circumstances. Furthermore, the Code indicates that organisations should not share children’s personal data unless they have a compelling reason to do so (e.g. safeguarding) and sharing is in the best interests of the child.
- The Code clarifies that profiling can include where online services make suggestions to users about who to connect with or who to follow. If children are fed content which is recommended for them, the organisation is responsible for ensuring that such content is not detrimental to a child.
- Websites must also provide online tools which help children exercise their rights online e.g. provide a ‘download all my data’ feature or ‘erase all my data’ feature.
- Organisations should usually carry out a Data Protection Impact Assessment (DPIA), since the nature and context of online services within the scope of the Code inevitably cover a type of processing likely to involve a high risk to the rights and freedoms of children. But, of course, if the scope of the Code itself captures many if not most websites, this suggests that most websites must also carry out a DPIA on this basis – which seems excessive. As part of the DPIA process, the Code expects organisations to have carried out some form of consultation, e.g. with existing users or members of the public.
Enforcement of the Code
Failure to act in compliance with the Code can lead to regulatory action, and the ICO states that organisations will find it difficult to comply with the law if they fail to comply with the Code. The ICO indicates that use of children’s data is one of its regulatory priorities, which suggests that it will more rigorously scrutinise any potential breaches of the Code. Additionally, the ICO will take the Code into account when considering compliance with the GDPR and the e-privacy rules. The Code also suggests that, due to the public interest in protecting children’s data, harm or potential harm to children will attract more severe penalties.
Protecting children by design
The focus of the Code is on design. The requirement is that organisations think about and implement designs in the online world that are appropriate to the age of any children accessing their online product or service. The challenge is that so many organisations have an online presence and have not necessarily considered the impact of their websites or apps on children up to this point. Nevertheless, the Code signals an important step change in the responsibility of organisations to carefully consider the impact of their online presence on children growing up in the digital age.
All content on this page is correct as of April 26, 2019.