
Earlier this month, the UK’s Information Commissioner’s Office published a draft code of practice (“Code”) on designing online services for children. The Code is now open for public consultation until May 31, 2019. The Code sets out 16 standards of “age appropriate design” with which online service providers should comply when designing online services (such as apps, connected toys, social media platforms, online games, educational websites and streaming services) that children under the age of 18 are likely to access. The standards are based on data protection law principles, and are legally enforceable under the GDPR and UK Data Protection Act 2018. The Code also provides further guidance on collecting consent from children and the legal basis for processing children’s personal data (see Annex A and B of the Code). The Code should be read in conjunction with the ICO’s current guidance on children and the GDPR.

The 16 standards set out in the Code are as follows:
  1. Best interests of the child. The best interests of the child should be the primary consideration when developing and designing online services that children are likely to access. This includes consideration of children’s online safety, physical and mental well-being, and development.
  2. Age-appropriate application. Online service providers should consider the age-range of users of the online service, including the needs and capabilities of children of different ages. Annex A of the Code provides some helpful guidance on key considerations at different ages, including the types of online services that children may encounter at different ages, their capacity to understand privacy information and ability to make meaningful decisions about their personal data.
  3. Transparency. Privacy information, policies and community standards provided to children must be concise, prominent and use clear language in an age-appropriate manner. ‘Bite-sized’ explanations should also be provided about how the personal data is used at the point that the child starts to use the service, with further age-appropriate prompts to speak with an adult before providing their data or not to proceed if uncertain.
  4. Detrimental use of data. Online service providers should refrain from using children’s personal data in ways that have been shown to be detrimental to their well-being, or that go against industry codes of practice, other regulatory provisions or Government advice. Examples of relevant codes or advice include the Committee of Advertising Practice (CAP) guidance on online behavioural advertising, which covers children.
  5. Policies and community standards. Online service providers should uphold their published terms, policies and community standards (including, but not limited to, privacy policies, age restriction, behaviour rules and content policies).
  6. Default settings. ‘High privacy’ settings should be provided by default (unless the online service provider can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child), thereby limiting visibility and accessibility of children’s personal data.
  7. Data minimisation. Online service providers should collect and retain only the minimum amount of personal data necessary to provide the elements of the service in which a child is actively and knowingly engaged. Children should be provided with as much choice as possible over which elements of the service they wish to use and how much data they provide. This choice includes whether they wish their personal data to be used for (each) additional purpose or service enhancement.
  8. Data sharing. Children’s personal data should not be shared with, or disclosed to, third parties unless there is a compelling reason to do so, taking account of the best interests of the child. Due diligence checks should be conducted on any third party recipients of children’s data, and assurances should be obtained to ensure that sharing will not be detrimental to the well-being of the child.
  9. Geolocation. Geolocation options should be turned off by default unless there is a compelling reason otherwise, again taking account of the best interests of the child. Online service providers should ensure that the service clearly indicates to child users when location tracking is active. Options which make a child’s location visible to others must default back to “off” at the end of each session.
  10. Parental controls. Age-appropriate information should be provided to the child about parental controls, where provided. If the service allows a parent or caregiver to monitor their child’s online activity or track their location, such monitoring should be made clear to the child through the use of obvious signs. Audio or video materials should also be provided to children and parents about children’s rights to privacy.
  11. Profiling. Profiling options must be turned off by default, unless there is a compelling reason for profiling, taking account of the best interests of the child. Profiling is only allowed if there are appropriate measures in place to protect the child from any harmful effects (in particular, being shown content that is detrimental to their health or well-being).
  12. Nudge techniques. Design features that suggest or encourage children to make a particular decision to provide unnecessary personal data, weaken or turn off their privacy protections, or extend their use, should not be used. By contrast, pro-privacy nudges are permitted, where appropriate.
  13. Connected toys and devices. The Code applies to connected toys and devices, such as talking teddy bears, fitness bands or ‘home hub’ interactive speakers. Providers should give clear, transparent information about who is processing the personal data and what their responsibilities are at the point of purchase and set up. Connected toys and devices should avoid passive collection of personal data (e.g., when in an inactive “listening mode” waiting for key words that could wake the device).
  14. Online tools. Online service providers should provide prominent, age-appropriate and accessible tools to help children exercise their data protection rights and report concerns. The tools should also include methods for tracking the progress of complaints or requests, with clear information provided on response timescales.
  15. Data protection impact assessments (DPIAs). Online service providers that provide services that children may access should undertake a DPIA specifically to assess and mitigate risks to children. Annex C of the Code provides a template DPIA that modifies the ICO’s standard template DPIA to include a section for online service providers to consider each of the 16 standards in the Code.
  16. Governance and accountability. Online service providers should ensure that they have policies and procedures in place that demonstrate how providers comply with data protection obligations and the Code, including data protection training for all staff involved in the design and development of online services likely to be accessed by children.
Mark Young

Mark Young, an experienced tech regulatory lawyer, advises major global companies on their most challenging data privacy compliance matters and investigations.

Mark also leads on EMEA cybersecurity matters at the firm. He advises on evolving cyber-related regulations, and helps clients respond to incidents, including personal data breaches, IP and trade secret theft, ransomware, insider threats, and state-sponsored attacks.

Mark has been recognized in Chambers UK for several years as “a trusted adviser – practical, results-oriented and an expert in the field;” “fast, thorough and responsive;” “extremely pragmatic in advice on risk;” and having “great insight into the regulators.”

Drawing on over 15 years of experience advising global companies on a variety of tech regulatory matters, Mark specializes in:

  • Advising on potential exposure under GDPR and international data privacy laws in relation to innovative products and services that involve cutting-edge technology (e.g., AI, biometric data, Internet-enabled devices, etc.).
  • Providing practical guidance on novel uses of personal data, responding to individuals exercising rights, and data transfers, including advising on Binding Corporate Rules (BCRs) and compliance challenges following Brexit and Schrems II.
  • Helping clients respond to investigations by data protection regulators in the UK, EU and globally, and advising on potential follow-on litigation risks.
  • GDPR and international data privacy compliance for life sciences companies in relation to:
    • clinical trials and pharmacovigilance;
    • digital health products and services; and
    • marketing programs.
  • International conflict of law issues relating to white collar investigations and data privacy compliance.
  • Cybersecurity issues, including:
    • best practices to protect business-critical information and comply with national and sector-specific regulation;
    • preparing for and responding to cyber-based attacks and internal threats to networks and information, including training for board members;
    • supervising technical investigations; advising on PR, engagement with law enforcement and government agencies, notification obligations and other legal risks; and representing clients before regulators around the world; and
    • advising on emerging regulations, including during the legislative process.
  • Advising clients on risks and potential liabilities in relation to corporate transactions, especially involving companies that process significant volumes of personal data (e.g., in the adtech, digital identity/anti-fraud, and social network sectors).
  • Providing strategic advice and advocacy on a range of EU technology law reform issues including data privacy, cybersecurity, ecommerce, eID and trust services, and software-related proposals.
  • Representing clients in connection with references to the Court of Justice of the EU.
Sam Jungyun Choi

Sam Jungyun Choi is an associate in the technology regulatory group in the London office. Her practice focuses on European data protection law and new policies and legislation relating to innovative technologies such as artificial intelligence, online platforms, digital health products and autonomous vehicles. She also advises clients on matters relating to children’s privacy and policy initiatives relating to online safety.

Sam advises leading technology, software and life sciences companies on a wide range of matters relating to data protection and cybersecurity issues. Her work in this area has involved advising global companies on compliance with European data protection legislation, such as the General Data Protection Regulation (GDPR), the UK Data Protection Act, the ePrivacy Directive, and related EU and global legislation. She also advises on a variety of policy developments in Europe, including providing strategic advice on EU and national initiatives relating to artificial intelligence, data sharing, digital health, and online platforms.