UK ICO Finalizes Rules for Children’s Content
The UK Information Commissioner’s Office (ICO) recently finalized its Age-appropriate design: a code of practice for online services (the code). The code applies to any “relevant information society services which are likely to be accessed by children” (by which the ICO means minors under age 18), whether designed for kids or general audiences. The new version makes few significant changes from the consultation draft circulated in May 2019. The ICO added a 12-month transition period and issued industry-specific guidance for media companies; however, most of the substance of the code remains the same. It calls on companies to adopt a risk-based and proportionate approach to age verification and to determine whether their services are “likely to be accessed by children.” While the finalized code offers examples of how a business might ascertain age and whether minors are likely to visit a website or service, it fails to provide a specific, workable definition of “likely to be accessed by children” or technical guidance. The code is not a law, but “it sets standards and explains how the General Data Protection Regulation applies in the context of children using digital services.”
The updated code still defines “children” as minors under 18, citing the UN Convention on the Rights of the Child. It requires that the best interests of the child be foremost when processing personal data of children. Companies must adhere to 15 new standards, starting with privacy-by-design. The code directs businesses to carry out data protection impact assessments, apply data minimization principles, and avoid “nudge” techniques. The initial draft described “nudge” techniques broadly, generating strong criticism that the ICO was straying into advertising issues outside its purview; the final version clarifies that the focus is on nudge techniques that encourage children to disclose unnecessary personal data or to weaken or turn off privacy controls. Default settings for services should be “high privacy,” and geolocation tracking and profiling should be given a default setting of “off.”
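To make the default-settings standard concrete, the sketch below models it as a settings object. This is purely illustrative and not part of the code or any ICO guidance; the class and field names are hypothetical. It reflects the stated requirements: geolocation tracking and profiling default to “off,” and moving away from a high-privacy default requires an explicit user action.

```python
from dataclasses import dataclass

# Hypothetical sketch of the code's "high privacy by default" standard.
# Geolocation tracking and profiling default to off; changing a default
# requires an explicit opt-in rather than a pre-ticked or inferred choice.
@dataclass
class ChildPrivacySettings:
    geolocation_enabled: bool = False   # default "off" per the code
    profiling_enabled: bool = False     # default "off" per the code
    data_sharing_enabled: bool = False  # high-privacy default

    def enable_geolocation(self, explicit_user_consent: bool) -> bool:
        # Only flip the default when the user has explicitly opted in.
        if explicit_user_consent:
            self.geolocation_enabled = True
        return self.geolocation_enabled
```

Under this framing, a compliant service would ship with every such flag off and treat any relaxation of the defaults as a deliberate, recorded user choice rather than a side effect of onboarding.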
The notion that all minors should be treated like children is problematic, reflecting a lack of real understanding of the developmental differences between kids, tweens, and teens. Even more onerous from an implementation standpoint are the obligations to provide very different and specific types of notices depending on the age of the “child.” For digital services that are targeted to different age ranges, the operational obligation will be significant, especially considering the small screen sizes of mobile devices. A further concern is that the code will force businesses to collect more, not less, data about children and, in particular, to collect and retain age data in circumstances where doing so is not permitted or is discouraged under other laws, such as the U.S. Children’s Online Privacy Protection Act (COPPA).
The code departs from existing, accepted definitions of a “child” reflected in privacy, advertising, and product safety laws. For example, COPPA applies to operators of websites or online services that are either directed to children under 13 or have actual knowledge that they are collecting personal information online from a child under 13. COPPA does not require operators to guess whether kids might visit a site not designed with them in mind. Such sites may assume that their visitors are not under 13 rather than collect and retain birthdates. Nor does COPPA obligate general audience sites, such as e-commerce sites, to seek out age information. Similarly, the U.S. Consumer Product Safety Improvement Act (CPSIA) defines a “children’s product” as one designed and intended primarily for children 12 and younger. Defining “children” to include all minors is likewise inconsistent with decades of child development research on advertising to children, which generally treats children as those under approximately age 12. Defining a child as anyone under 18 is also inconsistent with Article 8.1 of the EU General Data Protection Regulation (GDPR), which imposes a default age of 16 but allows member states to set the age of a child between 13 and 16. (Ironically, the UK set its GDPR age of consent at 13.) The International Chamber of Commerce Marketing and Advertising Reference Guide on Advertising to Children provides useful background on why it makes sense to distinguish between children and teens for advertising and privacy purposes.
While the code does not have the force of law, it is persuasive in ICO and court determinations and will be a key measure of compliance with the UK Privacy and Electronic Communications Regulations and the GDPR. And, as under the GDPR, penalties can reach £17 million or 4% of annual global turnover. Businesses that fail to comply with the code therefore could face added scrutiny by the UK ICO, leaving them potentially vulnerable to punitive fines. If approved by Parliament, the code is expected to take effect in 2021.
Unfortunately, despite the ICO’s statements about the code’s necessity and achievability, operationalizing its standards will be enormously difficult, and the extent to which it will actually enhance children’s privacy is questionable. Nevertheless, Ireland’s Data Protection Commission (DPC) has been conducting its own consultation on children’s privacy and may consider similar approaches.
The code presents some conflicts for global businesses that have applied COPPA as the gold standard for children’s privacy protection. And while merely making a digital service available to UK or international visitors is likely not enough to trigger application of the code, businesses may choose to geo-gate and block UK visitors instead. As more countries adopt additional prescriptive requirements and guidance on privacy, the possibility of conflicts and inconsistencies is real, creating a confusing landscape for consumers and businesses alike.
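The geo-gating option mentioned above can be sketched in a few lines. This is a minimal illustration, not a recommended implementation: it assumes an upstream GeoIP service has already resolved the visitor’s IP address to an ISO 3166-1 alpha-2 country code, and the function name and blocked-region set are hypothetical.

```python
# Minimal sketch of geo-gating UK visitors. Assumes the visitor's IP has
# already been resolved to an ISO 3166-1 alpha-2 country code by an
# upstream GeoIP lookup; names here are illustrative only.
BLOCKED_REGIONS = {"GB"}  # ISO country code for the United Kingdom

def should_block(country_code: str) -> bool:
    """Return True if the visitor's region is geo-gated."""
    return country_code.upper() in BLOCKED_REGIONS
```

A request handler could then return an HTTP 403 (or a region-unavailability notice) for blocked visitors and serve the site normally to everyone else. Whether geo-blocking is a sensible trade-off, of course, depends on the business’s UK audience and risk tolerance.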