UK ICO Finalizes Rules for Children’s Content

The UK Information Commissioner’s Office (ICO) recently finalized its Age-appropriate design: a code of practice for online services (the code). The code applies to any “relevant information society services which are likely to be accessed by children” (by which the ICO means minors under age 18), whether designed for kids or general audiences. The final version makes few significant changes from the consultation draft circulated in May 2019. The ICO added a 12-month transition period and issued industry-specific guidance for media companies; however, most of the substance of the code remains the same. It calls on companies to adopt a risk-based and proportionate approach to age verification and to determine whether their services are “likely to be accessed by children.” While the finalized code offers examples of how a business might ascertain age and whether minors are likely to visit a website or service, it fails to provide a specific, workable definition of “likely to be accessed by children” or technical guidance on how to make that determination. The code is not a law, but “it sets standards and explains how the General Data Protection Regulation applies in the context of children using digital services.”

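By way of illustration only, the kind of risk-based, proportionate approach to age assurance the code contemplates might be sketched in Python as follows; the risk tiers and assurance methods shown are assumptions for discussion, not requirements drawn from the code.

    # Illustrative sketch only: the code does not prescribe specific age-assurance
    # methods, so the risk tiers and techniques below are assumptions.
    from enum import Enum

    class DataRisk(Enum):
        LOW = 1      # e.g., minimal personal data, no profiling
        MEDIUM = 2   # e.g., behavioural advertising, social features
        HIGH = 3     # e.g., geolocation sharing, sensitive data

    def select_age_assurance(risk: DataRisk) -> str:
        """Pick an age-assurance approach proportionate to the data-protection risk."""
        if risk is DataRisk.LOW:
            return "self-declaration age gate"
        if risk is DataRisk.MEDIUM:
            return "age estimation from account and usage signals"
        return "verified parental consent or hard identity verification"

    print(select_age_assurance(DataRisk.MEDIUM))

The point of the sketch is that the proportionality judgment, not the mechanics, is the hard part; the code leaves businesses to decide where each service falls.
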
The updated code still defines “children” as minors under 18, citing the UN Convention on the Rights of the Child. It requires that the best interests of the child be foremost when processing personal data of children. Companies must adhere to 15 new standards, starting with privacy-by-design. The code directs businesses to carry out data protection impact assessments, apply data minimization principles, and avoid “nudge” techniques. The initial draft described “nudge” techniques broadly, generating strong criticism that the ICO was straying into advertising issues outside its purview; the final version clarifies that the focus is on nudge techniques that encourage children to disclose unnecessary personal data or to weaken or turn off privacy controls. Default settings for services should be “high privacy,” and geolocation tracking and profiling should be given a default setting of “off.”

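As a rough sketch of what “high privacy” defaults could look like in practice, the Python fragment below turns geolocation tracking and profiling off by default; the setting names and the additional conservative defaults are illustrative assumptions, not terms drawn from the code.

    # Hypothetical default-settings sketch: the field names are illustrative.
    from dataclasses import dataclass

    @dataclass
    class PrivacySettings:
        geolocation_tracking: bool = False       # off by default, per the code's standard
        profiling: bool = False                  # off by default, per the code's standard
        personalised_advertising: bool = False   # conservative, assumed default
        third_party_data_sharing: bool = False   # conservative, assumed default

    def default_settings_for_child_user() -> PrivacySettings:
        """Return 'high privacy' defaults for a user who is, or may be, under 18."""
        return PrivacySettings()

    print(default_settings_for_child_user())
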
The notion that all minors should be treated like children is problematic, reflecting a lack of real understanding of the developmental differences between kids, tweens, and teens. Even more onerous from an implementation standpoint are the obligations to provide very different and specific types of notices depending on the age of the “child.” For digital services that target multiple age ranges, the operational burden will be significant, especially given the small screen sizes of mobile devices. Importantly, the code may force businesses to collect more, not less, data about a child; specifically, to collect and retain data about a user’s age in circumstances where doing so is not permitted, or is discouraged, under other laws such as the U.S. Children’s Online Privacy Protection Act (COPPA).

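To illustrate the operational point about age-tailored notices, the sketch below selects a notice style by age band; the cut-offs and formats are hypothetical and are intended only to show how quickly the branching multiplies, not to restate the code’s own age ranges.

    # Hypothetical age bands and notice formats, for illustration only.
    def notice_template(age: int) -> str:
        """Select a privacy-notice style for a given user age."""
        if age <= 5:
            return "audio/visual cues aimed largely at parents"
        if age <= 9:
            return "very short, simple wording with graphics"
        if age <= 12:
            return "plain-language summary plus layered detail"
        if age <= 15:
            return "teen-oriented summary with the full policy linked"
        if age <= 17:
            return "near-adult notice with prominent privacy controls"
        return "standard adult privacy notice"

    for test_age in (4, 8, 11, 14, 17, 30):
        print(test_age, "->", notice_template(test_age))

Even this toy version requires the service to know, and therefore to collect, the user’s age, which is precisely the tension with COPPA noted above.
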
The code departs from existing, accepted definitions of a “child” reflected in privacy, advertising, and product safety laws. For example, COPPA applies to operators of websites or online services that are either directed to children under 13 or have actual knowledge that they are collecting personal information online from a child under 13. COPPA does not require operators to guess whether kids might visit a site not designed with them in mind; sites directed to children can simply treat visitors as under 13 rather than collect and retain birthdates. And COPPA does not obligate general audience sites, such as e-commerce sites, to seek out age information. Similarly, the U.S. Consumer Product Safety Improvement Act (CPSIA) defines a “children’s product” as one designed and intended primarily for children 12 and younger. Defining “children” to include all minors is likewise inconsistent with decades of child development research on advertising to children, which generally treats children as those around age 12 and under. Defining a child as anyone under 18 is also inconsistent with Article 8.1 of the EU General Data Protection Regulation (GDPR), which imposes a default age of 16 but allows member states to set the age of a child between 13 and 16. (Ironically, the UK set its GDPR age of consent at 13.) The International Chamber of Commerce Marketing and Advertising Reference Guide on Advertising to Children provides useful background on why it makes sense to distinguish between children and teens for advertising and privacy purposes.

While the code does not have the force of law, it is persuasive in ICO and court determinations and will be a key measure of compliance with the UK Privacy and Electronic Communications Regulations and the GDPR. And, as under the GDPR, penalties can reach £17 million or 4% of global turnover. Businesses that fail to comply with the code therefore could face added scrutiny from the ICO, leaving them potentially vulnerable to punitive fines. If approved by Parliament, the code is expected to take effect in 2021.

Unfortunately, despite the ICO’s statements about the code’s necessity and achievability, operationalizing its standards will be enormously difficult, and the extent to which it will actually enhance children’s privacy is questionable. Nevertheless, Ireland’s Data Protection Commission (DPC) has been conducting its own consultation on children’s privacy and may consider similar approaches.

The code presents some conflicts for global businesses that have applied COPPA as the gold standard for children’s privacy protection. And while merely making a digital service available to UK or international visitors is likely not enough to trigger application of the code, some businesses may choose to geo-gate their services and block UK visitors rather than take on the compliance burden. As more countries adopt additional prescriptive privacy requirements and guidance, the possibility of conflicts and inconsistencies is real, creating a confusing landscape for consumers and businesses alike.

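A business that chose the geo-gating route might implement something as simple as the sketch below, which assumes an upstream geo-IP lookup has already resolved the visitor’s country code; the helper and the blocklist are hypothetical.

    # Geo-gating sketch: assumes the country code comes from an upstream geo-IP lookup.
    BLOCKED_COUNTRIES = {"GB"}   # ISO 3166-1 alpha-2 code for the United Kingdom

    def should_block(country_code: str) -> bool:
        """Return True if the visitor should see a block page instead of the service."""
        return country_code.upper() in BLOCKED_COUNTRIES

    print(should_block("GB"))  # True
    print(should_block("US"))  # False
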
© 2020 Keller and Heckman LLP. National Law Review, Volume X, Number 56

About the Authors

Sheila A. Millar
Partner

Sheila A. Millar counsels corporate and association clients on advertising, privacy, product safety, and other public policy and regulatory compliance issues.

Ms. Millar advises clients on an array of advertising and marketing issues.  She represents clients in legislative, rulemaking and self-regulatory actions, advises on claims, and assists in developing and evaluating substantiation for claims. She also has extensive experience in privacy, data security and cybersecurity matters.  She helps clients develop website and app privacy policies,...

202-434-4646
Tracy Marshall
Partner

Tracy Marshall assists clients with a range of business and regulatory matters.

In the business and transactional area, Ms. Marshall advises for-profit and non-profit clients on corporate organization, operations, and governance matters, and assists clients with structuring and negotiating a variety of transactions, including purchase and sale, marketing, outsourcing, and e-commerce agreements.

In the privacy, data security, and advertising areas, she helps clients comply with privacy, data security, and consumer protection laws, including laws governing telemarketing and commercial e-mail messages, contests and sweepstakes, endorsements and testimonials, marketing to children, and data breach notification. Ms. Marshall also helps clients establish best practices for collecting, storing, sharing, and disposing of data, and manage outsourcing arrangements and transborder data flows. In addition, she assists with drafting and implementing internal privacy, data security, and breach notification policies, as well as public privacy policies and website terms and conditions.

As to intellectual property matters, Ms. Marshall helps clients protect their copyrights and trademarks through registration, enforcement actions, and licensing agreements.

She also represents clients in proceedings before the Federal Communications Commission and Federal Trade Commission.

Ms. Marshall is a Certified Information Privacy Professional (CIPP/US) through the International Association of Privacy Professionals (IAPP) and a contributing author of Beyond Telecom Law Blog and Consumer Protection Connection.

Education: Washington and Lee University (B.A., 1997); American University, Washington College of Law (J.D., 2002).

Admissions: District of Columbia; Maryland

Memberships: American Bar Association

202-434-4234