KIDS Act Would Expand Existing Federal Protections Under COPPA and Require Significant Changes to Existing Business Models
The Kids Internet Design and Safety (KIDS) Act (the “Act”), introduced in March of this year, would prohibit operators (“Operators”) of commercial websites, online services, online applications, and mobile applications (“Platforms”) directed to children aged 15 and under (“Covered Users”) from incorporating certain features such as auto-play and push alerts, from “amplifying” or encouraging certain dangerous, adult, or “wholly commercial” content, and from employing certain advertising strategies construed as manipulative. Businesses that violate the KIDS Act, or any rules that may be issued under it, may be subject to a civil penalty, and any such violation would be treated as an unfair or deceptive business act or practice under the Act.
If passed, the KIDS Act would expand on current protections for children such as the Children’s Online Privacy Protection Act (COPPA), which, as we’ve discussed more fully here, requires websites directed at children to notify parents and obtain consent before collecting personal information from users aged 12 and under. Violations of COPPA can result in significant monetary penalties for violators. Like COPPA and the Protecting the Information of Our Vulnerable Children and Youth Act (PRIVCY Act), which would itself significantly expand and alter COPPA, the KIDS Act would continue a trend of expanding protective measures for children amid rapid technological advancement and change. Although the bill is important to certain legislators and to groups such as the Common Sense Media children’s advocacy group, which aided in crafting the legislation, it is unclear in the current political climate whether a bill such as the KIDS Act - which could impose significant additional restrictions on businesses such as social media companies, online sellers and companies that maintain applications - would pass.
The KIDS Act, if made law, would apply to any Platform “directed to children,” meaning one that targets Covered Users, as demonstrated by a number of factors such as its subject matter and visual content, use of animated characters, music or other audio content, or empirical evidence relating to the composition and intended composition of the audience, among others.
Platform Interface Restrictions Would Require Redesigning Many Existing Software Systems
Once a Platform is deemed “directed to children,” it would become unlawful for that Platform to incorporate any of the following features:
Auto-play that commences without input from the user;
Push alerts that urge a Covered User to spend more time engaging with the Platform when not actively using it;
“Likes,” or other methods of displaying the quantity of positive engagement or feedback that a Covered User receives from other users;
Badges or other visual award symbols based on elevated levels of engagement with the Platform;
Any design feature or setting that “unfairly” encourages a Covered User, due to their age or inexperience, to make purchases, submit content, or spend more time engaging with the Platform.
Certain Existing Advertising Mechanisms and Content Would be Prohibited
Further, Platforms “directed to children” would be prohibited from employing certain advertising methods, including:
Directing content to Covered Users that includes host-selling (i.e. “commercial video content that features the same characters or individuals as in the adjacent noncommercial content”);
Exposing Covered Users to program-length advertisements, a term that is to be defined by regulation implemented by the Federal Trade Commission;
Directing branded content (i.e. “commercial content created for, and distributed on a Platform in such a way that the line between entertainment and advertising becomes unclear in order to generate a positive view of the brand”) or native advertisements (i.e. “a form of paid media where the advertising experience follows the natural form and function of the user experience in which it is placed”) to Covered Users;
Directing online advertising or material with considerable commercial content involving alcohol, nicotine, or tobacco to Covered Users (in certain cases such advertising may already be prohibited under federal or state law);
Directing content that includes product placement to Covered Users.
Finally, certain restrictions included in the KIDS Act would apply to Operators who have constructive knowledge that Covered Users use their Platforms, even if such Platforms are not “directed to children.” Such restrictions would make it unlawful for such Operators to “amplify, promote or encourage” the consumption of content that includes “sexual material; physical or emotional violence, including bullying; adult activities, including gambling; or other dangerous, abusive, exploitative, or wholly commercial content.” The KIDS Act would also require such Operators to provide users a mechanism for reporting suspected violations of such restrictions and would forbid Operators from using age verification information collected from Covered Users for commercial purposes. Unfortunately, the Act does not provide a test for determining whether an Operator is deemed to have “constructive knowledge” that Covered Users use its Platform, nor any guidance regarding actions Operators may take to mitigate the risk of being imputed with such constructive knowledge.
The KIDS Act also contains provisions relating to marketing and commercialization that, if enacted, would take effect within a year of the Act’s enactment, such as: (i) creating a labeling system to allow caregivers to “identify noncommercial, educational and enriching content,” and (ii) requiring an annual audit of the top 25 Platforms directed to children.
Businesses Would Need To Redesign User Interfaces and Existing Advertising Practices
The main takeaways from the proposed KIDS Act are that (i) certain widely employed advertising methods, as well as popular features such as auto-play, push alerts, badges, rewards and other positive engagement tools – including displaying “likes” from other users – would need to be removed entirely from Platforms directed to Covered Users, and (ii) even if a Platform is not explicitly directed to Covered Users, an Operator with constructive knowledge that Covered Users visit its Platform would still be subject to restrictions on amplifying certain content and would need to create a mechanism for reporting suspected violations of such restrictions.
Many online retailers, influencers, advertisers, and other businesses utilize branded content, product placement, host-selling, and likes and other “positive engagement” tools covered by the KIDS Act. The KIDS Act would require significant changes by many of the tech giants such as Google (YouTube) and Facebook (Instagram) – and would have an impact on all commercial websites, online services, online applications and mobile applications directed to Covered Users (or whose Operators have constructive knowledge that Covered Users use such Platforms), many of which would need to employ new strategies for engaging with their Covered Users.