California Moves Closer to Enacting More Stringent Online Privacy Protections for Children
For years now, California has led the way by setting the standard for privacy and data protection regulation in the United States. Recently—and as calls for greater controls over the addictive nature of social media grow louder—legislators in the Golden State have moved closer toward enacting a new, first-of-its-kind privacy law that would prohibit the development and utilization of “addictive” features by social media platforms. At the same time, state legislators also advanced a second bill that would put in place stringent online privacy protections for minors.
Businesses should monitor the progress of these bills closely, as their enactment—combined with an increased focus on children’s privacy by both federal lawmakers and the Federal Trade Commission (“FTC”)—may have a ripple effect in other states and municipalities, with legislators following close behind to enact similar children’s online privacy laws.
Social Media Platform Duty to Children Act
At the start of the year, California lawmakers introduced the Social Media Platform Duty to Children Act, AB 2408 (“SMPDCA”), a targeted privacy bill seeking to curb the addictive nature of social media and its increasingly negative impact on children. At the end of June, California’s Senate Judiciary Committee passed an amended version of the bill, which now moves to the Senate Appropriations Committee. While the original version of the SMPDCA included a private right of action allowing parents to sue tech companies for violations of the law, the amended bill removes the ability to pursue class action litigation, leaving enforcement exclusively in the hands of the California Attorney General and the state’s district attorneys, county counsel, and city attorneys.
Under the SMPDCA, social media platforms are prohibited from using “a design, feature, or affordance that the platform knew, or which by the exercise of reasonable care should have known, causes child users to become addicted to the platform.”
Significantly, however, the bill would allow platforms to shield themselves from liability under the bill’s safe harbor provision, which requires platforms to satisfy two conditions: (1) implementation and maintenance of a program for regular auditing to detect practices or features that have the potential to cause or contribute to the addiction of child users; and (2) within 30 days of the completion of any audit, correction of any practices or features detected through the audit that present more than a de minimis risk of violating the law.
Violations of the SMPDCA would subject social media platforms to civil penalties of up to $2,500 per negligent violation and $7,500 per intentional violation. In addition, any platform that knowingly and willfully violates the law would be subject to an additional civil penalty of up to $250,000 per violation, as well as attorneys’ fees and litigation costs.
California Age-Appropriate Design Code Act
Also this year, state lawmakers introduced a second proposed piece of legislation—the California Age-Appropriate Design Code Act, AB 2273 (“AADC”)—which would impose a range of requirements and restrictions on online businesses that offer services, products, or features likely to be accessed by children. At the end of June, an amended version of the AADC was advanced out of the California State Senate Judiciary Committee and is now scheduled for a hearing before the Senate Appropriations Committee at the start of August. If enacted, the AADC would go into effect on January 1, 2024.
Modeled after the United Kingdom’s Age Appropriate Design Code, the AADC—which seeks to “elevate child-centered design in online products and services that are likely to be accessed by children”—would mandate that companies consider the best interests of children when designing, developing, and providing services, products, or features that children are likely to access, while at the same time requiring companies to “prioritize the privacy, safety, and well-being of children” in the event a conflict arises between the company’s commercial interests and the best interests of children.
Under the AADC, companies would be required to: (1) undertake a data protection impact assessment and provide a report of the assessment to the California Privacy Protection Agency (“CPPA”); (2) configure all default privacy settings relating to any online service or product to provide a high level of privacy protection; (3) provide any privacy disclosures—such as privacy policies and terms of service—concisely, prominently, and using clear language suited to the age of children likely to access that online service/product/feature; (4) where parents or guardians have the ability to monitor a child’s online activity or location, provide an obvious signal to the child when they are being monitored/tracked; and (5) provide prominent, accessible, and responsive tools to help children, or where applicable their parent or guardian, exercise their privacy rights and report concerns.
At the same time, the AADC would also prohibit: (1) using the personal information of any child in a way that more likely than not causes or contributes to a “more than de minimis” risk of harm to the physical health, mental health, or well-being of a child; (2) profiling children by default; (3) collecting the precise geolocation information of children by default or without providing an obvious sign to children that such information is being collected; and (4) using dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected to provide a service, product, or feature.
Violations of the AADC would subject companies to civil penalties per affected child of up to $2,500 for negligent violations and up to $7,500 for intentional violations.
Analysis & Takeaways
Both California bills come as privacy advocates increasingly call for greater protection of children online. In addition, both President Biden and federal lawmakers have voiced the growing need for greater regulation of companies that cater to children’s online activities. At the same time, in May of this year, the FTC adopted a policy statement calling for increased scrutiny of Children’s Online Privacy Protection Act (“COPPA”) violations—a move that, as President Biden noted in recent remarks, demonstrates that the FTC will focus on “cracking down on companies that persist in exploiting our children to make money” for the foreseeable future.
Taken together, these developments signal that companies offering services or products to children online should closely monitor the progress of California’s pending children’s online privacy bills while re-assessing their own privacy practices and compliance programs to prepare for the increased scrutiny they are likely to face from all levels of government moving forward.