11/19/2024 | Press release
The Association of National Advertisers (ANA) held its annual Masters of Advertising Law Conference November 11-13 in sunny Scottsdale, Arizona, and wow, children's advertising and privacy was a hot-button topic, with several panels devoted to these issues. The Children's and Teens' Online Privacy Protection Act (COPPA 2.0), the Kids Online Safety Act (KOSA), several state laws and, of course, the specter of artificial intelligence (AI) all loomed large over the conference panels. We provide an overview below.
Much like the panelists at the ANA conference, we have talked extensively about COPPA 2.0 and its journey through Congress. The Senate overwhelmingly passed its version of the bill on July 30, and more recently the House version was approved by the Committee on Energy and Commerce. COPPA 2.0 would significantly strengthen the original Children's Online Privacy Protection Act (COPPA) in several ways, including by raising the covered age to 16, banning targeted ads to children and teens, and creating an "Eraser Button" to allow for the deletion of personal information. While the bill would prohibit targeted advertising to people under the age of 16, contextual advertising that does not use personal information would remain a valid and effective strategy. COPPA 2.0 would also establish a Youth Privacy and Marketing Division within the Federal Trade Commission (FTC), tasked with addressing privacy and marketing practices impacting children and teens. Although it remains to be seen what the FTC will look like under the new administration, there is a chance the bill could pass during the lame-duck session of Congress that began on November 12, and we will continue to monitor its progress.
KOSA is on a similar path to COPPA 2.0, having passed the Senate and now working its way through the House. If the bill were to become law, it would create more safeguards for children on the Internet. Under KOSA, platforms would be required to enable the strongest possible privacy settings by default when they know the user is under the age of 16. KOSA would also require platforms to give parents controls to help protect their children, along with a dedicated channel for reporting harmful behavior. Additionally, the Senate version of the bill would create a duty of care for online platforms to design their features so as to prevent and mitigate specific dangers to minors, such as eating disorders, anxiety and substance abuse. This duty of care has been stripped from the House version of the bill, so it remains to be seen how the difference will be resolved.
Several states have entered the fray and passed their own laws aimed at protecting children online. On June 20, New York Gov. Kathy Hochul signed the Stop Addictive Feeds Exploitation (SAFE) For Kids Act, which prohibits social media platforms from providing addictive feeds to minors under the age of 18 without parental consent. The act also prohibits social media companies from sending push notifications to kids and teens between the hours of midnight and 6 a.m. The Maryland Age-Appropriate Design Code Act was signed into law on May 9. It requires covered entities to configure all default privacy settings provided to children (defined as under the age of 18) to the highest level possible and to provide age-appropriate privacy information in clear language suited to the ages of the children likely to access the online product. California passed its own Age-Appropriate Design Code Act in 2022, but it has been subject to an injunction. Following a ruling by the U.S. Court of Appeals for the Ninth Circuit, the case has been remanded to the district court for further consideration of the statute, specifically the portions that restrict the collection and use of children's data.
Gaming and the metaverse continue to be hot topics for children's privacy and advertising. As games grow more and more pervasive, the age-old question of "Is it content or is it an ad?" grows ever more important. The Children's Advertising Review Unit's (CARU) guidelines provide specific ways for advertisers to prevent "blurring" (i.e., the mixing of advertising and entertainment content), including by requiring disclosures and contextual cues to inform children which is which. To keep a game or metaverse experience in the realm of entertainment rather than advertising, advertisers should consider how featured products are depicted. An exact replica of a product could be seen as more of an advertisement, while changing up the colors, features and dimensions makes an experience more entertainment-esque. Instead of using the experience to walk through the bells and whistles of a widget, the widget should be part of the adventure or storytelling. And companies must always ensure that proper disclosures are included: because children may not understand that "#Ad" means a game is sponsored content, a phrase such as "brought to you by" should be used to make it clear to younger audiences.
As we've discussed previously, the FTC is holding a workshop in 2025 on what it is calling the "attention economy." The attention economy is a fancy way of describing the hours we spend aimlessly scrolling on our phones and tablets. There are some simple fixes to make a game or experience more kid-friendly, such as allowing autosaves, encouraging breaks through the user experience and avoiding rewards that can only be won by not breaking a streak.
AI is the hot new thing: it's everywhere, and everyone wants a piece of the action. CARU issued a compliance warning letter in May putting advertisers on notice. The letter was designed to remind advertisers that CARU's guidelines apply to "all advertising, in any medium, directed to children under age 13, including advertising that uses AI to create or disseminate the ad." CARU was particularly concerned about the ability of AI to mislead children through AI-generated deepfakes and simulations of realistic people or known characters, as well as the power of AI voice-cloning techniques. In August, CARU announced a settlement with KidGeni, a generative AI platform designed for children, for violations of CARU's guidelines and COPPA. CARU found that KidGeni failed to provide clear notice of its information collection practices, did not make reasonable efforts to ensure parents/guardians receive direct notice of the company's data practices and failed to obtain verifiable parental consent prior to collecting children's personal information. For companies using AI with children, a few commonsense tips bear repeating. All ads, whether or not they are created with AI, need to be substantiated. Disclaimers must be written in a way that makes it abundantly clear to children that AI is being used. And platforms must ensure children are not feeding their personal information into a generative AI program that is not COPPA compliant.
Influencers continue to reign supreme, and child influencers are no exception. On January 1, Illinois' child influencer law went into effect, and several other states appear to be considering their own versions. The Illinois law mandates that if a child appears in at least 30 percent of a parent's or caregiver's social media content over a 30-day period and the number of views received per video meets the online platform's threshold for compensation, that child must receive monetary compensation. The parent or caregiver must set aside these funds in a trust, or as it's known to those of us in showbiz, a Coogan account. Coogan accounts, named after Jackie Coogan, the successful child actor who later played Uncle Fester on "The Addams Family," are designed to prevent parents from spending the hard-earned money made by child actors. Additionally, when influencers are involved, child or not, content must always comply with the FTC's Endorsement Guides. If an influencer's ad is directed to children, the disclosures should be easily understood by children, and the content should avoid depicting dangerous activities and should model safe behaviors.
See? We weren't kidding around; this has been a big year for children's advertising and privacy. We will continue to monitor COPPA 2.0, KOSA and the myriad other issues discussed here so you can be the coolest kid on the playground.