Youth Privacy and Data Collection Online
While technology offers children the benefits of social connection and belonging, it also has adverse effects on children’s and teens’ physical and mental well-being and raises concerns about how their personal data is used. Children are accessing the Internet from an early age, and they and their parents need to understand what data is being collected and how it is being used if they are to give informed consent to such uses. Unfortunately, children are also more likely to be the targets of criminal behavior online. In his 2023 State of the Union Address, President Biden urged Congress to pass legislation to stop Big Tech from collecting personal data about children online, to ban advertising targeted to children and to impose stricter limits on the personal data companies collect on all users.
Children’s Online Privacy Protection Act (COPPA)
As we have discussed previously, COPPA, enforced by the Federal Trade Commission (“FTC”), applies specifically to websites or online services targeted toward children under the age of 13 or to general audiences where the operator has actual knowledge that it is collecting personal information from children under 13. COPPA requires clear notice of the data collection, including links to the website’s privacy policy. It also requires that verifiable parental consent be obtained before any personal information is collected from children and restricts the collection of personal information to only that which is reasonably necessary. COPPA includes a preemption clause that blocks states and private plaintiffs from seeking relief under inconsistent state law (“[n]o State or local government may impose any liability for commercial activities or actions…that is inconsistent with the treatment of those activities under this section”). In December 2022, a Ninth Circuit panel in Jones v. Google LLC, 56 F.4th 735, 741 (9th Cir. 2022), addressed whether COPPA’s preemption clause is so broad that it prohibits parents from suing Google and YouTube for allegedly using persistent identifiers (i.e., information that can be used to recognize a user across different sites over time) to collect data and track minor children’s online behavior without their consent. The Ninth Circuit held that Congress did not intend the preemption clause to create an exclusive remedial scheme and that the clause does not bar state law causes of action that proscribe the same conduct as, and are consistent with, COPPA.
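To make the term concrete, the following is a minimal, hypothetical sketch of a persistent identifier: a long-lived cookie that lets a service recognize the same browser across visits. Every name and value here is an illustrative assumption, not the actual code at issue in Jones.

```typescript
// Hypothetical sketch of a "persistent identifier": a long-lived cookie
// that lets a service recognize the same browser across visits.
// All names and values are illustrative, not the code at issue in Jones.
import { randomUUID } from "node:crypto";

function persistentIdCookie(existingId?: string): string {
  // Reuse the identifier if the browser already presented one;
  // otherwise mint a new random ID that will follow this user.
  const id = existingId ?? randomUUID();
  const oneYear = 60 * 60 * 24 * 365; // Max-Age in seconds
  // The long Max-Age is what makes the identifier "persistent": it
  // survives the browsing session and can be matched on later visits.
  return `uid=${id}; Max-Age=${oneYear}; Path=/; SameSite=Lax`;
}

// Example: a server would send this string as a Set-Cookie header.
console.log(persistentIdCookie());
```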
The California Age-Appropriate Design Code (CAADC)
In August 2022, California legislators passed the California Age-Appropriate Design Code (“CAADC”), effective July 1, 2024, which significantly expands the protections of COPPA by requiring businesses that provide any online service, product or feature likely to be accessed by children to comply with specific obligations.
The CAADC encourages businesses to prioritize children’s best interests over commercial interests. Covered entities are for-profit entities that meet one or more of the following criteria: (1) have $25 million or more in annual gross revenue, (2) buy or sell the personal information of 100,000 or more users or (3) derive 50% or more of their annual revenue from selling or sharing consumers’ personal information. Businesses that meet these criteria and provide an online service, product or feature likely to be accessed by children are within the scope of the CAADC. These entities have a duty of care to minimize the data collected from minors and to avoid using personal data in ways that are detrimental to children’s physical and mental health. In addition to evaluating privacy risks, businesses must analyze whether their algorithms could expose children to inappropriate or harmful material. Unlike COPPA, the CAADC applies to online services and sites likely to be accessed by children under 18, rather than under 13. As a result of the more rigorous compliance obligations under the CAADC, Google has made SafeSearch the default browsing mode for all users under 18, and TikTok and Instagram have disabled direct messages between children and adults they do not follow. Under the CAADC, “likely to be accessed by children” means there is a reasonable expectation that the online service, product or feature is:
1. directed to children as defined by [COPPA];
2. determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children;
3. advertised and marketed to children;
4. substantially similar or the same as an online service, product or feature subject to item (2);
5. designed with elements that are known to be of interest to children, including, but not limited to, games, cartoons, music and celebrities who appeal to children; or
6. determined to have a significant percentage of children among its audience, based on internal company research.
The CAADC does not require entities to verify the ages of their users, but they must estimate the age of minor users with a “reasonable level of certainty appropriate to the risks” or treat all users as though they are children under the CAADC. In practice, businesses must either provide both a youth and an adult version of their service and implement measures to sort users into those two pools, or design the entire service to comply with the requirements of the CAADC.
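A minimal sketch of that sorting decision follows, under assumed inputs: a hypothetical age-estimation signal with a confidence score. The names and thresholds are illustrative, not drawn from the statute.

```typescript
// Hypothetical sketch of the two CAADC compliance paths described above:
// sort users into a youth pool and an adult pool based on an age estimate,
// or, absent a sufficiently confident estimate, default to the protective
// (youth) experience. All names and thresholds are illustrative assumptions.
type Experience = "youth" | "adult";

interface AgeSignal {
  estimatedAge: number; // e.g., self-declared birthdate or an estimation vendor's output
  confidence: number;   // 0..1: how certain the estimate is
}

function chooseExperience(signal?: AgeSignal, minConfidence = 0.9): Experience {
  // No signal, or a signal below the risk-appropriate confidence bar:
  // treat the user as a child, as the statute's fallback contemplates.
  if (!signal || signal.confidence < minConfidence) {
    return "youth";
  }
  return signal.estimatedAge >= 18 ? "adult" : "youth";
}

// Example: an unverified visitor gets the youth experience by default.
console.log(chooseExperience());                                        // "youth"
console.log(chooseExperience({ estimatedAge: 34, confidence: 0.95 })); // "adult"
```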
Beyond COPPA’s verifiable parental consent requirement, the CAADC imposes more stringent obligations, including requiring covered businesses to implement age-estimation measures (which may necessitate face scanning and the collection of additional information, raising privacy concerns of its own), to configure all default privacy settings to offer a high level of privacy and to conduct data protection impact assessments. It also requires social media platforms and other websites to turn off features likely to pose risks to younger users, like “friend finders” that enable adult strangers to communicate with children. The CAADC will also create the California Children’s Data Protection Working Group, composed of members with expertise in children’s data privacy and rights.
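As one illustration of what “high privacy by default” might look like in a service’s configuration, here is a hypothetical settings sketch; the field names and values are assumptions, not taken from the statute or any platform.

```typescript
// Hypothetical sketch of "high privacy by default" settings for a minor's
// account. Field names and values are illustrative assumptions, not drawn
// from the statute or any platform's actual configuration.
interface PrivacySettings {
  profileVisibility: "private" | "public";
  allowMessagesFromStrangers: boolean; // the "friend finder"-style contact risk
  preciseGeolocation: boolean;
  personalizedAds: boolean;
}

// Defaults start at the most protective values; loosening any of them
// would require an affirmative choice rather than a buried opt-out.
const minorDefaults: PrivacySettings = {
  profileVisibility: "private",
  allowMessagesFromStrangers: false,
  preciseGeolocation: false,
  personalizedAds: false,
};

console.log(minorDefaults);
```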
Other Federal and State Legislation Addressing Children’s Online Activities
Under both California’s Privacy Rights for California Minors in the Digital World and Delaware’s Online and Personal Privacy Protection Act, minors have the right to request the removal of information posted online, websites are prohibited from advertising products that may not legally be sold to minors, and certain online advertising practices based on minors’ personal information are prohibited. California’s Privacy Rights for California Minors in the Digital World allows those under 18 to “scrub” their digital history. This right has notable limitations: it applies only to content that the minor posted, not to content posted by a third party. Moreover, online service providers are not required to permanently erase content, but only to hide it from view on their sites.
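The distinction between hiding and erasing is essentially the difference between a soft delete and a hard delete. The following is a hypothetical sketch of the two models; the type and function names are illustrative assumptions, not any provider’s actual implementation.

```typescript
// Hypothetical sketch contrasting the two removal models described above:
// a "scrub" that merely hides a minor's post from view (what the law
// requires) versus permanent erasure (which it does not require).
interface Post {
  id: string;
  authorId: string;
  body: string;
  hidden: boolean; // soft-delete flag: the record survives but is not displayed
}

// The statutory "removal": flip a visibility flag; the record remains.
function scrubPost(post: Post): Post {
  return { ...post, hidden: true };
}

// Permanent erasure would instead drop the record entirely, a step
// the law does not compel.
function erasePost(posts: Post[], id: string): Post[] {
  return posts.filter((p) => p.id !== id);
}
```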
Voted out of the Senate Commerce Committee unanimously in July 2022, the Kids Online Safety Act (“KOSA”) is being resurrected by Congress to strengthen data protection and online safety for children up to the age of 16. KOSA has now garnered more than 25 additional bipartisan co-sponsors and has gained the support of groups including the American Psychological Association, the American Academy of Pediatrics and the Eating Disorders Coalition. KOSA sets out requirements for online applications and services that minors are likely to use. The bill mandates that online platforms provide minors or their parents with certain safeguards, such as settings that restrict access to a minor’s personal information, limits on advertising products or services that are illegal to sell to minors and tools for parents to supervise the minor’s use of a platform, such as control of privacy and account settings.
In light of national security concerns, several states and the federal government have banned TikTok from public agency devices. Utah and Montana have passed such laws; Montana went further and banned the app statewide, a law TikTok has already filed suit to challenge. In March 2023, the Utah legislature passed a law that requires parental consent to set up social media accounts, gives Utah parents access to their children’s posts, messages and responses and imposes an overnight social media use “curfew.”
Measures Taken by Online Platforms
To address social media’s potentially harmful effects and discourage legislators from imposing additional regulations, websites are implementing features to make their services safer and age-appropriate. Snapchat introduced new parental controls in its “Family Center,” which give parents more control over the type of content their children are consuming. The controls also give teenagers the ability to notify their parents when they report an account or content on the app and let parents see who their teens are messaging. Significantly, however, this is not the default setting; parents and children need to opt into the service. While there is still a danger of children misrepresenting their age, Snapchat and other social media platforms are using artificial intelligence (“AI”) to look for age mismatches. If a user’s friends are mostly in their early teens, or if a user’s interests align with content most popular with children or teens, AI may reveal the user’s true age. TikTok, which, according to the Pew Research Center, is used by 67% of U.S. teenagers, has also implemented new safety features, such as eliminating direct messaging for young users and providing a screen-time management tool for users and their parents. “Family Pairing” on TikTok allows parents and teenagers to customize their safety settings.
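To illustrate the kind of signal-based check described above, here is a hypothetical sketch of a heuristic that flags a possible age mismatch when a self-declared adult’s friend graph skews young or their interests track child-oriented content. The field names and thresholds are assumptions for illustration, not any platform’s actual model.

```typescript
// Hypothetical age-mismatch heuristic of the kind described above.
// Illustrative simplification only, not Snapchat's or TikTok's model.
interface UserSignals {
  declaredAge: number;
  friendAges: number[];         // declared ages of the user's connections
  childContentAffinity: number; // 0..1 share of engagement with child-oriented content
}

function medianOf(values: number[]): number {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

function flagPossibleAgeMismatch(u: UserSignals): boolean {
  if (u.declaredAge < 18) return false; // only checking self-declared adults here
  // Friend graph skews young: median connection age in the early teens.
  const youngFriends = u.friendAges.length > 0 && medianOf(u.friendAges) <= 15;
  // Interests align with content most popular with children or teens.
  const childInterests = u.childContentAffinity >= 0.6; // illustrative threshold
  return youngFriends || childInterests;
}

// Example: a "25-year-old" whose friends are mostly 13-14 gets flagged.
console.log(flagPossibleAgeMismatch({
  declaredAge: 25,
  friendAges: [13, 13, 14, 14, 15],
  childContentAffinity: 0.2,
})); // true
```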
A Brief Word About the Family Educational Rights and Privacy Act (“FERPA”)
The Family Educational Rights and Privacy Act (“FERPA”), which applies to all educational institutions that receive federal funding and addresses privacy protections for students and their education records, is largely beyond the scope of this Insight. However, there are certain overlaps with broader privacy issues, as FERPA does not prohibit schools from selling student directory information to commercial entities. The industry-led K-12 School Service Provider Pledge to Safeguard Student Privacy (“Pledge”) does prohibit signatories from selling student personal information or using collected information to target advertising to students. Elements of the Pledge have been incorporated into many state laws, but signing the Pledge itself is voluntary.
Measures taken by online platforms and by federal and state legislators are welcome and long overdue, but they still fall short of addressing critical privacy concerns for children and teenagers, the most vulnerable audiences on the Internet. Increasing digital literacy and awareness of these issues among young people and their parents will also promote informed and responsible Internet usage. Lutzker & Lutzker will continue to monitor and report on developments addressing the critical issue of youth privacy.