The current COVID-19 crisis has increased the importance of online services that provide access to healthcare, entertainment, fitness, and education. However, many businesses that provide these services, including service providers, publishers, app developers, and content creators, may find that their services are increasingly being used by unintended audiences. Prior to COVID-19, many of these businesses had not targeted children under the age of 13, and as a result, businesses such as Zoom[1] had not meaningfully assessed their obligations under the Children’s Online Privacy Protection Act (COPPA) before offering their services for use by businesses and schools. Moreover, because many businesses and schools are using technologies not originally marketed, developed, or used in these environments, they are potentially collecting personal information from children under 13 in violation of COPPA. Violators of COPPA can be fined up to $43,280 per violation, and publishers, app developers, and service providers can be held responsible for the content published on their services. For example, in September 2019, the Federal Trade Commission (FTC), the primary regulator of COPPA, and the New York Attorney General reached a $170 million settlement with Google and YouTube for such violations.[2]
COPPA requirements
Generally, COPPA requires that commercial operators of websites and online services “directed to children” under 13 post a notice of their privacy practices, provide direct notice to parents, and obtain verifiable parental consent before collecting, using, or disclosing personal information from children.[3] A website or online service is considered “directed to children” based on factors that include its subject matter, visual content, the age of models, the kind of music or other audio content, the use of animated characters, child-oriented activities and incentives, the presence of child celebrities, the language or other characteristics of the site, advertising that promotes or appears on the site and is directed to children, and competent and reliable empirical evidence about the age of the audience.[4] As a result, even a website or service intended for a general audience can be covered when these factors show that portions of the service are “directed to children,” or when the operator has “actual knowledge” that it is collecting personal information from children under 13.
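For app developers assessing these obligations, the consent requirement can be thought of as a gate placed in front of any collection of personal information. The following is a minimal, hypothetical sketch of such a gate; the names used (User, ConsentStore, collectPersonalInfo) are illustrative assumptions, not a real API or an FTC-prescribed mechanism.

```typescript
// Minimal sketch of a consent gate. All names here (User, ConsentStore,
// collectPersonalInfo) are hypothetical and for illustration only.

interface User {
  id: string;
  under13: boolean; // e.g., set via a neutral age screen at sign-up
}

interface ConsentStore {
  // Returns true only if a parent has completed a verifiable consent flow.
  hasVerifiedParentalConsent(userId: string): Promise<boolean>;
}

async function collectPersonalInfo(
  user: User,
  consent: ConsentStore,
  collect: () => Promise<void>,
): Promise<void> {
  // COPPA requires notice and verifiable parental consent *before* any
  // collection, use, or disclosure of a child's personal information.
  if (user.under13 && !(await consent.hasVerifiedParentalConsent(user.id))) {
    throw new Error("Verifiable parental consent required before collection");
  }
  await collect();
}
```

The point of the sketch is the ordering: the consent check happens before, not after, any collection occurs, mirroring the statute’s “prior to collecting” language.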
The most recent and significant application of COPPA
On September 4, 2019, the FTC alleged that although YouTube’s service was intended for a general audience, it was “directed to children” under COPPA because YouTube allowed individual channel owners to publish content on its service that the channel owners themselves “directed to children.” YouTube was fined $170 million because it failed to provide a clear and understandable privacy policy describing its practices with children’s personal information, failed to provide direct notice to parents regarding its collection practices, and failed to obtain verifiable parental consent before collecting persistent identifiers, specifically cookies, to track viewers of child-directed channels on its service. According to the complaint, these cookies were used to target ads to viewers of different channels on YouTube’s service.[5] The channel owners were businesses that used YouTube’s service to publish child-directed content for their own business purposes. The FTC found that YouTube had marketed its platform to makers of popular children’s products, that several channel owners had told YouTube their content was “directed to children,” and that YouTube’s own internal content rating system identified content as “directed to children.” As a result, the FTC determined that YouTube’s service was “directed to children” in these instances and should have complied with COPPA’s requirements.
The FTC has thus held a third-party platform responsible for complying with COPPA when it has “actual knowledge” that content on its service is “directed to children” under the age of 13, even though the platform itself did not create that content. In short, a general-audience platform can be liable for collecting personal information from children under 13 once it has “actual knowledge” that content on its platform is “directed to children.”
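To make the mechanism at issue concrete, a platform with such knowledge could suppress persistent identifiers on content it knows is child-directed, limiting that content to contextual rather than behaviorally targeted advertising. The sketch below assumes a Node.js server; the ContentItem type and its childDirected flag are illustrative assumptions, not YouTube’s actual implementation.

```typescript
import { randomUUID } from "node:crypto";
import type { ServerResponse } from "node:http";

// Hypothetical content record; childDirected stands in for whatever "actual
// knowledge" signal the platform holds (creator designations, internal
// ratings, marketing materials).
interface ContentItem {
  id: string;
  childDirected: boolean;
}

function serveContent(item: ContentItem, res: ServerResponse): void {
  if (!item.childDirected) {
    // A persistent identifier (tracking cookie) is set only for
    // general-audience content.
    res.setHeader(
      "Set-Cookie",
      `viewer_id=${randomUUID()}; Max-Age=31536000; HttpOnly`,
    );
  }
  // Child-directed content is served with no persistent identifiers, so any
  // advertising on it can only be contextual, not behaviorally targeted.
  res.end(`<html><body>Content ${item.id}</body></html>`);
}
```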
In addition, the FTC suggested that it would hold content creators themselves liable for failing to adhere to COPPA requirements in the future. Some third-party platforms, such as Apple and Google, have already taken steps in light of the YouTube decision to require content creators to comply with COPPA. Following the FTC’s allegations that three dating apps were violating COPPA, Apple and Google both removed the apps from their stores. As a result, content creators will likely need to remain vigilant about the FTC’s new focus on creators while also complying with the requirements of the platforms on which they publish their content.
In light of these developments and the complexity of COPPA compliance, the FTC has continued to provide guidance to stakeholders regarding their obligations. The FTC recently published a streamlined set of COPPA FAQs, and, in response to the COVID-19 crisis and educators’ increased use of video conferencing platforms such as Zoom, it published additional guidance on how COPPA applies in schools. Finally, and most importantly, businesses should be aware that the FTC is conducting a review of COPPA; the public comment period concluded on December 11, 2019, and drew 176,280 comments proposing changes to COPPA in light of rapid changes in technology. As a result, changes to COPPA are likely in the near future.
Your business should review its current marketing and privacy practices
Moving forward, service providers, app developers, and content creators may be held responsible, both by regulators and by some of the most widely used third-party platforms, for their collection of personal information from children under 13. These platforms are effectively enforcing COPPA by removing services that fail to comply with the platforms’ notice policies or with COPPA itself. Service providers, app developers, and content creators that provide technologies to schools and businesses should undertake a complete assessment of their services, involving stakeholders from marketing, legal, compliance, and IT, to determine whether they must comply with COPPA. The assessment should include:
- Conducting a data inventory of what personal information the services collect and how the business uses that information (a minimal sketch of an inventory entry follows this list);
- Conducting a data impact assessment of the potential risks associated with the personal information collected, the systems used for collection, and the intended uses of that information, such as marketing; and
- Conducting a risk assessment to determine how the business protects the security, confidentiality, and integrity of personal information.
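As a starting point for the data inventory step, the sketch below shows one hypothetical shape an inventory entry might take; every field name is an illustrative assumption, not a regulatory requirement.

```typescript
// Hypothetical shape of one data inventory entry; the fields below are
// illustrative assumptions about what such an inventory might track.

interface DataInventoryEntry {
  dataElement: string;            // e.g., "email address", "persistent identifier"
  collectingSystem: string;       // which service or system collects it
  collectedFromUnder13: boolean;  // does the audience include children under 13?
  purpose: string;                // e.g., "account creation", "ad targeting"
  parentalConsentObtained: boolean;
  retentionPeriod: string;        // how long the data is kept
  sharedWith: string[];           // third parties that receive the data
}

// An entry like this one would flag a practice needing immediate legal review:
// a persistent identifier collected from children without parental consent.
const example: DataInventoryEntry = {
  dataElement: "persistent identifier (cookie)",
  collectingSystem: "embedded video player",
  collectedFromUnder13: true,
  purpose: "behavioral ad targeting",
  parentalConsentObtained: false,
  retentionPeriod: "12 months",
  sharedWith: ["third-party ad network"],
};
```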
[1] https://www.edweek.org/ew/articles/2020/04/03/zoom-use-skyrockets-during-coronavirus-pandemic-prompting.html; https://www.theverge.com/2020/4/8/21213847/zoom-ceo-security-privacy-apology-fix-china-videoconference.
[2] https://www.ftc.gov/news-events/press-releases/2019/09/google-youtube-will-pay-record-170-million-alleged-violations.
[3] 15 U.S.C. § 6502(a)(1).
[4] 16 C.F.R. § 312.2.
[5] See Complaint, https://www.ftc.gov/system/files/documents/cases/172_3083_youtube_revised_complaint.pdf.