Understanding the Children's Online Privacy Protection Act (COPPA)
The Children's Online Privacy Protection Act (COPPA) has been a federal law in the U.S. since 1998 – and online platforms still get fined for violating it.
COPPA aims to safeguard minors against the collection of their personally identifiable information (PII). The law places requirements like age verification and restrictions on targeted ads on platforms that cater to children under the age of 13. Failure to comply with COPPA standards can result in fines of more than $40,000 for each violation – and those fines add up quickly. In 2019, Google was fined a record $170 million after it was found to be tracking children's viewing history in order to serve them targeted YouTube ads.
Online platforms not aimed at children still aren't exempt from COPPA rules. If a general-audience site has any users under the age of 13, it is expected to comply with COPPA requirements. That's why it is imperative for Marketing, Product, and Trust & Safety teams to ensure that proper guardrails are in place to identify minors and appropriately restrict platform activity for their accounts.
Specifics of COPPA
COPPA guidelines focus on three main aspects of child online privacy:
Personally Identifiable Information (PII)
PII includes any collected data that can be used to identify an individual, such as their name, email address, geolocation, or even browsing history.
Under COPPA, online platforms need to obtain verifiable parental consent before collecting PII from children under the age of 13. So, they'll need an effective way to confirm that the person giving consent is actually the child's parent or legal guardian.
Age Verification
The crux of COPPA compliance is that online platforms must ensure that minors are not subject to the collection of PII. Ignorance is not an excuse – even if a user lies about their age, COPPA regulations (and potential COPPA fines!) will apply if the user is later determined to be under age 13.
While COPPA does not require sites to ask users their age, online platforms would be well served to implement systems that detect underage users in order to restrict their access to adult content.
Targeted ads
COPPA prohibits online platforms from using children's PII for targeted advertising – ads delivered to a designated audience based on their browsing habits, interests, or other traits collected from user data.
Users under age 13 may only be served contextual ads, which are based on the content of the page or app that the child is viewing rather than on the child's personal info.
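The distinction between the two ad types can be sketched as a simple selection rule. This is an illustrative toy, not any real ad server's API; the function and field names are assumptions:

```python
def select_ad(user_is_under_13: bool, page_topic: str, user_interests: list[str]) -> str:
    """Pick an ad category: contextual for under-13 users, targeted otherwise."""
    if user_is_under_13:
        # Contextual ad: keyed only to the content being viewed,
        # never to the child's collected data
        return f"ad:{page_topic}"
    # Targeted ad: may draw on the user's interest profile when one exists
    return f"ad:{user_interests[0]}" if user_interests else f"ad:{page_topic}"
```

The key point the sketch encodes: for an under-13 user, the interest profile is never consulted at all.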
Spectrum Labs' solutions for COPPA compliance
Our customers trust us to provide AI-driven solutions for online safety and compliance with government regulations.
For COPPA compliance, Spectrum Labs boasts five key pillars to detect underage users:
Lexicons
Spectrum Labs' AI is trained on one of the largest data vaults in the world, which draws from user-generated content across a wide array of online platforms like gaming sites, dating apps, social networks, and beyond. From this data, we create the most precise and nuanced lexicons for our AI to accurately parse language that is used in communities around the globe.
Our lexicons are the backbone of our natural language processing models and subsequent AI-driven solutions. That's why we make sure they are developed from a vast collection of data, so our AI is trained on the broadest range of real-world behavior for utmost accuracy.
Under 18/Under 13 Detection
Our AI can recognize dialogue cues that indicate when a user is under 13 years old ("when i'm 10 i can get a phone") and differentiate between minors who are over 13 but still under age 18. This allows platforms to more effectively comply with COPPA and similar regulations around the world, like the EU's Digital Services Act and the UK's Online Safety Bill.
Detecting underage users requires more than just keyword-based solutions, which run a high risk of over-triggering and returning lots of false positives. Instead, our Contextual AI parses a variety of metadata and conversational nuances to accurately determine when users do not meet legally mandated age requirements.
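To see why keyword-only approaches over-trigger, consider this deliberately naive sketch (our own illustration, not Spectrum Labs' model) that flags age-related phrases with regular expressions:

```python
import re

# Naive keyword approach: flag any message containing an age-related phrase.
UNDERAGE_PATTERNS = [
    r"\bi'?m (?:[1-9]|1[0-2])\b",   # "i'm 7", "im 12"
    r"\bwhen i'?m 1[0-2]\b",        # "when i'm 10"
]

def keyword_flag(message: str) -> bool:
    """Return True if any underage-sounding phrase appears in the message."""
    text = message.lower()
    return any(re.search(pattern, text) for pattern in UNDERAGE_PATTERNS)
```

This correctly flags "when i'm 10 i can get a phone" – but it also flags "i'm 5 minutes away," a classic false positive. Context-aware models avoid this by weighing the surrounding conversation and metadata rather than isolated phrases.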
Latency
Spectrum Labs' AI can parse natural language and spot harmful content or behavior within 20 milliseconds – about 5x faster than the blink of an eye. This effectively makes for real-time detection of problematic posts and users, which can be acted on with pre-configured automation or flagged for immediate review.
Automated Actioning
Most violations on any platform are repetitive, including COPPA-related behavior like lying about age or posting personal information. On platforms with millions of users, the number of violations can become overwhelming.
Spectrum Labs' solutions allow for custom-configured automation to act on detected content at scale. Rather than offering a "one size fits all" solution, we enable platforms to determine how they handle different types of content – whether they want to act instantly against certain behaviors or flag specific types of cases for prioritized review by moderators.
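A custom actioning policy of this kind can be thought of as a mapping from detected behaviors to configured responses. The table below is a hypothetical illustration – the behavior labels and action names are ours, not Spectrum Labs' configuration schema:

```python
# Hypothetical policy table: each detected behavior maps to a configured action.
POLICY = {
    "underage_user": "queue_for_review",  # route to moderators before acting
    "pii_posted": "block_message",        # act instantly, no human in the loop
    "severe_threat": "suspend_account",
}

def action_for(detection: str) -> str:
    """Look up the configured action; unknown detections are only logged."""
    return POLICY.get(detection, "log_only")
```

The design point is that the platform, not the vendor, decides which behaviors get instant automation and which get human review.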
Scrubbing Personally Identifiable Information (PII)
Even if a platform doesn't collect PII from underage users, those same users can still post personal info on their profiles or in messages to others – activity that is not exempt from COPPA.
Spectrum Labs' AI can recognize PII and determine when it is being posted by users under age 13, then promptly block it from being published on the platform. Since this can be done with near-zero latency, the overall user experience on a site is not affected.
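At its simplest, PII scrubbing can be sketched as pattern-based redaction before a message is published. This is a bare-bones illustration with two example patterns, not Spectrum Labs' detection pipeline, which would need far broader coverage:

```python
import re

# Simple patterns for two common PII types; real systems cover many more.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def scrub_pii(message: str) -> str:
    """Replace recognized PII with a placeholder before the message is published."""
    for pattern in PII_PATTERNS.values():
        message = pattern.sub("[REDACTED]", message)
    return message
```

Running the scrub at publish time, with near-zero latency, is what keeps the redaction invisible to the rest of the user experience.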
To learn more about how Spectrum Labs can help your platform better comply with COPPA and similar regulations, check out our Online Child Safety solution guide.