
Identifying Minors and High-Risk Behavior

By Lee Davis

I recently received a call from a teacher at my son’s school. He detailed to me an incident from that day, in which my son made some poor choices and displayed improper behavior. I thanked the teacher for explaining the situation to me and said I would work with my son to reduce the likelihood of it happening again.

The teacher followed up by thanking me, not just for promising to work with my son, but specifically for accepting his account of events. He explained that many parents tend to respond with disbelief or a “that doesn’t sound like my child” position.

I guess it shouldn’t be all that surprising. We tend to think we know how our kids behave, even when we’re not around. We like to think we know their values and that we’ve instilled sound mechanisms for decision-making. But in the end, they’re kids. And they are still learning to make decisions.

Coincidentally, this incident related a lot to some research work Spectrum Labs has been doing for a client. This client, a social media platform, allows users to set their own age. The platform’s Trust & Safety team had put some safeguards in place to restrict minors but wanted to test their effectiveness and identify needs for improvement.

Results from the testing were eye-opening for the client.

A widespread problem. Not an easy solution.

By analyzing content and surrounding metadata, then evaluating that against Spectrum’s sexual and underage user models, we started to build a picture of concerning underage behaviors. We discovered that at least one in 10 users was lying about their age, with some 14-year-olds, for example, claiming to be 30. Furthermore, 9% of underage users were engaging in highly sexual conversations, and 83% of those conversations were with adults.
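To make the arithmetic behind figures like these concrete, here is a minimal sketch of how such aggregate statistics could be computed once a platform has model-estimated ages and content flags. The schema, field names, and thresholds below are hypothetical illustrations, not Spectrum Labs’ actual pipeline or data.

```python
# Hypothetical data model: claimed_age comes from signup, estimated_age
# from an (assumed) age-inference model, sexual_content from an (assumed)
# content-classification model. None of this reflects a real schema.
from dataclasses import dataclass


@dataclass
class User:
    claimed_age: int     # age the user entered at signup
    estimated_age: int   # age inferred by a model (assumption)


@dataclass
class Conversation:
    participants: list          # list of User objects
    sexual_content: bool        # flagged by a content model (assumption)


def age_misrepresentation_rate(users, tolerance=3):
    """Share of users whose claimed age differs from the model's
    estimate by more than `tolerance` years."""
    lying = [u for u in users if abs(u.claimed_age - u.estimated_age) > tolerance]
    return len(lying) / len(users)


def underage_sexual_stats(users, conversations, adult_age=18):
    """Return (share of underage users appearing in sexual conversations,
    share of those conversations that include an adult participant)."""
    minors = {id(u) for u in users if u.estimated_age < adult_age}
    sexual = [c for c in conversations if c.sexual_content]

    # Underage users who appear in at least one flagged conversation.
    minors_in_sexual = {id(p) for c in sexual for p in c.participants
                        if id(p) in minors}

    # Flagged conversations involving a minor; of those, which include an adult.
    minor_convs = [c for c in sexual
                   if any(id(p) in minors for p in c.participants)]
    with_adult = [c for c in minor_convs
                  if any(p.estimated_age >= adult_age for p in c.participants)]

    share_minors = len(minors_in_sexual) / max(len(minors), 1)
    share_with_adult = len(with_adult) / max(len(minor_convs), 1)
    return share_minors, share_with_adult
```

In practice the hard part is the age-inference and content models themselves; once their outputs exist, the population-level statistics reduce to simple aggregation like the above.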

High-Risk Behavior

Just as I was unpleasantly surprised to learn about my son’s behavior at school, this client was alarmed by the behavior taking place on its platform. To be fair, this platform is by no means alone. TikTok recently revealed it has removed more than 7 million accounts from its platform because they likely belonged to users under 13.

In research cited by The Atlantic in 2016, 34% of Facebook users were found to be between eight and 10 years old. Four in 10 minor users lied about their age. In fact, back in 2013, a Facebook executive confirmed to The Guardian that underage users were a problem and admitted, “We haven't got a mechanism for eradicating the problem.”

Simply requiring users to provide their age proves ineffective. Pew Research has found that almost half of teens confess to lying about their ages in order to access a website or register for an online account. The data also revealed many parents know their underage children are on platforms like Facebook in violation of the network’s restrictions. Some adults are actually complicit in helping their children join the platforms.

“...Not achieving that goal.”

Most social platforms maintain a threshold for participation at 13 years old. For online communities in the U.S. this is primarily based on regulatory compliance laid out in 1998 in the Children’s Online Privacy Protection Act (COPPA).

The real concern, though, shouldn’t just be violations of COPPA. Numerous studies have shown - and we continue to see - that underage behaviors online can have significant impacts on developing children.

"This results in children being exposed to privacy and safety threats such as cyberbullying, online grooming, or exposure to content that may be inappropriate for their age," stated Dr. Liliana Pasquale, assistant professor at University College Dublin's School of Computer Science, in a recent study titled “Digital Age of Consent and Age Verification: Can They Protect Children?

To put the problem’s growth in context: in the first quarter of 2021, the vast majority of videos removed from YouTube were removed for “child safety” reasons. Only two years earlier, child safety concerns accounted for just a fraction of flagged videos on the platform.

[Chart: Videos removed by YouTube worldwide, by removal reason]

Tackling the issue cannot be left to the parents alone, especially given data that show they can sometimes exacerbate the problem. Platforms have a legal and ethical obligation to address online child safety.

“Many kids are using underage sites, many kids are lying about their age, many parents are lying for their kids, many parents are confused, many feel left in the dark, feel pressured and out of their depth,” Martine Oglethorpe, of the digital wellbeing consultancy The Modern Parent, told the Family Zone blog. “A system that is supposed to protect our children and educate our parents is not achieving that goal.”

Spectrum Labs is proud to be part of the efforts to make the Internet a safer place for children. We have a host of child safety resources online, and offer the most technologically advanced moderation tools to help online communities address issues of online child safety.

Download the Child Safety White Paper

Learn more about how Spectrum Labs can help you create the best user experience on your platform.