
Improving Internet Safety for Kids: Crafting a Multi-Layered Response

October 15, 2020

Keeping kids safe on the Internet has never been more challenging. Young people now spend more hours online than ever before, across a wide variety of websites and platforms. And even those who have been well informed about online safety and acceptable usage still engage in dangerous behaviors: a 2019 study found that 40% of kids in grades 4-8 had chatted online with a stranger, even though 87% of them had been taught to use the Internet safely.¹

While legislators have attempted to address Internet safety in a variety of ways, these efforts to date share a common weakness that makes them largely ineffective: they require technology companies to shoulder most of the burden rather than making Internet safety for kids a responsibility shared among parents, brands, platforms, and regulators.

For example, in March 2020, the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act was introduced in the Senate. The Act focuses on eliminating child sexual abuse material (CSAM) from digital platforms by allowing states to develop regulations that online platforms will be required to follow.

The problem with this approach is that it places the entire responsibility for identifying and removing CSAM on the platforms themselves, rather than building an effective, collective response to the exploitation of children online. An effective response to CSAM must address data misrepresentation, funding, and regulatory response.

Data misrepresentation

The bulk of CSAM data comes from the National Center for Missing and Exploited Children (NCMEC). Electronic service providers (ESPs) are required to report identified CSAM incidents to the NCMEC tipline, but these reports are inconsistent representations of both the number of children being exploited and the number of exploitative acts. Moreover, a platform that submits a low number of reports to NCMEC isn't necessarily safer; it may simply have an ineffective identification system.
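
As a concrete illustration of why raw report counts are hard to interpret, the short sketch below uses invented numbers for two hypothetical platforms: a platform's report count reflects both how much CSAM is present and how much of it the platform actually detects, so a low count can signal weak detection rather than a safer service.

    # Illustrative only: hypothetical platforms and invented numbers, not NCMEC data.
    # Reports filed ~ (incidents present) x (share of incidents the platform detects),
    # so a low report count can reflect weak detection rather than less abuse material.
    platforms = {
        "Platform A": {"incidents_present": 10_000, "detection_rate": 0.90},  # aggressive scanning
        "Platform B": {"incidents_present": 50_000, "detection_rate": 0.05},  # minimal scanning
    }

    for name, stats in platforms.items():
        reports_filed = int(stats["incidents_present"] * stats["detection_rate"])
        undetected = stats["incidents_present"] - reports_filed
        print(f"{name}: ~{reports_filed} reports filed, ~{undetected} incidents never reported")

Under these assumed numbers, the platform hosting five times as much material files far fewer reports, which is exactly the ambiguity that raw tipline counts cannot resolve.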

Because the NCMEC is not a governmental agency, it is not subject to the Freedom of Information Act, so it is difficult to clarify what the data means outside of the NCMEC's own reports. To reduce data misrepresentation, the NCMEC should release its raw data, allowing for greater transparency and a more granular level of analysis.

Funding

While regulations such as the EARN IT Act focus on increasing the accountability of technology companies, they fail to account for the fact that each report must ultimately be investigated by law enforcement. Because of this, adequately funding law enforcement agencies is a critical factor in the effectiveness of any CSAM prevention regulation.

Regulatory Response

Existing regulations require that CSAM be deleted within 90 days of the initial report being submitted. This puts a burden on law enforcement, who may not be able to complete an investigation within such a limited time frame. Moreover, once materials are deleted, they can no longer be used for analytics, specifically, to train AI algorithms to identify incidents of CSAM. An AI algorithm that accurately identifies CSAM in real time is critical to resolving this issue, as it can:

  • improve response time by automating identification and reporting
  • reduce the cost of creating and maintaining a CSAM content moderation program, and
  • protect the mental health of human content moderators by limiting their exposure to damaging materials

However, because the 90-day requirement limits the data available to AI systems, it is difficult to train AI models to higher levels of accuracy and efficiency.
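
To make the bullet points above more concrete, here is a minimal sketch of the decision logic such an automated pipeline might use. The thresholds, the classifier scores, and the reporting hook are hypothetical placeholders for illustration, not a description of any existing product or reporting API.

    # Minimal sketch of automated identification-and-reporting logic (hypothetical).
    # The thresholds and file_report() are illustrative placeholders, not a real API.
    from dataclasses import dataclass

    REPORT_THRESHOLD = 0.97   # auto-report above this classifier confidence (assumed value)
    REVIEW_THRESHOLD = 0.70   # route to a human reviewer above this confidence (assumed value)

    @dataclass
    class ModerationDecision:
        action: str           # "report", "human_review", or "allow"
        confidence: float

    def file_report(content_id: str, confidence: float) -> None:
        """Placeholder for an ESP's tipline reporting integration."""
        print(f"report filed for {content_id} (confidence={confidence:.2f})")

    def moderate(content_id: str, confidence: float) -> ModerationDecision:
        """Route content based on a classifier's confidence that it is CSAM."""
        if confidence >= REPORT_THRESHOLD:
            # High-confidence detections are reported automatically, which shortens
            # response time and spares human moderators from viewing the material.
            file_report(content_id, confidence)
            return ModerationDecision("report", confidence)
        if confidence >= REVIEW_THRESHOLD:
            # Ambiguous cases still go to a person, but they are a small fraction of
            # total traffic, which keeps moderation costs and exposure down.
            return ModerationDecision("human_review", confidence)
        return ModerationDecision("allow", confidence)

    # Example usage: confidence scores would come from a trained classifier upstream.
    for cid, score in [("img-001", 0.99), ("img-002", 0.80), ("img-003", 0.10)]:
        print(moderate(cid, score))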

To read more about CSAM prevention, download our whitepaper: Challenges to the Online Child Safety Ecosystem

Keeping kids safe online is a critical priority for technology companies and regulators alike. However, responses to date have been limited because they fail to address the nuances of the online ecosystem. Placing responsibility on a single party, such as the platforms, is an ineffective response to the CSAM problem. To improve the CSAM response, regulations must at least attempt to address the shortcomings of the current system, correct them at the source, and support forward-looking technologies that will get us closer to reducing, and maybe one day eliminating, online CSAM.

1 https://isc2-center.my.salesforce.com/sfc/p/#G0000000iVSt/a/0f000000fyoc/TYQ9XvDATBA78rR00G.PGJ9fmaLm1vQfAW9HCpy3GWk