
On September 29, 2021, Kris McGuffie (Director of Research, Spectrum Labs), Lisa Thee (Launch Consulting), and Roo Powell (Founder, SOSA) came together to discuss best practices for protecting underage users on your platform.
Underage users commonly lie about their age, and age gates provide minimal protection when it comes to limiting them to age-appropriate content. We conducted research across several social platforms and found that 9% of underage users engage in sexual conversations, and that 83% of those conversations are with adults.
Here are the key takeaways:
Kris McGuffie proposed several frameworks and best practices for companies to reduce risk and improve online child safety.
- Be strategic about high-risk behaviors in your policies, your moderation, and your enforcement practices
- Make thorough risk assessment a regular part of operations, using up-to-date data from user activities on your platform
- Employ a layered approach to detection, moderation, and enforcement that breaks high-risk behaviors down into discrete categories, which can then be detected and actioned more easily (see the sketch after this list)
- Regularly test your systems by using red teaming and similar approaches
- Cultivate expertise in high-risk behaviors, tactics, and language. For every high-risk behavior on your platform, there are subject matter experts actively tracking how those behaviors, tactics, and language change. Invite them in and let them help make your approaches work in both the short and the long term
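To make the layered approach concrete, here is a minimal sketch in Python of one way such a pipeline could be structured: each message runs through separate per-category detectors, and the combination of flags drives the enforcement decision. The category names, keyword stand-ins, and escalation rules are illustrative assumptions, not Spectrum Labs' product or any panelist's implementation; a real system would use trained classifiers for each category.

```python
from typing import Callable

# One detector per discrete high-risk category (hypothetical names). In
# production each of these would be a trained classifier; simple keyword
# checks stand in here so the sketch is self-contained and runnable.
DETECTORS: dict[str, Callable[[str], bool]] = {
    "age_disclosure": lambda t: any(k in t.lower() for k in ("i'm 12", "im 13", "in 7th grade")),
    "meetup_request": lambda t: any(k in t.lower() for k in ("meet up", "come over")),
    "pii_solicitation": lambda t: any(k in t.lower() for k in ("what's your address", "send me your number")),
}

def decide_action(flags: set[str]) -> str:
    """Map the combination of flagged categories to a moderation action.
    Escalation rules like these would normally come from your policy team."""
    if {"age_disclosure", "meetup_request"} <= flags:
        return "escalate_to_human_review"  # highest-risk combination
    if flags:
        return "queue_for_moderation"
    return "allow"

def moderate(message: str) -> str:
    # Layer 1: run every category detector; layer 2: decide enforcement.
    flags = {name for name, detect in DETECTORS.items() if detect(message)}
    return decide_action(flags)

if __name__ == "__main__":
    print(moderate("hey i'm 12, want to meet up?"))  # -> escalate_to_human_review
```

Keeping detection per category separate from the enforcement decision means each layer can be tuned, tested, and red-teamed independently, in line with the practices above.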
Roo Powell shared her experience evaluating thousands of conversations with online predators.
- Grooming: a process in which perpetrators operate under the guise of trust, emotional connection, and relationship building to manipulate a child. Children are often unaware they are being groomed because it can happen slowly and through deception.
- Minors are especially vulnerable because they are likely to hide this abuse from their families for fear of shame or retribution: having their phone taken away, embarrassment over the conversations, their own curiosity, and so on.
Lisa Thee drew on her years of industry experience to explain how AI content moderation can be used to identify grooming on a platform.
- AI is a tool, not a solution. It’s very good at highly repetitive tasks and detecting patterns in grooming conversations. This frees up content moderators to focus their time and energy on high-level tasks and grey areas that often come up when moderating high volumes of content.
- Sex Trafficking: the use of force, fraud, or coercion to compel children and adults to engage in commercial sex acts.
- The average age of a trafficking victim is 12 to 14, and 75% of trafficking victims are sold using technology. Overall, reports of child sexual abuse material from tech companies have increased more than 4,000% since 2013.
Looking for more?
Our master class is available to watch online. We recorded this session so you can revisit it with your teams and absorb all the great information that was shared.
Download the White Paper: Protecting Underage Users
In addition, you can check out our #TSCollective, a community of trust and safety professionals sharing best practices and support for this heroic work.