Cyberbullying Prevention Master Class Recap

October 24, 2022

Cyberbullying is no new challenge. It is an issue that seemingly comes with the territory of internet use.

But does it have to?

As young people spend more time online, the effects of cyberbullying are becoming more apparent, and the responsibility of managing it is shifting more heavily into the hands of platforms.

In this master class, our expert speakers dove into what cyberbullying looks like and the steps your platform can take to prevent it. Matt Soeth (Head of Trust & Safety at Spectrum Labs) sat down with Sameer Hinduja (Founder of the Cyberbullying Research Center) and Kris McGuffie (Director of Research at Spectrum Labs) to discuss the main takeaways below.

 

Main Take-Aways

  1. Identifying cyberbullying is often not as simple as it seems.

    How do we define cyberbullying online? It would be nice to treat it as a black-and-white issue, but in reality it is far more complicated, with layers to consider. The terms and statements traditionally used to identify cyberbullying in online communities are often shaped by contextual and cultural variables. Because of this, keyword-based moderation leaves cracks in a moderation strategy that can punish innocent users while letting toxic ones through.
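    The gap described above can be shown with a toy sketch. This is illustrative only: the blocklist and example messages are hypothetical, not any real platform's word list or Spectrum Labs' actual approach.

    ```python
    # A minimal keyword filter, to illustrate the "cracks" keyword-based
    # moderation can leave. Word list and messages are hypothetical.
    BLOCKLIST = {"loser", "idiot", "kill"}

    def keyword_flag(message: str) -> bool:
        """Flag a message if any word matches the blocklist."""
        words = message.lower().split()
        return any(word.strip(".,!?") in BLOCKLIST for word in words)

    # False positive: friendly gaming slang trips the filter.
    print(keyword_flag("You totally kill it every round, nice!"))        # True

    # False negative: targeted hostility with no blocklisted words passes.
    print(keyword_flag("Nobody here wants you. Log off and stay gone."))  # False
    ```

    The first message is praise yet gets flagged; the second is hostile yet passes, which is exactly why context matters more than keywords.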


  2. What can we look at to identify cyberbullying?

    In addressing cyberbullying, looking at two key elements is essential:

    A) Is there harm taking place?
    When a user reports cyberbullying, or it is detected, be willing to look at the whole picture. The bully may target the victim with words that would not traditionally be flagged by text-based moderation. Flexibility in your approach is essential: what the user is experiencing may fall outside your definition of harm, but if a user is struggling in some way that they can articulate and that makes sense, taking them at their word can keep them safer in the long run.

    B) What is the intent?
    There are plenty of trolls on the internet who, although annoying, don't directly inflict harm. Cyberbullying, on the other hand, tends to be recurring and aims to cause some type of harm to the target, whether emotional damage or physical threats.
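    The recurrence signal above can be sketched in code. This is a simplified illustration under assumed values (the threshold and the notion of a "negative interaction" are placeholders, not a real detection model).

    ```python
    from collections import defaultdict

    # Toy sketch: one-off negativity looks like trolling; repeated
    # negativity aimed at the same target is a stronger bullying signal.
    RECURRENCE_THRESHOLD = 3  # assumed value for this sketch

    class RecurrenceTracker:
        def __init__(self) -> None:
            self.counts = defaultdict(int)

        def record_negative(self, sender: str, target: str) -> None:
            """Count one negative interaction from sender toward target."""
            self.counts[(sender, target)] += 1

        def looks_like_bullying(self, sender: str, target: str) -> bool:
            """Recurring, targeted negativity crosses the threshold."""
            return self.counts[(sender, target)] >= RECURRENCE_THRESHOLD

    tracker = RecurrenceTracker()
    for _ in range(3):
        tracker.record_negative("user_a", "user_b")   # repeated targeting
    tracker.record_negative("user_a", "user_c")       # a one-off jab

    print(tracker.looks_like_bullying("user_a", "user_b"))  # True
    print(tracker.looks_like_bullying("user_a", "user_c"))  # False
    ```

    A real system would of course combine recurrence with the harm and intent signals discussed above, rather than relying on a simple count.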


  3. Age-appropriate moderation & determining the needs of your community.

    Another challenge in content moderation is developing a strategy based on demographics: not just location or gender, but age. Certain spaces require higher levels of moderation. Take online communities with a younger audience, for example; you may not want to allow them as much freedom in certain ways.

    Alternatively, on a more mature platform driven by adult users, keyword-focused moderation can inappropriately flag user speech as cyberbullying when it is appropriate for that community. This can happen in communities driven by "harsh" language, or in ones where you want to allow a certain level of "trash talk."

    Overall, setting the tone for what your platform prioritizes in terms of user behavior and expectations is imperative. Implementing features that individual communities will be receptive to can also nudge users toward healthier behaviors. In other words, determining the needs of your community can improve the user experience and keep users safe at the same time.

  4. How can you utilize experts when identifying, implementing, or enforcing bullying and harassment policies?

    Organizations can develop a hive mind that prevents them from seeing all points of view. Working with an external partner brings a fresh perspective on your platform and can surface issues your team might miss. Hearing from people across disciplines who research and seek to understand the causes of user behavior can expand an organization's point of view.


  5. How you can encourage healthy behavior within your platform. 

    The critical components of a healthy community include repeated, pro-social interactions among people who feel bonded and are rewarded for that behavior. Rewarding positive behavior builds trust with your user base, which in turn can foster good behavior across the entire platform. At the same time, understanding the key dynamics driving cyberbullying within your community puts you in a position of real control. Tools that show you what is happening in your community can also reveal where to encourage positive behaviors.


  6. People change, but tech can help. 

    It is essential to acknowledge that people change the ways they harm one another. Keeping this in mind and building more flexibility into your moderation strategy will strengthen your ability to catch bad actors. Determining which behavioral signals are universal and transferable across languages, cultures, and ages, and combining them with an ever-growing body of data, makes it possible to operationalize human behavior even as it changes. Using that data to train A.I. that weighs those variables and recognizes intent can expand your view of negative influences on the platform exponentially.

 

 

Additional Resources

  • Watch the master class on demand.

  • Sign up for #TSCollective, a community of trust and safety professionals sharing best practices and support for this heroic work. 

  • White Paper | Let's Get Serious About Ending Cyberbullying
    • With 36% of youth experiencing cyberbullying at some point in life, platforms are increasingly looking to technology to solve the problem. This white paper discusses the challenges of spotting cyberbullying and the technological advances that are putting more power in the hands of platform moderators to control it.

  • E-Book | Prevent cyberbullying on your platform.
    • Download the e-book to learn about the characteristics of cyberbullying, prevention strategies, and key metrics for benchmarking.


  • Solution Guide | Healthy Behaviors A.I.
    • Healthy Behaviors A.I. by Spectrum Labs is the only content moderation solution that encourages healthy user behaviors with A.I. detection, analytics, and action so that Trust & Safety leaders can improve user experiences and increase retention, engagement, and revenue. Contact our team to learn more.
