
Cyberbullying: Detection in a Meme, Emoji, and Slang-Filled World

By Lee Davis

 

This episode of The Brief originally aired on December 11, 2019. You can listen to the full episode on Apple Podcasts or watch the live recording on YouTube.


Did you know that 59% of U.S. teens have reported being bullied or harassed online? Online harassment is more than empty words behind a screen — in fact, about 30% of online bullying victims said they feared for their lives.

In this webinar, moderator Justin Davis, CEO and Cofounder of Spectrum Labs, is joined by:

  • Matt Soeth, Executive Director at #ICANHELP, an organization that supports educators and teaches students how to respond to cyber issues in a positive way;
  • Ross Ellis, Founder and CEO of STOMP Out Bullying™, the leading national anti-bullying and cyberbullying organization for kids and teens in the U.S.; and
  • Jon Purnell, Vice President of Data Science at Spectrum Labs.

Listen to our panel of cyberbullying experts discuss the underlying issues behind the epidemic of online harassment, and find out how Spectrum’s AI solutions are addressing this issue head-on.

Panel of Experts

 


 

And so I finally [...] teased it out and found out the next day that there was a student on our campus that tweeted out, like, “Hey, I have this photo. If I get 50 likes, I'm going to share it.” And the students saw it, and the kid got 50 likes, but in less than two minutes, the photo was reported and down. 

And the next day, we had about five or six kids out in front of the main office saying, you know, “Hey, this is what happened. This is who did it, how can we help?” 

And to me, I think that is really rooted in the cultural climate of [...] “We're not going to treat people this way. It's not right to treat people this way. And we need to make this right.” 

And that's where I think it can get really magical in the school setting -- [...] when the students are really ready to stand up and support each other in doing the right thing.

Ross Ellis  (6:00)  I agree. I agree...because they're tired of being treated badly.

Justin Davis (6:05)  Yep. So, I love what both of you guys are doing. And we are constantly talking to various NGOs and nonprofits that are working in this space. 

But you know, when you look at the research, there's that “59% of U.S. teens [are] being bullied” stat. And if you go to cyberbullying.org, it'll show you that it's the highest these victimization rates [...] have ever been, with all [of] these different tools in place and efforts, like what you guys are implementing. 

Why do you think we're at the worst point for these cases right now? What's the cause behind this uptick?

Ross Ellis  (6:39)  I mean, there is an uptick, because let's face it -- there are all [of] these brand new devices coming out. And everyone's waiting for the next iPhone or the next new something, and they can be anonymous. 

I think we're living in a world today that’s unfortunately cruel, and the kids see this, and so they think that they can -- and do -- go on their devices and say terrible things to someone, to the point where another kid is traumatized. I mean, we get so many emails and phone calls about kids who've been told to go kill themselves. I can't imagine a kid agreeing to do that. But yet it does happen.

And there are kids online, saying that to them. It's just terrible. And we've got to put a stop to it. So that's been the uptick — the uptick because they realize they can do anything they want.

Justin Davis (7:36)  So Matt, what do you say? What do you think is behind...to pile on to that? What do you think is behind that uptick?

Matt Soeth (7:40)  Oh, I think it's a few things. I mean, definitely, it's the [devices], the technology, the platforms -- I mean, they're growing almost exponentially, which is kind of normal with tech.

Two, I think we've done such a good job of educating kids on what cyberbullying is...kind of like bullying back in the day...there's a greater awareness. So in terms of reporting, I think there's an element of that.

And then, three, it's really looking at human behavior in general, right? So, we have this new device, we have this new technology. I remember talking to my dad when I started teaching [and I was] having issues with kids texting in the classroom. And these are phones with the first keyboards, right? So, it's not like “modern” modern technology, you're still talking about 15-20 years ago. 

But I remember telling my dad about it. He started teaching in 1967. He starts laughing at me. 

I'm like, “Why are you laughing at me?” He’s like, “I just can't believe kids are still doing that.” I'm like, “Doing what?” He's like, “Just getting away with stuff.” 

And so it's kind of going back to the essence of the behavior. What is driving the behavior? Where are the platforms? I think anonymity is a small part of it. But I also think there are ways to stand up and be aware. I still think there's a big education that needs to happen. 

You know, talking to schools, you hear a lot of times -- and it's not entirely fair -- [...] like, “Oh, we need parents more involved”, or from parents, “Oh, we need schools to do a better job of keeping our kids safe.”

And you just get a lot of this back and forth, of looking at, what is the role of each individual in that? How can we support each other -- either parents to kids, peer to peer, kid to kid advocates? And even schools...whether it's policy language, whether it's educational programming, digital literacy, social literacy, [or] media literacy. And really getting to a point where we can just treat each other better across the board, and really know what impact this stuff is having. [...] 

In our case, it's the “KY”s, or “kill yourself” — those kind of comments. [...] Ten years ago, 20 years ago, everyone was like, “Oh, that's gay.” And it was like, “Well, you can't really say that.” [So] let’s have a conversation about why you shouldn't say “kill yourself”. Let's have a conversation about why. 

I think young people are just very sensitive to that information when it comes out. [...] They're reacting more quickly, and they're responding more quickly, and I think they're more aware of what's happening. And so we're getting those bigger numbers in response [to] that awareness.

Justin Davis  (9:44)  Yeah. And to drill into that, [...] to make this real, this is a very personal issue for many of us — being bullied and harassed a lot, even offline. 

Can you share a few examples of bullying? Like maybe there's an anecdote or a story...because I think when we [talk about] cyberbullying, it's a broad definition. And depending upon the platform or the experience that you're on, it may look a little different.

So what [...] are some examples of cyberbullying, just to make this real for our listeners here?

Ross Ellis (10:16)  I mean, we have so much. They call each other names. They accuse them of being a different gender than they may be. And it's just [that] kids are very sensitive today. They're almost hypersensitive. So when this starts, they're feeling really bad and scared and they don't know what to do. Then they put up [memes] -- they might take a photo of an overweight boy in the shower, and they put that up online, and we know that doesn't go well.

They do have their slang language -- like “cyber-sassing” is someone with an attitude. Or the emojis: “Tomorrow going to be hella fire” with the fire emoji. They [...] go so fast with this, that you can't keep up with these techniques. I think that when a parent buys a kid a device, they need to really teach them how to be socially and digitally responsible. Because [...] this is not going to stop.

Justin Davis (11:23) And Ross, staying with you on this one, are there certain groups of people or individuals that are more susceptible to being bullied a lot from your perspective? 

Ross Ellis (11:31) Well, I think the A[-type] personality is going to be okay because they know how to stand up to the bully. We never want them to ignore the bully, but especially online — we want them to block and delete. 

You have other children who are somewhat shy and timid, and so, they're afraid to stand up to their cyberbully. They're afraid to block the message and delete it. And that's the first thing we tell kids -- because if you block it and delete it, you won't see it anymore. 

But yet girls, especially, want to see what's being said about them. I always say to kids, “If I'm wearing a red sweater, and I asked you if you like my sweater, and you tell me, that's okay. But if I'm wearing the red sweater, and [I didn’t ask you] and you tell me it's ugly, that's not okay. Because I didn't ask you.”

So, kids really need to understand where it's coming from, and how to handle it, and the impact that those behaviors have.


 

Justin Davis  (12:28)  And Matt, what about you? What observations have you made around [...] specific groups of people or individuals or kids that seem to be more susceptible to cyberbullying?

Matt Soeth (12:38)  Definitely we see a lot with the LGBTQ population. And [...] on some days, it's almost passive aggressive, if not overly aggressive. It's a meme. It's a post. 

Like, one school particularly [has] recently [been] having some issues with their LGBTQ population being excluded. And it's very subtle stuff, right? It might be a Trump hat. It might be, you know, “Hey, we're going to Chick-Fil-A on our campus tomorrow. We sure love chicken sandwiches!” to really drive home this point of, “Oh, that company we think has [a] historic reputation for [discrimination]...we're going to have them here because we know they don't like you. And that's our way of saying we don't like you.”

To me, a lot of these come back to one simple rule: that, particularly in middle school and high school, I think every human [has a] fear of exclusion -- being left out, or being intentionally left out -- when really, most humans just want to fit in, which we already know is the most challenging thing to do as a middle school or high school student, right?

And it's trying to find that way of [figuring out] who the groups are that are being targeted, being excluded, being left out, and who is getting those little comments.

The other thing to that is, as we know, a lot of kids will not block. They're unsure how to report. So having a conversation about how it works [can help], but they don't want to block, because it's, “Oh, someone's talking about me. I want to know what they're saying.”

It's that fear of knowing someone is talking about you but not being able to see it that really bothers them, more so than the fact that anyone's talking about them...and just having the self-confidence to think, “Oh, like, I can handle this”, or, “Someone's gonna help me”.

Justin Davis (14:01)  Exactly. And so what happens to these people, Ross? You know, when they experience cyberbullying, what are the longer-term effects on these individuals [that] have to go through this?

Ross Ellis  (14:11)  I mean, they go through so much. They have fear, depression, loneliness, anxiety...their self-esteem is shot, they can develop physical illnesses, and sadly, they can get suicidal thoughts. 

Not all of them, obviously. But for the ones that do, hopefully they're getting help. We have a HelpChat Line, so we've been able to save over 6,000 lives. But what about the kids that don't come to our HelpChat Line, who are out there feeling so miserable and distraught, and they don't know what to do? So, the long term effects can be very serious. 

Justin Davis (14:48) I would agree with that. I don't know if I've ever exposed this in a public forum, but, you know, for me, I experienced [the negative impacts of cyberbullying] firsthand in high school, and I had the internet. I'm 36 now, the internet is in a [very] different place now, [and] social networks are in a different place. But had the internet been what it is now, then I probably would have had a very different experience coming out of that. And so, yeah, there are a lot of negative things that can happen as you go through that stuff.

Now, what have you seen for the kids and even the parents who have to deal with this for their own kids?

Matt Soeth (15:19)  [...] I think any parent really wants to help their child as best they can and try to find solutions. Schools, I think, are struggling. It really depends on whether or not they have a policy in place and how they can enforce it. Because generally, when [cyberbullying] has happened, there's usually a lot of evidence. It's just about being aware of it.

And then for students, there's some new research coming out. You're definitely seeing anxiety and depression connected to [cyberbullying] in some cases, and we're also seeing anxiety and depression happening in individuals who just witness this negative content as well. And so I’m hoping to see more research around that to really figure out how [cyberbullying] is impacting our sense of well-being, or “digital being”, whatever you want to call it. 

But yeah, just that sense of being excluded or being the target; that victim mentality of trying to shrink away from the attention and just disappear for the moment. Like, “Don't notice me. Leave me alone. How come I'm this target?” 

You know, that's the stuff that we're seeing -- how those kids are responding to [cyberbullying]. Particularly if they don't have a strong peer group or an adult that they can go to and really help them out.

 


 

Justin Davis (16:17) Absolutely. And so we've talked a lot about [...] framing this problem, [...] that it obviously impacts lives. We're talking about a digital world that has real-world consequences for everyone involved in this, cyberbullying included.

So I'm curious -- Ross, how do you think about working with various gaming companies or social media platforms? Do you get into the trenches there and help shape how they need to think about rolling out community guidelines, or enforcing policy and detection?

Ross Ellis (16:50)  I mean, we do work with the social media apps, but their rules are pretty much in place. The interesting thing is that legally, for most of them, you must be 13 to even sign on and create an account. And what I'm seeing [is that] on Twitter and Instagram (although [young people] don't use Facebook that much anymore), there are kids who are seven and eight years old. Their brains are not mature enough to handle, you know, something like when someone says, “Oh, you're ugly, you're fat”, whatever. And it becomes a real problem.

As I said, they check their social media over 100 times a day. And, you know, it's to find out what kids are saying about them. I just wish that they could do something different and use it to their advantage. Matt was right in that [kids] don't want to block and delete, but if they don't, they're going to see all this stuff that's being said about them. And that could go in a very bad direction. 

So [...] I'm not a gaming fan. I really have an issue with gaming, simply because I think it's so anonymous. Someone could turn to a gamer and say, “Well, gee, I'd like to get to know you” or whatever. 

We actually had a gamer's mom call us. And she said, “My 13-year-old son is on a gaming site. And this 24-year-old woman asked for his address and is sending gifts to him at our house.” That's definitely not acceptable. So, just as your child needs to be very aware and very astute about these sites, the parents do too.

Justin Davis (18:38) And so, Jon, you've been very, very patient here since the beginning. But to follow what Ross just said: there's an article that came out this morning in Fast Company that had some research put together by Pew, by MC International, and the ADL. I urge everyone to check that out after the webinar. And of course, we'll send that out in the follow-up materials.

But there [are] three recommendations that came out of that:

One is that moderation tools need to be improved; two is that how case management is handled needs to be streamlined, so that when people report these issues -- when your kids are reporting these issues on various platforms -- they understand that something's actually being done. And the third is recommending additional collaboration between various platforms -- social networks, dating apps, gaming companies, and that sort of thing.

Based off of what Ross said — she mentioned something interesting around underage users on platforms. And, you know, we see a lot of that with our work with the dating app community specifically. Can you give us a little bit of perspective around [...] what we've seen on that spectrum; around the tools that are in place historically to detect these behaviors? And what we're doing that can help, whether [it's] cyberbullying, underage users on various platforms, or beyond?

Jon Purnell (19:57) Sure, absolutely. I think one of the first points of that article [was] about moderation tools. I think that's definitely one of the Achilles heels here. Is there an effective tool to help moderators to dive into the situation that they're looking at?

And in that line, it's one of the things where I think AI is starting to catch up in the natural language processing field. It's behind [in] image and video detection. It's something that -- just in the last year and a half -- [is] really coming along, but it has given us a lot of great new tools. 

And to your point [...as to] what Ross was saying about underage users appearing -- you know, there's a lot of new research showing that there is a different way that younger users use language to communicate, just beyond the prevalence of emojis and memes. And so, the newer technology on language models can [be] used to help give some signals and hints to moderators that “Hey, this user might be below the guideline age”, and also to help, I think, [when] talking about younger users who might be vulnerable -- you can derive some insights on that. And then also just capturing more of the semantics of the conversations.
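To make that idea concrete, here is a minimal sketch of the kind of stylistic signal Jon describes: a small text classifier whose output is surfaced as a hint to a moderator, never as an automatic age verdict. The features, model choice, and the tiny example corpus are illustrative assumptions, not Spectrum Labs' actual approach.

```python
# Hedged sketch: flag messages whose writing style *may* suggest an underage
# user, purely as a moderator hint. Training data, features, and labels here
# are toy assumptions for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus; a real system would need large, consented,
# age-labeled data plus richer signals (emoji use, slang, conversation context).
messages = [
    "omg thats so funny lolol ur the best",
    "cant wait 4 recess tmrw!!",
    "Please review the attached quarterly report by Friday.",
    "Happy to schedule a call next week to discuss the contract.",
]
labels = ["possibly_under_13", "possibly_under_13", "likely_adult", "likely_adult"]

clf = make_pipeline(
    # Character n-grams pick up spelling, abbreviation, and slang style.
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
clf.fit(messages, labels)

# Surface probabilities as hints for a human moderator rather than auto-acting.
for text in ["see u after school lol", "Let's finalize the invoice today."]:
    probs = dict(zip(clf.classes_, clf.predict_proba([text])[0].round(2)))
    print(text, "->", probs)
```

The point of a sketch like this is that the score is one signal among many handed to a moderator, not the sole basis for any action.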

Justin Davis (21:23) Right. So Matt, what do you think in terms of the role that technology should play here, and the efforts that various social platforms have delivered to date? And do you have any insight into -- not the failings, because a lot of great companies have done great things here, and there's a great community with the FPA (the Fair Play Alliance) and so forth -- but what more can be done here? What role does technology need to play?

Matt Soeth (22:01) I think that [...] the Fair Play Alliance really tapped into something important -- which I have seen firsthand through a few different events and just in my conversations with the companies -- [the importance] of really collaborating on [...] safety principles [and] safety best practices. And it was a really intentional effort of recognizing that something needs to happen.

Along that line, I'm also seeing a lot of startups where the conversation is starting from a safety by design [perspective], whereas in the early days it was so much [more] about, “We have this tech -- let's get it out. Let's make some money.” And now I believe there's a lot of thought going towards, “Okay, if we are going to start this, how do we protect our users in that process?”

And so you're seeing a lot of [...] ethical tech summits, ethical tech boards, advisory research, you name it. [In] Australia, there's the safety by design principles that have come out from their eSafety Commissioner (@tweetinjules), and you have new groups like All Tech is Human who are bringing these companies together to have these larger, somewhat philosophical -- but then breaking it down into practical -- conversations, like, “Is it reporting tools? Is it AI? Do we have better ways to track this?” 

Even this week, you had the article from The Times specifically calling out what is happening in gaming and how to respond to this. And I think there are some people out there who are very committed to improving this. And, you know, with the tech, [...] it's pressure on the companies for sure. But I also think that there need to be conversations around [...] who is in this biosphere that can help. 

So you have NGOs, you have special [...] tech groups, summit groups, research groups, think tanks...it really takes all of us -- down through parents, through schools, [...] through users [...] working together to not only identify this behavior, but to respond to it. To flag it, to report it, to [...] work together, much like a neighborhood watching our community. It really takes all of us creating systems, creating policy, and creating tools to keep us safe online, and tech can definitely play a role in that. 

One of the studies I like to reference the most from the last year [...] has to do with Blizzard Entertainment and how they managed to reduce the online negativity on their platform by [about] 43%, and they did it [by running] some experiments around […] incentivizing positive play.

So there's definitely a role [that] I think tech can play within the game. There's definitely a role [that] users have in making it better. And we can definitely all work together and have a huge impact on this.

Ross Ellis (24:14)  I totally agree with that.

Justin Davis (24:17) Ross, what would you like to see happen for some of these -- not just the gaming companies specifically, but just any social platform? What would you like to see improved on the technology side and the way that they implement the solutions?

Ross Ellis (24:30)  I'd like to definitely see them improve their reporting. There are one or two out there that say “report here”, and then nothing is done. So I think when you say to a child that's on your site, “report it”, you need to follow through. 

But Matt was right, because we need schools, we need communities, we need technology companies, all to work together. Because otherwise we're never going to solve this problem. And it doesn't have to be a problem if everyone agrees that we need to help these kids and work hard to find a way, find a solution.

 


 

Justin Davis (25:08) Ross, staying with you for a bit, what keeps you up at night the most about this particular problem? Is it the tech? Is it the processes? Is it something larger and more macro -- at the level of politics or societal factors? What worries you most [...] about this particular problem?

Ross Ellis (25:33) What really keeps me up at night are these kids that just have no concern. They think it's funny. I will check our transcripts from the night before from our HelpChat Line. And kids are joking around, they're funny or they're saying things that they really don't mean. And I think there are underlying causes for this, because they have to realize it's not a joke. They have to have seen kids in their community, sadly, having taken their lives by suicide, or just really having problems. 

It does keep me up at night because I wonder, “How are we going to help the next kid? Who is the next kid that we can help?” And every night there's at least one kid that comes on to our HelpChat Line that is having suicidal thoughts. And I tell our counselors, [the kids] must learn that, no matter what, nothing is bad enough to take your life. So I think by all of us working together, we can make some great strides.

Justin Davis (26:39)  Matt, what about you? What keeps you up at night about this problem?

Matt Soeth  (26:44) My fear of bears. Wait, no, I'm sorry, this topic? No, for me, it's really, you know, what is being done? [...] Cyberbullying, it's scary...it's everything else you want to throw at it. But in reality, it's that there's way more positive than negative.

One of the things, particularly in schools, that we’ve had a lot of success with is, sure -- training them on how to respond to negativity is great -- but teaching them how to build positive social campaigns is even better. Because when kids are creating content, when they own the content -- when it is theirs and they get to drive it -- then they're also going to protect it.

And so for us, that's where [unintelligible] for good came in -- we saw all these great things happening in schools that just weren't getting talked about, you know, like Project Woke. Christina is now a sophomore -- almost a junior -- who created a whole Instagram page around women of color and social media and how she wants to change that narrative.

You get projects around mental health, you get stuff around feminine hygiene, and the whole period project out of the Santa Cruz area, just looking at, “How do we get these products to girls in the local community who can’t afford them?” You have people in Veterans Affairs using technology and knowledge of robotics to print functioning 3D-printed hands [...] so that one veteran who lost his arm in Afghanistan can throw and catch a softball. There's some really cool stuff happening out there.

I think [that…] in terms of solutions and things like that, what keeps me up is, yes, this is a serious problem. But I think there's a lot of positive stuff out there. I think young people, as we said earlier, they're tired of the drama. And they really want to create something positive and do something good and leave an impact on the world.

And if we can do a better job of amplifying those stories [of…] young people, inspiring young people, I think there's a big impact that they can make. And then we can [start...] sharing that and really connecting people with, “Hey, here's what you can do about it. Here's the impact you can make, and let's start pushing in that direction.” 

You know, [...] the first rule of teacher-classroom training is, “Instead of telling kids what they can't do, show them what they can do.” And you'd be amazed at the positive behavior that happens. I think in social media, [it’s the] same kind of thing. Yes, there's negative stuff. We need to be able to respond to it. But there's a lot of really good stuff and we need to start sharing those ideas and amplifying those stories and [then] more good things are going to happen.

Ross Ellis  (28:55)  I agree. I mean, they need to feel like they own it.

Justin Davis (28:59)  Yep. I agree. The thing that keeps me up at night, to throw my hat into the ring here, being a technologist, is [that] I'm not convinced that the technology today -- or that has historically been applied to detecting these behaviors -- is where we need to be.

Obviously, that's why we started [Spectrum Labs] two years ago -- [it] was really to solve a specific pain point in the workflow of this problem. [It] wasn't really around the moderation itself or the viewing of the tickets, it was specifically around, “How do you detect this behavior with a high degree of confidence and accuracy?” 

So, Jon, can you talk to me a little bit about how difficult it is for companies to do this? There was (I think) [an] MIT review article [that] came out recently that showed that Facebook catches about 16% of [cyberbullying with] their automated systems. And then there's research that shows only about 18% of people will flag other users. That means there's a whole bunch of haters out there that are going undetected. So, why is it so hard to detect cyberbullying, as well as underage users, human trafficking, sexual harassment, and so forth?

Jon Purnell (30:09)  Yeah, I think that was a very interesting paper -- I mean, article -- because it started off with how well they're doing with [automated systems] capturing video and image. And they were saying that they are able to automatically capture 98% of terrorist content, which is great for their moderation team, not having to actually visibly see that.

But yeah, clearly this shows that the [AI] ability for going through the text and going through chats and conversations is limited. So it's a challenge -- it's a couple of challenges, really. One of them [is that], in the realm of AI, [text understanding] is still catching up with image processing. But also, Facebook is facing [internet-scale] problems, and so they're just seeing all sorts of behavior, mayhem to…

Justin Davis (31:06) …millions of communities and language and content types.

Jon Purnell (31:09) Yeah, exactly. And I think a lot of the cyberbullying -- to your question, what's complex about this problem is the contextual nature of it. Right? And you know, [...] in our approach to building tools like this, [...] you always start simple. The first thing you'd always do would be some sort of keyword [detection]. And that can get you pretty far and is good for profanity and certain terms like that.

But then when you start talking about emojis, and when you start talking about memes, when you talk about the nuance of language and sarcasm -- you know, you were talking earlier about the passive aggressive -- you can't [...] find this stuff clearly just by pattern matching.
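As a rough illustration of the keyword baseline Jon describes, and of why it falls short on emoji, slang, and sarcasm, here is a minimal sketch. The word list and example messages are invented for illustration, not a real blocklist or a production detector.

```python
# Minimal keyword/pattern-matching baseline: it catches explicit terms,
# but misses emoji, slang, and sarcasm. The blocklist is an illustrative sample.
import re

BLOCKLIST = {"idiot", "loser", "kys"}  # assumed sample terms, not a real list
PATTERN = re.compile(
    r"\b(" + "|".join(map(re.escape, sorted(BLOCKLIST))) + r")\b",
    re.IGNORECASE,
)

def keyword_flag(message: str) -> bool:
    """Return True if the message contains a blocklisted term."""
    return bool(PATTERN.search(message))

print(keyword_flag("You're such a loser"))        # True  -- caught by the pattern
print(keyword_flag("nobody would miss you 🙃"))    # False -- harmful, but no keyword
print(keyword_flag("great job, genius 🙄"))        # False -- sarcasm slips through
```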

Then it becomes more about getting the context. You have to go beyond individual messages to the conversation they sit in, and then beyond the conversation to [...] the community around it. [...] How does the community communicate? There is certain language that sometimes is [just used as] banter. And sometimes it can be difficult to make that decision [as to whether or not the language constitutes cyberbullying] without the contextual nature.
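To sketch what "going beyond the individual message" might look like in code, here is a toy example that combines a per-message score with conversation-level signals such as the sender's recent history and prior reports from the recipient. The scoring function, weights, and field names are all assumptions made for illustration.

```python
# Toy sketch of context-aware scoring: the same message reads differently
# depending on the sender's recent behavior and prior reports from the recipient.
# The stand-in message_score() and all weights are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ConversationContext:
    prior_messages_from_sender: List[str] = field(default_factory=list)
    recipient_reported_sender_before: bool = False

def message_score(text: str) -> float:
    """Stand-in for a per-message model score in [0, 1]."""
    return 0.6 if "kys" in text.lower() else 0.1

def contextual_score(text: str, ctx: ConversationContext) -> float:
    score = message_score(text)
    # Repeated negativity from the same sender raises the score.
    history = sum(message_score(m) for m in ctx.prior_messages_from_sender)
    score += 0.1 * min(history, 3.0)
    # A prior report from the recipient is a strong contextual signal.
    if ctx.recipient_reported_sender_before:
        score += 0.2
    return min(score, 1.0)

ctx = ConversationContext(
    prior_messages_from_sender=["nobody likes you", "kys"],
    recipient_reported_sender_before=True,
)
# The message alone looks benign; the conversation around it elevates the score.
print(contextual_score("see you at school tomorrow", ctx))
```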

I think that is what makes it even more complex. I went to that Fast Company article. There were examples they gave [...] about network harassment -- [...] that it's not only on one platform; it's across multiple platforms. 

But that's also, I think -- because I'm coming from a tech perspective, right? -- an addressable problem. You know, you're talking about how we get platforms to work together. And there's a lot of work in AI and machine learning on what they call “differential privacy”. So [...] there's tech available now that lets us take our AI systems and apply them in a way that still preserves privacy for users, in addition to providing help to moderators, and also [...] privacy for platforms. So there [are] ways to still put these rules and actions [into place] without exposing people; without compromising the person’s conversations.
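For readers curious about the differential privacy idea Jon mentions, here is a hedged sketch of its simplest form: each platform adds calibrated noise to aggregate counts before sharing them, so cross-platform trends stay visible without exposing whether any individual user's report is included. The epsilon value and the counts below are illustrative assumptions, not any platform's real numbers.

```python
# Hedged sketch of the Laplace mechanism for sharing aggregate statistics:
# noisy counts preserve trends across platforms without revealing any single
# user's contribution. Epsilon and the example counts are illustrative only.
import numpy as np

def private_count(true_count: int, epsilon: float = 1.0, sensitivity: float = 1.0) -> float:
    """Return a differentially private count via the Laplace mechanism."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Each platform releases a noisy weekly count of confirmed harassment reports.
weekly_reports = {"platform_a": 124, "platform_b": 57}
shared = {name: round(private_count(n), 1) for name, n in weekly_reports.items()}
print(shared)
```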

Justin Davis (33:15)  Right. Thank you for that. So, I was recently given the advice by a board member, actually, to do two things at once: to see things as they are, and as they ought to be.  

Jon, sticking with you -- I know we've been talking a lot about how things are today and [..] where they've come from. How should they be, when it comes to the technology? What would be the ideal solution?

Jon Purnell  (33:47)  I think the ideal solution is that users could be protected from this kind of harmful content, providing parents with [...] the ability to catch 100% of [unintelligible].

[...] I think Matt was talking earlier about how kids are getting away with stuff, right? But I think also conveying a consequence for the behaviors [...] -- proactive tools that, you know, when someone is submitting harmful content, [...] [will] give them a pause [to] say, “Is this really what you want?” And, you know, “Are you aware of how this looks?”

Maybe [that we] could be more sympathetic [...]. I think just showing that the platform is looking [for harassment] and does care about it.

Justin Davis (34:48) Right. So Ross, this is for you. On that vein, let's say we get our stuff together [...] -- the industry at large, and various verticals, and we've got buy-in from major players, the Facebooks, the Googles, the Amazons, you've got the mid-tier players and the tech side. And then you've got all these different social networks. 

If we band together and actually pull this off and start to become much more effective at the detection side, as well as the enforcement and moderation side, what does this look like? What will the platforms look like? 

Whether it's the features or the environment, what does the world we need to see look like at that point, in your opinion? If we can visualize it, then we can start to move towards it.

Ross Ellis  (35:32)  Right, right. For me, the world would be very simple -- and it's almost too simple. But by having everyone work together, including the parents -- because parents feel like, “I'm so busy, I can't, I have my job, it's driving me crazy. Here's your new iPhone, take it.” 

You know, schools very often will say, “It's not our job to help the kids with cyberbullying because, you know, it's not [happening literally] on our campus,” but they're still [responsible for] the students. So if you have everyone take ownership, and you have the kids take ownership, and really involve them to a great extent, I think it could be terrific. 

You know, because we’re doing a Town Hall within the next month or so. And I think if you do this and you involve the kids, they do want the ownership. But everyone has to do it -- the communities, the parents, the corporations, the technology companies, and the social media apps, mostly. 

Without mentioning names, there are social media apps that say, “Sure we're going to help you out. Just report it,” and they don't. So we've got to make sure that if they're going to play in the playground, they have to do what's right.

 


 

Justin Davis  (36:46) Matt, for you…if we are successful as a collective [...], what does the world look like?

Matt Soeth (36:57) I think it just improves that experience. [...] I think if anyone enters into this online world, they want to know that they'll be [...] safe and protected. It’s very “rudimentary pie in the sky”, but it's Maslow's Hierarchy -- that I can come in and I can share my opinion; we can have a conversation.

And I think if the tools are there, and hopefully the societal norms reemerge, you know, consistently across the board, then we can actually have conversations and we can share ideas and collaborate. You know, that's the perfect world scenario -- that we get to a point of really detecting negative behavior [and] toxic behavior and hopefully start amplifying larger conversations that have a positive impact and [...] show [that] most people really are, you know, [...] humans are inherently good.

And so, trying to get back to the point of [...] how do we find that commonality versus the general divisiveness that seems to be [...] pervading throughout various silos?

Justin Davis (37:54) I've been saying this for two years now, since I started Spectrum: the Internet experiences periods of heavy investment into infrastructure -- you know, the core things that are required to deliver ecommerce and social experiences and marketplaces -- and then it swings over to the applications themselves. And people are drawn to different experiences through that infrastructure. I think right now we're experiencing a pendulum swing back into infrastructure investments. And I think -- you know, based on what you said earlier, Matt -- safety by design is a core principle of that.

But I'm encouraged by what I'm seeing -- a pendulum swing back into infrastructure on a variety of different levels, just to usher in the next wave of applications for users. And I think trust and safety is at the forefront of that, which gives me hope that we're headed to the future that the both of you just described. 

So we have a little bit of time left, [so let me] pull us out with [...] one last question: what is the one specific message you'd like to leave our audience with? And again, we have folks from data science and machine learning, trust and safety executives, and also just individuals who are interested in this topic.

Matt, what would be the one specific message that you would want to leave the audience with?

Matt Soeth (39:14)  It’s a little sappy, but I'll qualify it...I'll provide context -- that’s a good chat-moderation word, right? No, just hope, right? I think any human being who entered this world and cares about it wants to make an impact and leave it better than they found it. And if you look at people working in trust and safety, I think [...] as you just mentioned, again -- safety by design -- [I] just cannot emphasize that enough.

And, you know, the conversations I'm having -- not only with young people, but with adults -- is that there are a lot of people who really want to make good things happen. And again, if we can keep working towards that goal and really amplify that message, that's where I think we're going to start heading in a direction of making the Internet a better place -- that it can be, and for the most part, that it is.

[...] Along that line, [...] what are the tools and resources we need to make sure [progress in this arena] keeps happening? [...] If it's going to be AI, if it's going to be content moderation, if it's [going to be] us reporting -- just across the board, [...] providing the tools, resources, education and training. I think good things are on the horizon. We just need to keep building up those resources and supporting them.

Justin Davis  (40:20)  Ross, [...] what specific message would you want to leave our audience with today?

Ross Ellis (40:24)  Well, I agree with everything Matt said, but what I'm seeing here, at STOMP Out Bullying™, [is] a lot of blame -- because parents call us [or] email [us], blaming the school and the teacher. And the teachers are saying this [...] shouldn't be happening, because it's not on our campus. Let the parents deal with it. 

So, for me, [it's about getting] everybody to work together and stop the blaming, and really take ownership. I think everyone is going to be so much happier and healthier -- digitally and in every other way, even in the mental health area -- because kids are terrified when they start seeing “go kill yourself” and all those horrible messages. But if we work together, and everyone accepts the ownership, I think we can make great strides.

Justin Davis  (41:13) Jon, and you can’t say [unintelligible]...what would you say for your specific message?

Jon Purnell  (41:20)  Yeah, [...] to the point I was making earlier: what I see when I look at [...] advances in tech is that the stuff happening around text and understanding text is still -- like, you see a lot of toy examples [unintelligible]. But behind it, I firmly believe that the technology [...] [that has] very recently been coming out of academia is going to make drastic strides in addressing this, so I think the tech is there.
And so even though, [with] the current tools, you might see the industry seem to fall woefully short, I think we're very close to seeing [...] revolutionary change across the industry in [the] performance and ability to capture this kind of behavior.

Justin Davis (42:15) Yeah, that would be my message as well. So, I think we get an interesting perspective from Facebook -- that this problem can’t be solved, or [that we are] years away from being able to solve it…and that's just not consistent with [...] the results that we're seeing at Spectrum around how contextuality can be applied to solving the detection issue for complex behaviors like cyberbullying.
So, thank you [...] to our panelists -- Matt, Ross, and JP -- for your expertise, engagement, and adult leadership. It's great to have you on today, and thank you to our listeners for your attendance as well. I urge all of you to check out #ICANHELP and what Matt's up to around bullying, to learn more about what they're doing, and to see how you can get involved [...]. And we'll send a follow-up note afterwards with some key themes and some information based off of the discussion, but [I] appreciate your time today, and thank you very much.

Ross Ellis  (43:09)  Thank you.

Matt Soeth  (43:10)  Yeah. Thank you. Appreciate it.

 

Learn more about how Spectrum Labs can help you create the best user experience on your platform.