
This episode of The Brief aired on Thursday, January 30, 2020. You can watch the live recording on YouTube.
You may have seen Exploited, the New York Times’ horrifying series on online child sexual abuse and trafficking. We’ve been following the series since the first article, The Internet is Overrun with Images of Child Sexual Abuse, was published in late September 2019, followed by a devastating deep-dive into tech companies’ tepid reaction with Child Abusers Run Rampant as Tech Companies Look the Other Way, and the most recent installment, While They Play Online, Children May Be the Prey — leveling a gut punch to the gaming industry.
The underlying question is this: If online companies are already using software to detect trafficking, then why is this problem growing?
Our CEO, Justin Davis, sat down with Jo Lembo, National Outreach Manager at Shared Hope International, to discuss this.
Full Transcript
Meredith Reed 0:08
Hello, and welcome to this episode of The Brief by Spectrum Labs. I'm your host, Meredith Reed. January is National Slavery and Human Trafficking Prevention Month. In this episode, we are bringing attention to the ways that internet communities play a role in human trafficking. We're also discussing how predators use online communities to find and traffic kids.
I'm here today with Justin Davis, Cofounder and CEO of Spectrum Labs. Our special guest for this episode is Jo Lembo, National Outreach Manager at Shared Hope International. They are an organization that is working to put an end to sex trafficking. And Jo, if you could just start by telling us a little bit about yourself, and a little bit more about what Shared Hope International is working on.
Jo Lembo 1:01
Thank you, Meredith, and thank you, Spectrum. I'm really pleased to be here today representing Shared Hope International. As Meredith said, my name is Jo Lembo. And 10 years ago, I really had no idea that children were being sold for commercial sex acts in our own backyards. I had read Linda Smith's first book, Renting Lacy, and I began to understand how it happens: what makes kids vulnerable, what tactics pimps use, and how we can effectively fight it. So today, I'm the National Outreach Manager, working with trained ambassadors and volunteers across the nation to educate their communities and help protect kids from predators.
Meredith Reed 1:42
And could you tell us a little bit more about your organization's founding story?
Jo Lembo 1:48
Yes. In 1998, then-U.S. Congresswoman Linda Smith received a call from a pastor friend who said, “Children are being sold on Falkland Road in chicken wire cages, and you have no idea the magnitude of the sex slavery going on in India.” She thought, “It surely can't be that bad.”
And so, she booked her own flight and went to Mumbai, arriving at about one in the morning. She said, “Take me to the brothels.” They said, “You want to go now?” And she said, “I'm assuming they're operating now. So take me now.”
So as she walked down the alleyways, she saw literal chicken wire cages with small metal beds, and children who looked to her to be the same age as her granddaughter, about 10 years old. Her heart really broke. She saw one little girl come out of a little doorway and stand right there. And being a woman of faith, she said she felt like, in that moment, God said to her, “Reach out and touch her for me.” And she said, “This child smelled like one thousand men...I had no idea what manner of disease she might have, but [...] I reached out my arms and she fell against my chest and I could feel her heartbeat.” And she said, “At that time, I knew I had to do something.”
And so the first Village of Hope was born. Several more followed. And when [former U.S. Congresswoman Linda Smith] left Congress, she started Shared Hope International, and today the villages still thrive. We support seven international partners, and we provide grant money to five USA restoration organizations.
But in 2006, our board of directors recognized that this was an issue here in the United States: that our own children were at risk and being sold. And so they pulled our focus back to the United States and began to fiercely protect minors on our own soil. We do that through our three pillars. The first is prevention, which is where I live — in the prevention pillar — which is (I say) the “happy place” because I'm actually preventing it from happening and helping to educate communities.
We teach parents, teachers, [and] anyone who works with kids how to recognize the signs and how to respond effectively. We also have restoration, which supports our partners — which I mentioned earlier — providing direct services, both internationally and here in the U.S.
And then there's the “bringing justice” pillar: training law enforcement, judges, and prosecutors, and passing laws that will strengthen sentencing for all the perpetrators in the commercial sex market who seek to buy sex from or sell children for sex in the U.S. We include all predators, all facilitators, everyone involved in that chain of marketing who would seek to buy and sell children. So that's what I do. And that's who Shared Hope is.
Meredith Reed 4:45
Thank you for sharing that. [It’s] such an incredibly powerful founding story and something that I think many people aren't even aware is happening, and that it's not only a problem overseas, but also here in the U.S., which we'll get into a little bit more in a bit.
Speaking of that advocacy piece that you touched on, Shared Hope International recently started an Institute for Justice and Advocacy. […] Could you share a little bit more about that?
Jo Lembo 5:16
Super exciting — I was there and it is powerful to be just two blocks from the White House. The Institute is a venue and it's a voice for survivors of sex trafficking. It represents justice and advocacy. It's a center for training, for research, for education, and it gathers and concentrates the power of Shared Hope’s collective resources under one roof.
Now, the word “justice” is vital in the Institute’s name because we're fighting for justice on behalf of the victims of child sex trafficking in America. The Institute has some of the Nation’s finest attorneys on this topic working out of that office — working with lawmakers and legislators in all the states and D.C. Advocates and stakeholders from across the spectrum of sex trafficking prevention will convene at the Institute to work on justice and restorative initiatives, and they'll participate in training programs.
We train law enforcement, social workers, lawyers, first responders, doctors, legislators, judges…all of those who are in a position to make changes and advocate for the rights of victims. Those are the ones that we want to convene there and gather together.
The leads of the Institute’s seven special efforts are listed on our website at sharedhope.org. And that's what will enable Shared Hope to dig deeper, with a more [...] expanded and aggressive stance, which means taking on the emerging challenges as they come.
So, as challenges face Shared Hope, this location puts us in a position to convene with those who can make a difference.
Meredith Reed 6:59
And I think that advocacy piece and the education piece are so important in government. And I think it's incredible that Shared Hope International is playing such a large role in advocating for policies and laws that protect sex trafficking victims, especially children. So, speaking of the government — the U.S. Department of Defense has named human trafficking the world's fastest growing crime. Could you tell us a little bit more about what exactly is behind that?
Jo Lembo 7:36
Well, greed, in one word. And money, which goes hand-in-hand with greed. That’s really what's behind all of it. If there were no demand, there would be no market. And there are those who will sell people in order to make a profit. So that's really the bottom line that drives the market.
Now, human trafficking is a very broad topic that includes labor trafficking as well as sex trafficking. Human trafficking happens in every nation on Earth. It’s global. It's all ages, all nationalities. And so that's a little staggering.
A full explanation of human trafficking is outlined in the TIP Report -- Trafficking In Persons -- issued by the U.S. State Department each year. It ranks governments on their perceived efforts to acknowledge and combat human trafficking. So that's kind of a broad picture of human trafficking.
Now, Shared Hope was the first non-governmental organization to recognize and to name the horrific phenomenon of commercial sexual exploitation of minors here in the U.S., and so we named it “domestic minor sex trafficking”, or DMST. It focuses on the commercial sex market for minors in the U.S. and how to end the demand that sells children like products.
Justin Davis 9:04
Yeah. And Meredith, I would add to that and say another reason I think human trafficking has proliferated over the last four years is access. The internet makes it really easy to exploit, and groom, and find victims of all different ages and types, whether you're talking about gaming platforms, or social networks, or dating apps. And so, it's just a lot easier now to find these folks and subject them to these types of abuse.
Meredith Reed 9:31
Absolutely. And taking it back to you, Jo. Because our listeners work mostly for U.S.-based companies, could you give us a better idea of what human trafficking looks like in the U.S., and how it compares to the rest of the world?
Jo Lembo 9:47
Well, just yesterday, a study came out from the Minnesota Department of Health and the University of Minnesota School of Nursing that was really staggering. So I'm going to read a few statistics for you, because they really show what sex trafficking looks like in the U.S.
So, for the first time ever, 9th and 11th graders in Minnesota schools were asked, “Have you ever traded sex or sexual activity to receive money, food, drugs, alcohol, or anything of value?” And the result from that single question — out of thousands and thousands of children — was that about one in five students answered in the affirmative. [Correction: 1.4 percent of respondents in the above referenced survey said they had traded sex for something of value, which University of Minnesota researchers estimated was equal to approximately 5,000 teens in the state being exploited for sex.]
It's evident that trans [and/or] gender-nonconforming, Native American, African American, and Latino youth are disproportionately impacted. We knew that, but to see it in stats that came right from the children's mouths was sobering.
Surprisingly, the highest rates reported were in rural parts of the state of Minnesota. That was surprising because most often we think it happens in cities off of major freeways, or in huge conference centers. But we'll talk a little more about what makes kids in rural parts of America even more vulnerable.
The study also supported what we already knew — that youth with certain histories are disproportionately at risk. We found that a “yes” was far more likely if the child also reported Department of Juvenile Justice involvement, foster care involvement, sexual violence against them or in their home, unstable housing, or time spent in alternative learning centers.
Now, this is just one study in one state, with thousands of kids answering, but the results are staggering and most likely understated: the survey covered only two classes, 9th and 11th graders, and it doesn't account for those who may have been missing that day, who are more likely to be the at-risk kids. So, chances are, the kids missing from school through truancy and running away are the ones in the highest risk group.
In the U.S., minor victims of commercial sexual exploitation are predominantly recruited and groomed by predators online -- as was stated earlier -- predators seeking to build fraudulent relationships. They'll learn a kid's name, their hopes, their dreams, their hurts, and then they'll use that information against them.
Shared Hope recently did a six-month research project where our researchers posed as females of various ages online, creating fresh profiles. The purpose originally was to find out how kids are using apps and what their experience is. We knew there would be negative findings, things that we knew were out there, but we wanted to hear it right from the kids. We didn't want to just do our own studies and then assume what kids were thinking.
And so, we posed as a petite 15-year-old girl, and we began to reach out to general audiences to ask, “What’s this app for? It's very popular. Are you on it? What do you do with this app?” And we found out far more than we were expecting.
The responses were almost immediate, and multiple [were] from older males trying to engage with our 15-year-old persona. One male, who seemed to prefer younger children, had the profile icon of a puppy and a screen name that was an emoji of a tiger. He later asked our research profile to send him a photo of her “in her jammies”, reflecting language that a younger child might use. That in itself was alarming.
Another predator used questions to try to relate to our researcher's child persona, making it very easy to fabricate ways in which they could be similar, and to build understanding and trust. For instance, he waited until the researcher said she played soccer before he shared that he also played soccer.
Now, in many parts of the world — you asked about trafficking at large — the cultural bias against women creates the market and often places [sexual exploitation] in the open, right out where you can see it. Many countries still consider a woman or a wife as property. Many countries subdue the rights of women. And so, in those countries, we will see sex trafficking, in particular, very much out in the open and very much culturally accepted.
Now, here in the U.S., it's hidden behind the screen of a mobile device or a computer. Pimps seldom have to risk on-the-street negotiations or being in the hallway of a hotel or the parking lot of a business. But instead they can find, recruit, and sell their product of children from the safety of their own home.
[There are] thinly-veiled sites that promote confidential conversations and meetups, such as the Whisper app. That was one that we did some extensive research on. It has a “hot teen girls” byline -- sorry, page -- with a byline that says, “teen girls just looking to talk to sexy guys”. And we find that on a number of these sites the minimum age is 12, which is not easily enforced. Any kid can say they're older than that; the app simply asks them to say so.
I'm alarmed because so many of these sites will also snag 18-year-olds. We’re finding [in the] U.S. that, because the laws come down harder on the trafficking of minors, pimps are moving to college campuses and recruiting the college students who look like they're 14. And so, even the sites that are restricted to users 18 years or older are still dangerous, even though these kids are considered mature enough to have left home.
Justin Davis 15:54
There was a question from the audience real quick: which app did you use to pose as a 15-year-old girl?
Jo Lembo 16:01
We used a number of apps. The most common one, which has now been shut down, was Kik. And as you know, if you've researched those apps, Kik was in news articles and headlines a lot for being used to lure kids in, trap them, and use them for various things. There were even murders involved. And so Kik was one that the researchers used.
They also used — as I said — Whisper, and then if you go to [the internet safety page on] our website, there is a whole page of apps that we researched. And we wanted to look at the good things that these apps do, because they are amazing. And the internet is an incredible tool to connect people and open horizons. But the list of apps is there. I think there were 14 of them that were utilized in this research.
Meredith Reed 16:53
Thank you, Jo, for all of that information. It's truly staggering and, obviously, really upsetting to hear. I think we know that some of this goes on, but when you really look at the data, there's such a prevalence of predators on every sort of social site, and girls and women are disproportionately affected. So it's definitely good to have that awareness of what is actually happening on these platforms.
Justin Davis 17:21
Jo made a good point about it, too. It's just like our offline “safe spaces”. There are lots of great things out in the community, in the real world, that are very positive, where people can go and enjoy themselves and find entertainment and build community. And those places also have [...] challenges around [...] trust and safety or public safety. And so the internet's no different than that. But [...] you know, for all the bad stuff we see, [...] there's a ton of positive experiences that folks get to have on the internet. [...] I don’t want that to get lost in the discussion.
Jo Lembo 17:51
Absolutely. And I'm old enough to remember when there wasn't an internet, and so this is a huge, wonderful world out there. I mean, it connects me to my grandchildren across the nation, which, yay! So yeah, I agree. Thank you.
Meredith Reed 18:05
I guess my next question for you is…like you were saying, the internet has so many good features to it. We all want to keep using the internet, but we want to do so in a safe way.
So my question is, what can we do as consumers and as people on these platforms to make sure that either ourselves or our children are taking safety measures that can protect them and be the first line of defense against predatory behavior?
Jo Lembo 18:40
That's a great question, and one that I get all the time from parents, teachers, pastors, youth leaders, any caring adult. That's the question they ask. Traffickers use personal information to develop trust and a bond between themselves and [the person] they hope to be their victim. Eventually, they'll ask [their victim] to meet, and that's when the child is trapped and may disappear.
So one of the first lines of defense is to teach young people — all the way down to the very first time they have an iPhone in their hand — that they have to be careful about the information they share: their school schedule, who their best friend is, when they feel like running away from home. All of these personal things that kids tend to share without thought are being picked up by predators and then utilized.
We heard one story of a girl — she was mad at her mom, she'd gotten a bad grade, she’d gotten grounded. Mom had gone to work and she had not yet left for school. And she posted on her social media that she was so mad at her mom, she just wanted to run away.
When she left the house about 20 minutes later to walk to school, there was a woman in a car sitting at the curb [who] rolled the window down and said, “Would you like a ride?” She said, “No, I’m not supposed to ride with strangers.” And the woman said, “Well, I know you're really upset with your mom. And I know that you're thinking about running away, and, you know, I'm just here to listen, I'm just your friend.” The girl got in the car and then disappeared.
So those are the kind of things that kids need to be so aware of — don't tell everything about who you are, where you are, don't post your class schedule, that kind of thing. But the important thing for the adults in their lives is to set the privacy settings so that even photos don't have [location information on] them [...].
We also did an exhaustive study on preventative measures: preventative equipment, tools, things that you can utilize. And that's also at [our website]. They are ever-changing. So, we are ever-updating the website — finding new tools and finding new ways.
But that, I think, is the most important thing — for those who care about kids to teach them not to share their whole lives on the internet, and then for their caregivers to be sure that privacy settings are set so that it's not so easy for predators to find them.
Meredith Reed 21:06
Sorry to interrupt. Let's take this one step further. And Justin, looping you into this. Yes, the kids can do certain things to protect themselves. Yes, it's the parents’ responsibility to make sure certain safety measures are in place. But it's also the responsibility of these platforms to have an infrastructure in place that is going to protect the users.
Justin, you work with a variety of different companies — gaming, social media, even peer-to-peer marketplaces — and their trust and safety teams all have to address these threats. So, my question for you is: what are the best trust and safety teams out there doing to address human trafficking and similar disruptive behaviors?
Justin Davis 21:58
Yeah, and before I answer that, [I’d like to] address the previous question as well. I think research shows that only about 18% of people report other users. So I think there's a lot of education that needs to happen around teaching people of all ages, across all platforms, how to use these reporting features.
And for some platforms, it's more challenging. They weren't built with safety by design in mind, so it's really difficult to even understand where these reporting features are on the platform. Or even more importantly, how to monitor the case once you've reported it.
A lot of people feel pretty frustrated, I think, [by] the fact that there's no case management or any effective way of tracking what happens once you've reported another user. It could take hours or days or months, and by that time, the damage has been done.
On the other side of that, there has been a lot of work by many different nonprofits and NGOs out there trying to educate everyone — from parents, to teachers, to kids -- like the folks at #ICanHelp, who do a lot of great work; there are international folks like the Internet Watch Foundation; the ADL does a lot of work in that area; Social Media Matters.
So there are a lot of great folks that are very passionate about exactly what you were asking about — around training people how to use social media responsibly, and how to report things when they see something [online that may be suspicious, offensive, or illegal].
We see this in the real world. Back in the day, it used to be, “Don't talk to strangers”. Right? And so, this is just an evolution of that. In certain metropolitan areas you have...or you know, New York, specifically...you have the whole “If you see something, say something” [campaign].
I think [that] right now, folks, for the most part, at least 80% or so, don't necessarily say something. They either [unintelligible] from the platform or [unintelligible] from the conversation, or disengage completely. And that doesn't really help the social media platform, or any type of dating app, gaming company, marketplace, whatever it is; it doesn't help them identify [the problem]. They need to know when a user is feeling harassed -- that's a helpful signal for them. That feeds into a lot of the data that they're looking at in their content moderation systems.
And so, it really relies on the community -- not to self-moderate or self-police -- but reporting is an important function that has to be much more robust and easier for consumers and users to use, so that they can feed information back to the platforms, and the platforms can effectively moderate and protect their community and protect their brand and so forth.
The question you just asked [...] in terms of teams and structures, and how these companies think about building an organization that's staffed appropriately to address these issues? Yeah, I mean, that's a meaty one, right? Historically, you've got trust and safety teams that were embedded in marketing, or in engineering, or in product. And those were disenfranchised and disconnected from the policy folks and from the legal teams. And those were even further disconnected from, maybe, the CMO.
And then, even further, you have data science teams that aren't really embedded in those groups either. Over the last couple years, I think you've seen some advancements there. There's been some really nice consolidation [that] we've witnessed and experienced with our customers and partners. [...] We've seen that the best team structures are ones that have pretty solid alignment across folks in policy and marketing, trust and safety, data science, and maybe even corporate social responsibility: folks who are looking at and paying attention to things like digital wellness, and the mental health of the moderators.
The well-being of the folks who have to weed through this content on a daily basis is increasingly becoming an important factor in the way these teams are structured, monitored, and measured, as well as in how we take care of the people dealing with this stuff every day.
So [...] there's not necessarily a silver bullet or a panacea for any given organization, because some of these social platforms run really lean; some of the largest dating apps in the world, for example, have 30, 40, 50 people total. And then you have some of the larger conglomerates that have, you know, 500 people on staff, where 55% of the workforce is moderation: content safety and user safety folks.
So there really isn't a right way. It just depends on, honestly, how the app or the platform is structured. Do they have a robust set of safety features built in, you know, safety by design? And have they done a good job of educating their users and their consumers on how to flag and report abuse or toxic behaviors?
When it comes to human trafficking, it's a little bit more nuanced. There are a lot more things that have to be in place to detect something [...] as nuanced as human trafficking. The same with things like sexual harassment -- it is not necessarily as cut and dried, depending upon the context, you know? Two consenting adults versus two kids. And there are certain sequences of emojis, for example, that on certain platforms are completely benign. But used in other areas, like marketplaces, a sequence like a crown and an airplane and a bag of money could indicate there's something more nefarious going on.
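[Editor's note: a minimal sketch of the idea Justin describes here. The same emoji sequence can be benign on one platform and a red flag on another, so detection has to be scoped to platform context. The sequence, context labels, and function names below are hypothetical illustrations, not any platform's real detection logic.]

```python
# Hypothetical emoji-sequence rule: benign in most chats, but a possible
# trafficking signal in marketplace-style contexts (per Justin's example
# of a crown, an airplane, and a bag of money).
CROWN_PLANE_MONEY = ("👑", "✈", "💰")

# Hypothetical policy: contexts where this sequence warrants review.
HIGH_RISK_CONTEXTS = {"marketplace", "classifieds"}

def contains_sequence(text: str, sequence: tuple) -> bool:
    """True if the emojis appear in order (not necessarily adjacent)."""
    pos = 0
    for ch in sequence:
        pos = text.find(ch, pos)
        if pos == -1:
            return False
        pos += len(ch)
    return True

def flag_for_review(text: str, platform_context: str) -> bool:
    """Flag only when the sequence appears in a high-risk context."""
    return (platform_context in HIGH_RISK_CONTEXTS
            and contains_sequence(text, CROWN_PLANE_MONEY))

# The same message is escalated on a marketplace but not in a gaming chat:
msg = "new in town 👑 ✈️ 💰 dm me"
print(flag_for_review(msg, "marketplace"))   # True
print(flag_for_review(msg, "gaming_chat"))   # False
```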
And so again, context matters. But you have to have the right people on staff that understand this stuff all the way from the folks setting the policies and community guidelines to the folks that actually go and have to look and review this content.
The last point on this, the thing that's probably not the most important but is increasingly becoming more and more important as folks become more aware of what these issues represent, is having a strong commitment to diversity and inclusion on those teams. You've got to have all sorts of different backgrounds and creeds and ethnicities and religions and genders represented. Because that's the only way to appropriately address or cater to the needs of at least 80% of the population, knowing that you're probably never going to get it 100% right. But making sure that there's a strong commitment to D&I on the team makes a big difference. And the ones that we've seen do it, do really, really well.
Jo Lembo 27:57
Absolutely.
Meredith Reed 27:58
That's all really good information. I was wondering if you could maybe take it a step further and give an example of one platform in particular, and some of the processes they have in place that have proven to be more effective in fighting human trafficking?
Justin Davis 28:22
I'm not sure I would call out one platform in particular. I mean, every platform struggles to identify this particular behavior. I think what a lot of teams are doing from a process standpoint is clearly defining that this behavior is not allowed. Someone like Roblox’s Laura Higgins -- if you ever see any of her tweets -- does a phenomenal job of leading the charge over there, around understanding what it takes to build a civil environment on their platform, and educating kids and parents around how to have certain types of conversations [and] how to understand when bad things are happening.
But it's difficult for something like human trafficking. It comes down to education, and it's just really hard to engage with your consumers and help them understand exactly when that stuff's happening. So I wouldn't say that anyone's really solved for these behaviors explicitly. But [...] almost every platform has some team in place that is looking at these types of things and trying to understand when really nefarious things are happening, like human trafficking.
I think what goes into it, first, is setting up a clear set of guidelines and principles that says [that] these behaviors are allowed or not allowed. Something as cut and dried as human trafficking, obviously, is something that I think some platforms don't even have to explicitly say in their guidelines, because it's illegal.
But I think at a high level, just making sure that your users understand what the expectation is of being on the platform, and what kind of conversations or content they should expect, is an important part of it. Because then it helps the users understand when something is out of balance, when something doesn't seem normal in the course of the conversation. [...] Should it be happening on that platform?
Meredith Reed 30:01
And where are we with automating the flagging of these types of behaviors? Obviously, this is your wheelhouse, Justin. So, can you speak a little bit more to the technology that is being developed to counteract disruptive behaviors on these types of platforms?
Justin Davis 30:19
Yeah, I mean, I've spent almost two decades now working in data. For any sort of automation to happen -- in any vertical or industry, much less trust and safety -- it really comes down to the amount of data, and labeled data, that you have in order to build a detection algorithm that can understand not just general toxicity or severe toxicity, but that has specific labeled data to detect that exact behavior on your platform.
In terms of fully automating [...] I mean, we may never get to a place where the entire ecosystem is completely automated by machines. And I'm not sure that we necessarily want to. I think you're always going to want some level of human judgment and moderation and data labeling and oversight in there, to make sure that the systems are moving in the right direction.
But I would say that historically, [...] over the last 10 years, we've seen a pretty strong evolution. Platforms were really relying on users to flag other users, and again, that's a small percentage. Or they were, you know, building out simple keyword detection systems using things like lists of words or terms, or regex, to identify: Hey, this word is typically associated with bullying or human trafficking.
The problem with that is that it floods the moderation systems with all sorts of false positives and signals that aren't really helpful [and] don't help moderators do their job. And on average, I think it costs the moderation team anywhere from $1 to $6 to review and make a determination on a specific case. So this can get very expensive if you're dealing with a lot of cases that aren't actually what the system flagged them as.
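[Editor's note: a minimal sketch (not any platform's real system) of the keyword/regex approach described above, showing why it floods moderators with false positives. The watchlist terms are hypothetical.]

```python
import re

# Hypothetical watchlist: terms sometimes associated with bullying or
# trafficking in moderation lexicons.
WATCHLIST = [r"\bkill\b", r"\bdaddy\b", r"\bfresh\b"]
PATTERNS = [re.compile(p, re.IGNORECASE) for p in WATCHLIST]

def keyword_flag(message: str) -> bool:
    """Flag a message if any watchlist pattern matches, ignoring context."""
    return any(p.search(message) for p in PATTERNS)

messages = [
    "gonna kill it in ranked tonight",    # benign gaming slang
    "my daddy picks me up after school",  # benign family talk
    "these sneakers are so fresh",        # benign slang
]
flagged = [m for m in messages if keyword_flag(m)]

# All three benign messages get flagged. At the $1-$6 per human review
# quoted above, false positives like these dominate moderation cost.
print(f"{len(flagged)}/{len(messages)} flagged; "
      f"est. review cost ${len(flagged) * 1}-${len(flagged) * 6}")
```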
Over the last couple years, we've seen a lot of data science teams at different platforms, and external vendors, partners, competitors, whatever it might be, build out simple classifiers that look at a snapshot in time. They'll look at a single message or, you know, four or five sentences. They're typically constrained to a limited amount of text (if you're just talking about text and the conversation that they can review), but that's phenomenally more accurate than keyword detection, which is great. And that's exactly where things should go.
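[Editor's note: a sketch of the "snapshot in time" classifier described above: a model trained on labeled example messages that scores one message at a time, with no conversation history or user metadata. The tiny inline dataset is illustrative only; real systems train on the large labeled corpora Justin mentions earlier.]

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled data: 1 = flag for review, 0 = benign.
train_messages = [
    "send me a pic in your jammies",
    "how old are you? don't tell your mom we talk",
    "nice goal! rematch after school?",
    "anyone selling the new skin bundle?",
]
train_labels = [1, 1, 0, 0]

# Character of the approach: featurize a single message and score it in isolation.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_messages, train_labels)

# Score one message at a time; the model sees no surrounding conversation.
score = model.predict_proba(["what school do you go to? don't tell your mom"])[0][1]
print(f"risk score: {score:.2f}")
```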
I think the next evolution that you'll see over the next two years or so is a much more robust, context-sensing type of classification system that understands nuance: Hey, this is a sexual conversation. Is it two consenting adults, or is it two children? Being platform-aware, understanding things like, you know, time spent on platform, or age, or gender: things that help provide a better understanding of, What is this conversation about? Is it a conversation that's happened over many, many months, where there's a propensity? These systems can be built around saying, or we think the system can say, Hey, we think this is starting to look a lot like grooming or human trafficking because of the velocity or the nature of the conversation, and maybe how that particular person or node communicates with other nodes on that platform. Those are all healthy signals.
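[Editor's note: a sketch of the context-aware direction Justin predicts, combining per-message classifier scores across a whole conversation with platform metadata such as ages, conversation duration, and requests to move platforms. Every feature name, weight, and threshold below is a hypothetical illustration, not a production scoring formula.]

```python
from dataclasses import dataclass

@dataclass
class ConversationContext:
    message_scores: list           # per-message classifier scores, oldest first
    days_active: int               # how long the conversation has been running
    initiator_age: int
    recipient_age: int
    asked_to_move_platforms: bool  # e.g. "message me on another app instead"

def conversation_risk(ctx: ConversationContext) -> float:
    """Blend message-level and conversation-level signals into one score."""
    # Sustained elevated scores matter more than a single spike.
    recent = ctx.message_scores[-20:]
    sustained = sum(s > 0.5 for s in recent) / max(len(recent), 1)

    # An adult persistently messaging a minor over weeks is a grooming signal.
    adult_to_minor = ctx.initiator_age >= 18 and ctx.recipient_age < 18
    long_running = ctx.days_active > 30

    score = 0.5 * sustained
    score += 0.2 * adult_to_minor + 0.15 * long_running
    score += 0.15 * ctx.asked_to_move_platforms
    return min(score, 1.0)

ctx = ConversationContext(
    message_scores=[0.2, 0.4, 0.6, 0.7, 0.8],
    days_active=45,
    initiator_age=34,
    recipient_age=14,
    asked_to_move_platforms=True,
)
print(f"conversation risk: {conversation_risk(ctx):.2f}")  # 0.80 -> escalate to a human
```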
But again, that doesn't get you to a complete state of automation. It just gets you to a much more efficient way of detecting things. How you ultimately moderate and decide to automate is a whole separate conversation. And that has to be tied effectively to your policies.
And then there's staffing -- not everyone has a ton of people, or resources, that they can throw at this. You know, not everyone's got 15,000 people that they can throw at this problem, or can afford to throw at this problem. So I think some of these behaviors can be automated against: it's very clear if you don't want certain terms, or certain clear-cut phrases, that may indicate cyberbullying or things like that.
But when it comes to human trafficking, that's just such a nuanced and specific set of content. To indicate that something is maybe even starting to go in that direction...I don't know that it can be fully automated. But what can happen is a major improvement in the way that type of conversation [is detected]. Not just the message or sentence or word -- but the way that a conversation or thread or community or chat is starting to look like it may constitute an element of human trafficking. And that's where I think the big part of the improvements will come over the next couple years.
Meredith Reed 34:27
I would also think that there's going to be a massive leap in technology with identifying the person behind the screen. Now, at this point in time, it's relatively easy to create a fake identity or a fake profile, and continue to create fake profiles. And there's a lot of danger, I would think, in human trafficking with having that phenomenon going on.
Justin Davis 34:49
I was just at State of the Net yesterday, and that was one of the big topics that came up -- around account verification and identity and understanding who people are online. If we go down that rabbit hole, which I think is a great [unintelligible] -- we don't do that at Spectrum, but I did at a previous company -- there are lots of vendors out there that do. And I think that's great. And it's absolutely where the industry should go, in terms of having some level of identifier or way of verifying that a user is a real person.
The problem is -- and maybe this is a topic for another podcast -- how and where do you draw the line? How far down the rabbit hole does that go, in terms of having a single identifier across every different platform? Who stores that information? Who holds it? Who sees it? That global identity is a very tricky conversation to navigate. And whoever does own that space has a ton of power. And we're just not ready for that.
But yes, you're correct that there does need to be some level of account verification and validation on these platforms. And that definitely gives the folks who do some of that [community management] a much better path to enforcing moderation or putting the liability on the user. Which is what CDA 230 is all about -- it's not necessarily about indemnifying the platform; it's about making sure that the responsibility, or the onus, for that behavior is where it should be, which is on the individual. And account verification is a great step in trying to make sure that happens.
Meredith Reed 36:17
Yeah, absolutely. And clearly, there's so much here, there's so much to talk about. I think we could probably use two more hours on this topic. Unfortunately, we are getting short on time. And Jo, I want to make sure that I bounce the conversation back to you before we go. So just to wrap this up, would you like to give us a few key takeaways from everything you spoke about today? And also could you let our listeners know how they can support Shared Hope International?
Jo Lembo 36:53
Absolutely. Thank you. First, become educated. We have tools online, and we have a national training conference every year. This year, it'll be November 4-6 in Washington, D.C., with workshops that include online dangers and law enforcement talking about their investigations, that sort of thing. So it's a really powerful time where advocates and professionals from all across the Nation come together and learn how we can keep kids safe in all different places and on all platforms. Our ambassadors are volunteers who are trained online; we provide them [with] a tremendous amount of training, and it's all at no cost. I manage the program, so I'd love to encourage anybody to come to our website and look for Ambassadors of Hope.
Second, motivate others to learn. Share what you know; share what you've learned. We just started a new program called Weekend Warriors that shows how you can make a difference in 15 minutes each weekend. We give you multiple tools and links and films that you can forward to your networks, which is a great, easy way to get involved.
Third, raising funds. Obviously, we always need funds to do what we do. That's how we offer many of our tools at no charge to our audiences.
Fourth, support your legislators in passing stronger laws. Do what you can to protect children and hold predators accountable.
If somebody wants to get their hands dirty locally, [...] find a local transition home that serves this very vulnerable population and ask them what they need and how you can help. So those are my suggestions. I have lots more. You're right -- we could do this for several hours. But thank you so much for inviting Shared Hope International to your forum today.
Meredith Reed 38:35
Thank you for listening to this episode of The Brief. For a full transcript of this episode, links to our sources, and more information on Spectrum Labs’ AI and big data solutions, please visit us at getspectrum.io.
If you enjoyed this episode, please take a minute to rate, review and subscribe to The Brief. Until next time, let's keep our online communities healthy and safe. We'll see you next time on The Brief.