Companies in spaces like social networking, video game development, online marketplaces, and dating apps typically don't have the bandwidth to handle their own content moderation. Instead, they usually outsource it to third-party content moderation firms, often overseas. This makes sense: they should be directing their resources to the core of their business.
Unfortunately, these large content moderation companies are currently dropping the ball. At a time when more people are communicating online than ever before, content moderation has all but disappeared.
For the most part, content moderators have gone home as their countries issued shelter-in-place orders, but their laptops haven't gone with them.
Why?
A few reasons. First, content moderators regularly deal with content that violates the law (such as child sexual abuse material), and they must work within a highly secure network to handle that content correctly, something that can't be replicated at home. Second, many simply don't work on laptops in the first place.
In these trying times, we must adapt.
It is clear that online communities must increase their investment in AI moderation solutions. But some may have trouble doing so: building effective AI moderation in-house requires time, expertise, and data that many platforms don't have.
For these platforms, the next-best step is to turn to an outside technology partner they can trust. Which partner they select should come down to breadth of capability and speed: the faster a partner can deliver a comprehensive solution, the sooner the platform can support its users.