At the recent event “Safety Tech 2021: Unleashing the Potential for a Safer Internet,” a group of industry leaders (including Justin Davis, CEO and Co-Founder of Spectrum Labs) got together to talk about the changing Trust & Safety ecosystem.
As online traffic continues to rise, and additional channels like video and voice chat are supported across platforms, Trust & Safety leaders will need to dedicate their efforts to ensuring a safe, inclusive environment for both their users and their moderators.
One critical consideration is the intersection of people and technology in the digital world. On one side are the human content moderators and the technological solutions applied to make moderation more efficient and accurate; on the other are the users and the technology that protects their online experience and safety.
To make the greatest impact, companies should be aligned on Trust & Safety initiatives across the organization. Building this alignment is critical to securing the budget needed to support these strategies, and to promoting safety by design.
“If you don't have the budget for Trust and Safety, then you don't necessarily have the budget for a social platform.”
The idea that Trust & Safety should be built into a platform’s budget from the start speaks to two key concepts: building Trust & Safety alignment, and safety by design. Often, a lack of alignment runs all the way up to the executive level: the platform has no internal agreement on the tools, resources, and people needed to reach its Trust & Safety goals, which is a significant barrier to ever reaching them.
“When you talk to executives at these companies, they’ll say, ‘Hey, we obviously don’t want these things on the platform,’ but there is still a disconnect between what is said at the highest levels and what actually gets implemented. Trust the second one.”
Safety by design is important because it builds Trust & Safety into a platform, or a new feature, from the beginning. This ensures that your people, users and moderators alike, are protected from harm even before product launch.
It gives a platform the space to develop, test, and improve processes ahead of time, instead of scrambling to build solutions after a crisis. Implementing safety by design requires buy-in at all levels of the company; without it, product features will be prioritized to the detriment of Trust & Safety goals.
Discussions of Trust & Safety initiatives should account for both audiences affected by content that falls outside community guidelines: the users, and the content moderators. Protecting content moderators falls into three general categories: tooling, external optics, and burnout.
Tooling refers to developing solutions that relieve the burden of manual moderation. This lightens the pressure on human moderators, allowing them to focus their efforts on high-priority objectives, and it can also limit their exposure to harmful content. External optics means protecting content moderators from public exposure or physical harm, as they may be vulnerable to social pressure or doxing in the course of doing their jobs.
Finally, content moderators should be protected from burnout.
“As a society, we haven’t fully understood the societal impacts of seeing toxic content all day and what that does to stability, democracy, and society at large.”
One of the biggest challenges platforms face is managing burnout for content moderators. It’s grueling work, and companies that employ content moderators must consider how to mitigate the risk of vicarious trauma or PTSD. Mitigations may include partitioning work, scheduling regular breaks, and running proactive employee wellness programs that provide support and relief.
Moving forward, challenges may include building alignment for Trust & Safety initiatives, implementing safety by design, and dealing with the lack of accepted standards across the digital world. Understanding what today’s technology can do helps lay the groundwork for these conversations at the highest levels of your platform, and throughout the organization, so you can manage these challenges and develop the technological solutions to support your people: users and moderators alike.