All Content Moderation and Trust & Safety frameworks include the three “P's”:
Principles: Who are we as a company? As a brand? What does our company stand for? What are its values? How do we want our users to behave within our community and platform? This exercise will guide all future decisions.
Policies: These are the guidelines or rules that govern what is and isn't allowed within your community or platform. Your policies need to be much more specific than your principles.
Playbook: A playbook is a platform's enforcement guidelines, and it's critical because it outlines how your platform will operationalize your principles and policies.
Additionally, there are two main points to keep in mind when creating a content moderation and Trust & Safety policy:
Regional Policies: Establishing regional policies is a resource-intensive exercise, but it is often critical for global platforms. Do you need them? Or will one set of guidelines apply to all users everywhere? The answer depends on your audience and the purpose of your platform.
Underage Users: Children under the age of 13 are protected under multiple laws around the world. For instance, in the U.S. the Children's Online Privacy Protection Act (COPPA) bars operators of online services from collecting personal information from children under 13 without their parents' consent. Other regulations, such as mandatory reporting laws, require you to report any instance of child abuse, or risk criminal penalties for failing to do so.
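As a minimal illustration of how a platform might operationalize the under-13 threshold in its playbook, the sketch below flags accounts that require a parental-consent flow before collecting personal data. The function name, the threshold constant, and the consent flow itself are assumptions for this example, not part of any specific law's text or a real product's API; a production system would pair this with legal review and a neutral age-gate UI.

```python
from datetime import date

# COPPA's protections apply to children under 13 (illustrative constant).
COPPA_AGE_THRESHOLD = 13

def requires_parental_consent(birth_date: date, today: date) -> bool:
    """Return True if the user is under 13, so a parental-consent
    flow (a hypothetical step in this sketch) must run before any
    personal information is collected."""
    # Compute age in whole years, accounting for whether the
    # birthday has occurred yet this year.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age < COPPA_AGE_THRESHOLD
```

For example, a user born in 2016 would be gated in 2024, while a user who turned 13 before today's date would not.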
Download our whitepaper for more insight
This paper asks noted experts in the field:
Where does one begin when setting out to develop a Trust & Safety policy for a platform?
Is there a framework in which we can start the work?