
Facebook and other social media outlets have been under the microscope for their content moderation policies. Content moderation is a necessary evil for creating a positive user experience in software and apps with forums and communities that allow users to communicate openly.
Although free expression of thoughts and ideas is encouraged, when one person's free expression impedes other users' ability to express themselves freely, Facebook and other internet-based platforms and apps are forced to strike a balance between free speech and protecting users. When malicious users and trolls unload on a community, they take away the opportunity for meaningful interactions.
Yet some feel Facebook and other social media companies have too much power to censor content online. Facebook's answer to this criticism was to create an independent Oversight Board. Facebook's solution might not be right for your company, but it serves as one example of the type of action you can take to build safe communities within your marketplace app. Keep reading to learn about the specifics of Facebook's Oversight Board.
What Are the Functions of Facebook's Oversight Board?
The Oversight Board is a quasi-judicial body that serves three main functions:
- Moderation oversight. In some cases, content moderation is a simple task. Some Facebook posts and comments clearly cross the line, but it's the questionable ones that cause debate, because removing a post or comment or suspending a user becomes more subjective. The Oversight Board will bring more consistency to moderation and make the entire review process more objective.
- Listen to appeals. If a user feels a poor decision has been made, the Oversight Board can review the case. This ensures all users have the opportunity to share their side of the story and argue their case, making the community a safe space for those who participate while still promoting free expression.
- Make rulings. The Oversight Board works like the United States Supreme Court to the extent that when it rules on appeals, its opinions will inform Facebook's policies. Rulings might lead to changes in current policies or the implementation of new policies that help keep the platform safe for users.
Ultimately, the Oversight Board will bring more consistency to Facebook's content moderation and give users a way to voice their concerns if they feel they have been wrongly censored, suspended, or banned from the community.
Who Is Facebook's Oversight Board?
In theory, Facebook's new Oversight Board is an independent body making binding decisions about posts and comments, and to reinforce that independence, the board will be funded by a trust separate from Facebook. The board will have up to 40 members from a wide variety of backgrounds to moderate content, listen to appeals, and make rulings, though it needs a minimum of 11 members to function. Each member will serve a three-year term and sit on multiple five-member panels that deliberate cases.
What Happens When Facebook Disagrees with a Board Decision?
Some have criticized the creation of this board as a fruitless endeavor because the Oversight Board's decisions aren't legally binding. This means that if Facebook disagrees with a board decision, it doesn't have to uphold the ruling or comply in similar cases in the future. Yet Facebook executives claim they will accept the board's decisions as binding, even when leadership disagrees.
Facebook will not, however, uphold or enforce any decisions or associated policies that violate state or federal laws. The company also intends to provide the resources board members need to make informed decisions about content, which adds to the credibility and objectivity of the rulings and the policies that emerge from them.