Every day, content moderators sift through reported content to determine whether it is inappropriate for viewing or violates policy guidelines. In the course of this work, nudity, sex, and violent acts come across their screens. Repeatedly viewing videos depicting abuse, self-harm, or other disturbing activity takes a toll on these moderators' mental health.
Types Of Mental Issues That Develop
In many of these cases, a form of PTSD develops after moderators review countless videos and images over days and weeks. Exposure to these visuals inflicts a level of trauma similar to that experienced by healthcare, counseling, and law enforcement professionals. This trauma erodes their ability to work, and support from a counseling professional becomes essential for their mental health.
Anxiety is another mental health issue that develops among content moderators. Watching others be attacked or even murdered can leave them paranoid about their own surroundings, forcing social anxieties on individuals who were once comfortable in their daily lives. On top of that, these moderators keep their jobs based on audits from their superiors; the pressure to avoid mistakes, layered over the content itself, keeps their adrenaline pumping through every shift.
As these issues wear an individual down, more serious conditions such as depression can take hold. These moderators begin withdrawing from friends and colleagues, keeping to themselves, and appearing miserable day after day.
Working Through These Mental Issues
If content moderators are not offered quality counseling through their employer, the chances of them seeking it out on their own are slim to none. In some cases, moderators do not trust the counseling team available or find them unhelpful. Either way, they seek out their own ways of coping with their emotions so that they can function through the day.
Whether it is drugs, alcohol, or an increase in sexual activity, these moderators are looking for ways to escape their reality and the content they are viewing. Unfortunately, these behaviors can cross over into the workplace, especially when the underlying mental health issues go untreated.
Rather than self-medicating, other content moderators recognize the effect the job is having on them and leave the industry to pursue a new career. The result is high turnover in content moderation, as employees decide the work they are being asked to do is not worth the stress and anxiety.
Guiding Your Content Moderators Through Their Job
Content moderation is meant to be an entry point into the industry, but its turnover rate is preventing quality employees from developing at the pace they should. Since the nature of the work has shifted toward hostility, violence, and inappropriate content at a massive scale, we have to start seeing the role for what it is now, not what it once was.
To protect the industry and those it employs, we must be empathetic to moderators' mental health and invest in resources that will guide them through their time in the role. The content they view daily is not what it was a decade ago, or even earlier. Take this month of May, Mental Health Awareness Month, to find ways to support your staff and invest in their mental health moving forward.