
“This is the most important election of our lifetime!”
That’s a refrain Americans have heard in every election over the past 200 years. Why has that phrase survived through so many election cycles? Because it's effective at riling people up.
In today’s highly polarized and extremely online world, politics can quickly become toxic and lead to widespread harassment or even real-world violence. As a result, online platforms must be proactive (rather than reactive) in addressing the community challenges that arise during election season.
Politics are everywhere
No platform is immune from politicization. It doesn’t matter if you run a social media platform, a gaming site, or a dating app – to some degree, politics will spill over into your community.
It is critical to know how politics will affect your platform.
Politicization brings inherent risks to online communities. Researchers at Facebook found that polarizing political content receives more clicks and engagement than non-political posts. However, the researchers warned that this engagement comes with a serious downside:
“Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a 2018 presentation. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”
– Excerpt from the Wall Street Journal on May 26, 2020
As an algorithmically driven platform, Facebook was unwittingly fueling toxicity through its own infrastructure:
The 2016 presentation states that “64% of all extremist group joins are due to our recommendation tools” and that most of the activity came from the platform’s “Groups You Should Join” and “Discover” algorithms: “Our recommendation systems grow the problem.”
– Excerpt from the Wall Street Journal on May 26, 2020
By allowing extreme political content to proliferate, Facebook learned the hard way that not all engagement is beneficial. More than 25% of U.S. Facebook users have deleted the app from their phones since 2016, including 44% of users between the ages of 18 and 29. As we’ve previously mentioned, users will abandon their accounts if a platform gets too toxic – which doesn’t bode well for key metrics like retention and revenue.
As Election Day draws near, traffic to online communities spikes – and so do the risks associated with a more volatile political climate. Online platforms should not assume that more traffic automatically translates into healthier engagement.
When does political content become toxic?
Blanket bans on political speech are not the answer to combating toxicity stemming from politicized conversations. Aside from a few hyperfocused niche communities, banning discussion of broad topics is not conducive to a thriving online platform.
However, Trust & Safety teams must be attuned to the heightened risk of toxicity that comes with increased politicization during election years. Some major forms of toxicity include:
- Brigading/Targeted Harassment: When political debate goes beyond disagreement, it often devolves into personal attacks. This harassment is frequently coordinated, whether by organized groups or by individuals with large followings who demonize an ideological foe. In an overheated political climate, these online attacks can escalate into offline threats of violence.
- Disinformation: There’s plenty of truth-bending in politics, but disinformation is the coordinated dissemination of falsehoods designed to provoke a visceral response. The endgame for disinfo operations is usually to gain clout or to grift gullible supporters. However, widespread disinformation has also led to real-world violence.
- Threats to Safety/Calls for Violence: When political opponents are portrayed as an existential threat, people are more likely to respond with dangerous behavior. This can be exploited by bad actors through doxxing or thinly veiled calls for violence.
Trust & Safety during election season is not about policing unpopular political views. Rather, it’s about minimizing the heightened risk of real-world harm that comes from an inflamed political climate.
Online platforms must have a plan to swiftly recognize and remove attempts to exploit political division for illegal or destructive ends. As numerous examples over the past several years have shown, online polarization can have devastating offline consequences.
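To make “swiftly recognize” a little more concrete, here is a minimal, illustrative sketch of one common heuristic for surfacing possible brigading: flagging a target who suddenly receives mentions or replies from an unusually large number of distinct accounts in a short window. Every name, threshold, and data shape below is a hypothetical assumption for illustration – not a description of any particular platform’s detection system.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical thresholds – illustrative only, not tuned values.
WINDOW = timedelta(minutes=30)
DISTINCT_SENDER_THRESHOLD = 50


def detect_possible_brigading(events):
    """events: iterable of (timestamp: datetime, sender_id: str, target_id: str).

    Returns the set of target_ids who received mentions from at least
    DISTINCT_SENDER_THRESHOLD distinct senders within any WINDOW-long span –
    a rough signal of coordinated pile-ons worth routing to human review.
    """
    events = sorted(events, key=lambda e: e[0])
    recent = defaultdict(list)  # target_id -> list of (timestamp, sender_id)
    flagged = set()

    for ts, sender, target in events:
        window = recent[target]
        window.append((ts, sender))
        # Drop mentions that have aged out of the sliding window.
        while window and ts - window[0][0] > WINDOW:
            window.pop(0)
        distinct_senders = {s for _, s in window}
        if len(distinct_senders) >= DISTINCT_SENDER_THRESHOLD:
            flagged.add(target)
    return flagged
```

In practice, a real pipeline would combine several signals (account age, shared or templated content, report volume) and send hits to human reviewers rather than acting on a single heuristic automatically.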
To learn more, check out our whitepaper: Managing Hate Speech and Extremism on Your Platform